Revision as of 14:07, 28 October 2011
- 1 Announcements & Shameless Plugs
- 2 Kick off with Johnny Long
- 3 Marcus Ranum Interview
- 4 The Stogie Geeks Podcast
- 5 Ron Gula Interview
- 6 Tech Segment: Concealing Storage in Windows Volume Shadow Copy Service
- 7 Robert Graham Interview
- 8 Tech Segment: Ruby honeyports and the new anti-kit for android
- 9 Tech Segment: Busting Directories: Dirbuster and Alternatives
- 10 Kevin Mitnick Interview
- 11 Tech Segment: TOP SECRET
- 12 HackNaked TV/ PaulDotCom Espanol /Dinner Break
- 13 Tech Segment: Google Hacking Diggity Project
- 14 Ancient alien beings, hypervisors and virtualization
- 15 Tech Segment: New ways to Persist with Metasploit
- 16 Pen Testing War Stories with Kevin Fiscus
- 17 Drunken Security News
Announcements & Shameless Plugs
PaulDotCom Security Weekly - Episode 265 Hackers for Charity Twelve hour podcast for Friday October 28th, 2011.
The HFC group:
- Feeds children through a "food for work" program.
- Builds computer labs to help students learn skills and land jobs that are key to disrupting poverty's vicious cycle.
- Provides technical assistance to charities that can't afford IT services.
- Provides job experience and references to the Ugandan volunteers.
You can donate or get more involved via the Hackers for Charity website.
- Check out Hack Naked TV
- Friday October 28th is our 12 hour podcast for Hackers for Charity - we have special interviews with Johnny Long, Kevin Mitnick, and other special guests in the works.
- Larry is teaching SEC580 Metasploit Kung Fu for Enterprise Pen Testing in San Antonio, TX December 4-5. Tell them (and us) that we sent you!
- Don't forget to read our blog, participate on our mailing list, visit PaulDotCom Insider, follow us on Twitter, join the IRC channel at irc.freenode.net #pauldotcom, watch our videos, and add us on Facebook where we can be "friends"
- BSides, BSides, BSides everywhere
Kick off with Johnny Long
10 AM EDT
Johnny calls in to update us on his Hackers for Charity project.
Marcus Ranum Interview
- Marcus, what's the latest on cyberwar? Has Stuxnet changed some of your views on cyberwar?
- Speaking of SCADA, what can we do to improve the security of SCADA systems? It seems every month there is a new "SCADA hack" and vendors and organizations that aren't paying attention to security.
- Penetration tests are successful, and one of the primary ways in which we are gaining access to systems is through "client-side attacks". Essentially, we are tricking the user into running code, no vulnerabilities or "exploits" required. What can organizations do to protect against this threat?
- Speaking of vulnerabilities, there still seems to be a mindset in the community of "become vulnerable, exploit, apply patch, rinse, repeat". What can we do to shift people away from the "patch mindset" to defensive measures that actually work?
- Speaking of defensive measures, what are your thoughts on "smart firewalls"? Are they still the stop-gap measure that is masking the real problems?
The Stogie Geeks Podcast
Hosts: Paul Asadoorian & Tim "Bugbear" Mugherini
Noon - 1PM
Paul & Tim will smoke some cigars and tell you all about them, talk about what they've been smoking, and feature a "Stogie How-To" segment titled "The Top Ten Things You Should Not Do With Your Humidor".
Ron Gula Interview
10 Things You Shouldn’t Do For Cyber-Security Awareness Month - According to the Department of Homeland Security, October 2011 is national CyberSecurity Awareness month. With the best intentions, I’ve laid out ten items that you shouldn’t do in an attempt to raise awareness. I’ve seen these items backfire, cause disruption and raise awareness of how security can make our life less convenient and questionably more secure.
Ron's Top Ten List:
10 – Perform a Client Side Penetration Test
9 – Switch to IPv6
8 – Learn Government Compliance Standards
7 – Read Computer Security Related Books of Fiction and Fact
6 – Engage in a religious debate about the most secure OS, phone or Web Browser
5 – Run a Honeypot
4 – Blame any attacks or virus outbreaks on China
3 - Publish lists of People’s Cracked Passwords
2 – Patch all of those systems that haven’t been patched in a long while
1 - Turn off your Anti-Virus Product
Tech Segment: Concealing Storage in Windows Volume Shadow Copy Service
Authors: Mark Baggett and Tim "LaNMaSteR53" Tomes
Robert Graham Interview
- How did you get your start in information security?
- Tell us about what you learned about Occupy Wall Street
- What is your policy on disclosure and what are the merits to Digital Bond's policy?
- What can Apple do to improve the security of its products?
Tech Segment: Ruby honeyports and the new anti-kit for android
Author: John Strand
Tech Segment: Busting Directories: Dirbuster and Alternatives
Author: Larry Pesce
Ok, we've covered DirBuster before. One advantage of DirBuster is a GUI (yeah, Girls Use It). Another is the ability to brute force directories and files without a wordlist… and, depending on options, have it take forever.
So what happens if you want to do it from the command line?
Well, we can still use DirBuster from the command line, but it is not documented terribly well. Let's get some better documentation:
java -jar DirBuster-0.12.jar -h

Usage: java -jar DirBuster-1.0-RC1 -u <URL http://example.com/> [Options]

Options:
-h : Display this help message
-H : Start DirBuster in headless mode (no gui), report will be auto saved on exit
-l <Word list to use> : The Word list to use for the list based brute force. Default: /Users/larry/Desktop/DirBuster-1.0-RC1/directory-list-2.3-small.txt
-g : Only use GET requests. Default Not Set
-e <File Extention list> : File Extention list eg asp,aspx. Default: php
-t <Number of Threads> : Number of connection threads to use. Default: 10
-s <Start point> : Start point of the scan. Default: /
-v : Verbose output, Default: Not set
-P : Don't Parse html, Default: Not Set
-R : Don't be recursive, Default: Not Set
-r <location> : File to save report to. Default: /Users/larry/Desktop/DirBuster-1.0-RC1/DirBuster-Report-[hostname]-[port].txt
Ok, so now we can begin to put together some command line options:
java -jar DirBuster-1.0-RC1.jar -u http://www.somesite.com -H -r output.txt
In this case we have started it with -H for headless operation (don't start the GUI). In order to save some typing, we have also omitted the -l switch (to use the default wordlist). What if we want to brute force filenames as well?
java -jar DirBuster-1.0-RC1.jar -u http://www.somesite.com -H -r output.txt -e asp,aspx,html,htm
Now there are a few interesting caveats with this. First off, DirBuster is Java, which can be a little heavy, and I haven't been able to make it work well/successfully. Sure, it will run anywhere, but it is Java. There's also no real-time command-line feedback, and I'm seeing something about having to parse XML to do reporting. Yuck. How about a different command line option?
Enter DIRB. It is command line only and retains most of the functionality of DirBuster, without the overhead of Java or a GUI. The only thing I cannot find is the ability to do brute forcing without a wordlist.
It should compile on just about any POSIX system that has access to libcurl. It installed without issue on my OS X and Ubuntu systems; I believe libcurl was already installed in both cases for other projects.
So, let's see how it works:
$ ./dirb http://www.somesite.com ./wordlists/big.txt,./wordlists/vulns/sharepoint.txt,./wordlists/vulns/iis.txt

-----------------
DIRB v2.03
By The Dark Raver
-----------------

START_TIME: Wed Oct 26 10:18:42 2011
URL_BASE: http://www.somesite.com/
WORDLIST_FILES: ./wordlists/big.txt,./wordlists/vulns/sharepoint.txt,./wordlists/vulns/iis.txt

-----------------

GENERATED WORDS: 4712

---- Scanning URL: http://www.somesite.com/ ----
+ http://www.somesite.com// (FOUND: 200 [Ok] - Size: 29435)
+ http://www.somesite.com/Admin/ ==> DIRECTORY
+ http://www.somesite.com/aspnet_client (FOUND: 403 [Forbidden] - Size: 218)
+ http://www.somesite.com/components/ ==> DIRECTORY
+ http://www.somesite.com/config/ ==> DIRECTORY
+ http://www.somesite.com/controls/
…
Pretty simple, eh?
We can also use it to brute force filenames, and we give the extensions we want to test with the -X switch, based on the words in the wordlists specified:
$ ./dirb http://www.somesite.com ./wordlists/big.txt,./wordlists/vulns/sharepoint.txt,./wordlists/vulns/iis.txt -X .asp,.aspx,.html,.htm

-----------------
DIRB v2.03
By The Dark Raver
-----------------

START_TIME: Wed Oct 26 10:25:52 2011
URL_BASE: http://www.somesite.com/
WORDLIST_FILES: ./wordlists/big.txt,./wordlists/vulns/sharepoint.txt,./wordlists/vulns/iis.txt
EXTENSIONS_LIST: (.asp,.aspx,.html,.htm) | (.asp)(.aspx)(.html)(.htm) [NUM = 4]

-----------------

GENERATED WORDS: 4712

---- Scanning URL: http://www.somesite.com/ ----
--> Testing: http://www.somesite.com/2002.htm
…
$ ./dirb http://www.somesite.com ./wordlists/big.txt,./wordlists/vulns/sharepoint.txt,./wordlists/vulns/iis.txt -o outfile.txt -S
We need the -S for silent to make the report readable… and of course this can be combined with extension busting as well:
$ ./dirb http://www.somesite.com ./wordlists/big.txt,./wordlists/vulns/sharepoint.txt,./wordlists/vulns/iis.txt -X .asp,.aspx,.html,.htm -o outfile.txt -S
The -S is really needed. Why? If it is not included, the log includes EVERY test that was made against the host, successful or not. With the -S silent switch added, only the successful finds are included (both at the terminal and in the log). Oh, and the report is in plain text, great for additional reporting and/or post-processing with unix text tools.
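Since the report is plain text, the standard unix tools carve it up nicely. A minimal sketch: the sample report below mimics the DIRB output format shown above, and the file names (outfile.txt, dirs.txt) are made up for illustration.

```shell
# Create a sample report in the same shape as the DIRB -S output above
# (the contents and file names are assumptions for illustration).
cat > outfile.txt <<'EOF'
+ http://www.somesite.com/aspnet_client (FOUND: 403 [Forbidden] - Size: 218)
+ http://www.somesite.com/Admin/ ==> DIRECTORY
+ http://www.somesite.com/index.html (FOUND: 200 [Ok] - Size: 29435)
+ http://www.somesite.com/config/ ==> DIRECTORY
EOF

# Pull out just the directories DIRB found:
grep '==> DIRECTORY' outfile.txt | awk '{print $2}' > dirs.txt

# Pull out just the hits that returned 200:
grep 'FOUND: 200' outfile.txt | awk '{print $2}'
```

From there dirs.txt can feed a follow-up scan, a screenshot tool, or whatever else the engagement calls for.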
As far as real-time feedback, that happens too, including status codes and the size of the pages returned. That's helpful for knocking out unusual pages from standard responses such as "Directory Listing not allowed" or 30x redirects.
One other helpful switch is -i. This launches a case-insensitive search and can cut down on the amount of requests, especially when headed to an IIS system or Apache on Windows (yeah, Windows is case insensitive, unlike POSIX OSes). One way I like to determine webserver type is the Firefox plugin "Header Spy", which places the server header on the bottom bar of the browser. Of course, this does not accurately identify Apache on Windows all of the time, nor is it completely accurate in general.
So, let's find another way around that using command line tools.
Yay nmap, and thanks to Ron Bowes for the http-headers NSE script. Let's fire this off like so:
Hiroshige:~ lpesce$ nmap -sV --script=http-headers -p 80 www.healthcomp.com

Starting Nmap 5.51 ( http://nmap.org ) at 2011-10-27 16:44 EDT
Note: Host seems down. If it is really up, but blocking our ping probes, try -Pn
Nmap done: 1 IP address (0 hosts up) scanned in 3.06 seconds

Hiroshige:~ lpesce$ nmap -sV --script=http-headers -p 80 www.somesite.com

Starting Nmap 5.51 ( http://nmap.org ) at 2011-10-27 16:46 EDT
Nmap scan report for www.somesite.com (188.8.131.52)
Host is up (0.067s latency).
rDNS record for 184.108.40.206: 208-87-35-101.securehost.com
PORT   STATE SERVICE VERSION
80/tcp open  http    Apache httpd 2.2.17 ((Ubuntu))
| http-headers:
|   Date: Thu, 27 Oct 2011 20:46:42 GMT
|   Server: Apache/2.2.17 (Ubuntu)
|   X-Powered-By: PHP/5.3.5-1ubuntu7.2
|   Set-Cookie: uid=www4ea9c332892637.06471309; expires=Sat, 26-Nov-2011 20:46:42 GMT
|   Vary: Accept-Encoding
|   Connection: close
|   Content-Type: text/html
|   Set-Cookie: WEB=W3; path=/
|
|_  (Request type: HEAD)
Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 6.73 seconds
In this case we've got Apache on Ubuntu. No need for the DIRB -i switch here.
$ nmap -sV --script=http-headers -p 80 www.someothersite.org

Starting Nmap 5.51 ( http://nmap.org ) at 2011-10-27 16:49 EDT
Nmap scan report for www.carene.org (220.127.116.11)
Host is up (0.019s latency).
rDNS record for 18.104.22.168: www.someothersite.org
PORT   STATE SERVICE VERSION
80/tcp open  http    Microsoft IIS httpd 7.0
| http-headers:
|   Content-Type: text/html; charset=UTF-8
|   Server: Microsoft-IIS/7.0
|   Set-Cookie: CFID=54754419;expires=Sat, 19-Oct-2041 20:49:14 GMT;path=/
|   Set-Cookie: CFTOKEN=42783161;expires=Sat, 19-Oct-2041 20:49:14 GMT;path=/
|   X-Powered-By: ASP.NET
|   Date: Thu, 27 Oct 2011 20:49:14 GMT
|   Connection: close
|
|_  (Request type: HEAD)
Service Info: OS: Windows
Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 7.12 seconds
This one would be a good candidate for the -i switch.
$ nmap -sV --script=http-headers -p 80 192.168.10.19

Starting Nmap 5.51 ( http://nmap.org ) at 2011-10-27 17:56 EDT
Nmap scan report for 192.168.10.19
Host is up (0.0042s latency).
PORT   STATE SERVICE VERSION
80/tcp open  http    Apache httpd 2.2.21 ((Win32))
| http-headers:
|   Date: Thu, 27 Oct 2011 21:56:31 GMT
|   Server: Apache/2.2.21 (Win32)
|   Last-Modified: Sat, 20 Nov 2004 18:16:24 GMT
|   ETag: "200000001bcee-2c-3e9549efc6e00"
|   Accept-Ranges: bytes
|   Content-Length: 44
|   Connection: close
|   Content-Type: text/html
|   X-Pad: avoid browser bug
|
|_  (Request type: HEAD)
Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
Nmap done: 1 IP address (1 host up) scanned in 7.60 seconds
As would this one.
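The nmap runs above really boil down to one question: is the target on Windows? Here's a tiny sketch of that decision as a shell helper. The function name and the heuristic (IIS or Win32/Win64 in the Server header means a case-insensitive filesystem) are my own assumptions for illustration, not part of DIRB.

```shell
# Sketch: suggest whether to add DIRB's -i switch based on a Server
# header string. Heuristic and names are assumptions for illustration.
suggest_dirb_flags() {
    header="$1"
    case "$header" in
        *IIS*|*Win32*|*Win64*) echo "-i" ;;  # Windows: case-insensitive filenames
        *)                     echo ""   ;;  # POSIX: stay case-sensitive
    esac
}

suggest_dirb_flags "Microsoft-IIS/7.0"       # -> -i
suggest_dirb_flags "Apache/2.2.21 (Win32)"   # -> -i
suggest_dirb_flags "Apache/2.2.17 (Ubuntu)"  # -> (nothing)
```

Feed it whatever the http-headers NSE script (or Header Spy) reports, and tack the result onto your DIRB command line.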
Ok, so the only issue I'm finding with DIRB is that it lacks the ability to do directory discovery and then filename-with-extension discovery in the newly found directories. It appears to be one or the other.
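One way around the one-or-the-other limitation is a quick two-pass hack: run DIRB once for directories, then generate one file-busting command per directory it found. This sketch only builds the second-pass command lines from a saved directory list; the dirs.txt contents and file names are assumptions for illustration, and nothing here touches a live host.

```shell
# Sketch: fake two-pass dirbusting. Pass one's directories (dirs.txt,
# contents assumed here) become targets for pass-two file busting.
cat > dirs.txt <<'EOF'
http://www.somesite.com/Admin/
http://www.somesite.com/config/
EOF

WORDLIST=./wordlists/big.txt
EXTS=.asp,.aspx,.html,.htm

# Emit one DIRB invocation per discovered directory, each with its
# own report file, into a script you can review and then run.
n=0
while read -r dir; do
    n=$((n+1))
    echo "./dirb $dir $WORDLIST -X $EXTS -o pass2_$n.txt -S"
done < dirs.txt > pass2_commands.sh

cat pass2_commands.sh
```

Reviewing pass2_commands.sh before running it also gives you a chance to prune directories you don't care about.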
Yes, you can do directory bruteforcing with nikto as well. Did you know that? Neither did I. It does require perl and a list of directories to be scanned; in this case I've used one from DIRB, as nikto does not come with one standard.
While it may not be as configurable, it does perform a bunch of additional tests that can give a bunch of other interesting information. Let's get it rockin':
First, command line options:
perl ./nikto.pl

- Nikto v2.1.4
---------------------------------------------------------------------------
+ ERROR: No host specified

-config+           Use this config file
-Cgidirs+          scan these CGI dirs: 'none', 'all', or values like "/cgi/ /cgi-a/"
-dbcheck           check database and other key files for syntax errors
-Display+          Turn on/off display outputs
-evasion+          ids evasion technique
-Format+           save file (-o) format
-host+             target host
-Help              Extended help information
-id+               Host authentication to use, format is id:pass or id:pass:realm
-list-plugins      List all available plugins
-mutate+           Guess additional file names
-mutate-options+   Provide extra information for mutations
-output+           Write output to this file
-nocache           Disables the URI cache
-nossl             Disables using SSL
-no404             Disables 404 checks
-port+             Port to use (default 80)
-Plugins+          List of plugins to run (default: ALL)
-root+             Prepend root value to all requests, format is /directory
-ssl               Force ssl mode on port
-Single            Single request mode
-timeout+          Timeout (default 2 seconds)
-Tuning+           Scan tuning
-update            Update databases and plugins from CIRT.net
-vhost+            Virtual host (for Host header)
-Version           Print plugin and database versions
+ requires a value

Note: This is the short help output. Use -H for full help.
Now some usage:
$ perl nikto.pl -mutate 6 -mutate-options small.txt -output outpput.txt -host www.somesite.com

- Mutate is deprecated, use -Plugins instead
- Nikto v2.1.4
---------------------------------------------------------------------------
+ Target IP:       10.14.171.123
+ Target Hostname: www.somesite.com
+ Target Port:     80
+ Using Mutation:  Attempt to guess directory names from the supplied dictionary file
+ Start Time:      2011-10-28 21:27:58
---------------------------------------------------------------------------
+ Server: Microsoft-IIS/6.0
+ Retrieved x-aspnet-version header: 2.0.50727
+ No CGI Directories found (use '-C all' to force check all possible dirs)
+ ERROR: Unable to open dictionary file : No such file or directory.
+ robots.txt contains 1 entry which should be manually viewed.
+ OSVDB-630: IIS may reveal its internal or real IP in the Location header via a request to the /images directory. The value is "http://192.168.254.10/images/".
+ ETag header found on server, fields: 0xa05cd8197221cc1:2902
+ Microsoft-IIS/6.0 appears to be outdated (4.0 for NT 4, 5.0 for Win2k, current is at least 7.5)
+ Allowed HTTP Methods: OPTIONS, TRACE, GET, HEAD, POST
+ Public HTTP Methods: OPTIONS, TRACE, GET, HEAD, POST
…
Pretty cool eh?
I'd never heard of http-dir-enum before, and at an initial stab it seems to be the least robust of the options, but it is an option nonetheless. This one requires perl as well.
First a little cmd line love:
perl ./http-dir-enum.pl

http-dir-enum v0.4.2 ( http://portcullis-security.com/16.php )
Copyright (C) 2006 Mark Lowe ( firstname.lastname@example.org )

Given a URL and a wordlist, http-dir-enum will attempt to determine names of directories that exist on a website.

Usage: http-dir-enum.pl [options] -f dir-file url

options are:
-m n      Maximum number of worker processes (default: 8)
-f file   File of potential directory names
-k file   File of known directory names
-c 0|1    Close connection between each attempt (default: 0)
-r 0|1    Recursively enumerate sub directories (default: 1)
-t n      Wait a maximum of n seconds for reply (default: 20)
-u user   Username to use for basic authentication
-p pass   Password to use for basic authentication
-H g|h    HTTP method g=GET, h=HEAD (default: head)
-i code   Ignore HTTP response code (e.g. 404 or '404|200')
-U str    Set User-Agent header to str (default based on Firefox 22.214.171.124/Linux)
-s 0|1    Add a trailing slash to the URL (default: 1)
-S 0|1    Case sensitive directory names (default: 1)
-a 0|1    Automatically determine HTTP response code to ignore (default: 1)
-l n      Limit scan to n attempts per second (default: unlimited)
-R 0|1    Follow redirects (default: 0)
-q        Quiet. Don't print out info ("[I]") messages
-n n      Only read first n lines of dirs file (default: unlimited)
-o file   Save XML report of dirs found to file (default: don't save a report)
-x regx   Return only results that match this regular expression
-X regx   Ignore results that match this regular expression
-P url    Proxy URL
-C str    Use cookie
-v        Verbose
-d        Debugging output
-D code   Print out whole response if it has HTTP code "code" (e.g. 500)
-h        This help message

The default options should be suitable most of the time, so the typical usage would be:

http-dir-enum.pl -f dirs.txt http://host
Well, 2006 eh? I suppose not much has really changed, except for maybe needing an updated directory file, which it includes. For me, the bad part? XML output. Well, let's give it a go:
$ perl ./http-dir-enum.pl -o output.xml -f directory-names.txt http://www.healthcomp.com

Starting http-dir-enum v0.4.2 ( http://portcullis-security.com/16.php )
Copyright (C) 2006 Mark Lowe ( email@example.com )

----------------------------------------------------------
|                   Scan Information                     |
----------------------------------------------------------

URL .................... http://www.somesite.com
Processes .............. 8
Directory name file .... directory-names.txt
Query timeout .......... 20 secs
HTTP Method ............ HEAD
Max Queries / sec ...... unlimited
Trailing slash ......... On
Recursive dir search ... On
Close connections ...... Off
Follow redirects ....... Off
Case sensistive dirs ... On
Auto-ignore ............ On
Output file ............ output.xml

######## Scan started on Thu Oct 27 21:49:23 2011 #########
[I] Processing directory: / (0 dirs remaining)
[I] Auto-ignoring HTTP code 404 for http://www.somesite.com
admin 403
documentation 403
images 403
config 403
aspnet_client 403
…
Well, it may not be terribly robust, but at least it is in perl, has been modified by many, and is yet another option.
That is all I've got, have fun busting directories.
Kevin Mitnick Interview
Tech Segment: TOP SECRET
HackNaked TV/ PaulDotCom Espanol /Dinner Break
Tech Segment: Google Hacking Diggity Project
Author: Jack "Tenacious" Daniel
Ancient alien beings, hypervisors and virtualization
Your host for this journey will be none other than Eric Fiterman!
"Researchers in Central America recently discovered an ancient underground lake containing many new discoveries about the Mayan civilization. Among these discoveries were many mysterious glyphs depicting what appear to be other-worldly beings handing compact disks to Mayan priests. Excavation unearthed some ancient, but usable, compact disks, containing what appeared to be bootable Linux environments designed to reset passwords in VMware's ESX hypervisor. Eric Fiterman, of Rogue Networks, has extensively studied the artifacts from the project, and has learned that ancient alien beings may have given humans knowledge of hypervisors and virtualization long ago.
Eric believes that virtualization and computing were ancient technologies used by the Mayans, and that among this lost knowledge were the secrets of how to recover VMware hypervisor systems without a password. Eric will be discussing this recently discovered artifact, and is releasing a bootable ISO that allows users to restore ESX systems without re-installing the hypervisor."
Tech Segment: New ways to Persist with Metasploit
Author: Carlos "DarkOperator" Perez
Pen Testing War Stories with Kevin Fiscus
Drunken Security News
(This segment can go anywhere during the 12 hours; it should last an hour)
AWS almost pwnage - [Larry] - apparently due to some faulty crypto, attackers could have executed administrative tasks on any infrastructure. The basis for the attack was discovered in 2005, before AWS was even a glimmer.
Squid proxy vulns - [Larry] - I keep waiting for things like this, as I know of a bunch of things that use squid proxy that would be loads of fun to pwn. I won't mention it, as it is an ongoing project for me, nor do I know if the information I have is still current. This may be one that reverts to the P in APT.
THC pwns SSL - [Larry] - Want to DoS an SSL-enabled website? This tool will let you do that, using the built-in SSL re-negotiation mechanism. This feature was intended to keep things more secure, but can be used to do bad things as well.
10PM show over.