Appendix A. Tools

When I was young, I had a lot of fun playing a game called Neuromancer, which takes place in the world William Gibson created in his novel of the same name. The game was very good at conveying (I now know) the feeling a hacker gets when learning about and making his way through a system for the first time. The Internet was young at the time (1989), but the game had it all: email, newsgroups, servers, hacking, and artificial intelligence. (I am still waiting for that last one to appear in real life.) I was already interested in programming at the time, but I think the game pushed me somewhat toward computer security.

In the game, your success revolved around having the right tools at the right time. It did not allow you to create your own tools, so the action mostly consisted of persuading shady individuals to give, trade, or sell tools. In real life, these tools would be known as exploits. (It was acceptable to use them in the game because the player was fighting the evil AI.) Now, many years later, it is funny to realize that real life is much more interesting and creative than any game will ever be. Still, the security business feels much the same as the game I played ages ago. For both, it is important to do the following:

This appendix contains a list of tools you may find useful to perform the activities mentioned throughout the book. While some of these are not essential (meaning there are lower-level tools that would get the work done), they are great time-savers.

The best way to learn about web application security is to practice development and assessment. This may prove difficult as not everyone has a web application full of vulnerabilities lying around. (Assessing someone else’s application without her consent is unacceptable.) The answer is to use a controlled environment in which programming mistakes have been planted on purpose.

Two such environments are available:

On Unix systems, most information gathering tools are available straight from the command line. It is the same on Windows, provided Cygwin (http://www.cygwin.com) is installed.

SiteDigger (http://www.foundstone.com/resources/proddesc/sitedigger.htm and shown in Figure A-5) is a free tool from Foundstone (http://www.foundstone.com) that uses the Google API to automate search engine information gathering. (Refer to Chapter 11 for a discussion on the subject of using search engines for reconnaissance.) In its first release, it performs a set of searches using a predefined set of signatures (stored as XML, so you can create your own signatures if you want) and exports results as an HTML page.

SSLDigger is another free utility from Foundstone (http://www.foundstone.com/resources/proddesc/ssldigger.htm). It performs automatic analysis of SSL-enabled web servers, testing them for a number of ciphers. Properly configured servers should not support weak ciphers. Figure A-6 shows the results of analyzing the Amazon web site. Amazon received only a B grade because it supports many weak (40-bit) ciphers. In its case, a B is the best grade achievable, since it must keep supporting the weaker ciphers for compatibility with older clients (Amazon does not want to turn customers away).

Httprint (http://net-square.com/httprint/) is a web server fingerprinting tool (not free for commercial use). Unlike other tools, it does not use the forgeable Server header. Instead, it relies on web server characteristics (subtle differences in the implementation of the HTTP protocol) to match the server being analyzed to the servers stored in its database. It calculates the likelihood of the target server being one of the servers it has seen previously. The end result given is the one with the best match. When running Httprint against my own web server, I was impressed that it not only matched the brand, but the minor release version, too. For the theory behind web server fingerprinting, see:

“An Introduction to HTTP fingerprinting” by Saumil Shah (http://net-square.com/httprint/httprint_paper.html)

In Figure A-7, you can see how I used Httprint to discover the real identity of the server running www.modsecurity.org. (I already knew this, of course, but it proves Httprint works well.) As you can see, under “Banner Reported,” it tells what the Server header reports (in this case, the fake identity I gave it: Microsoft IIS) while the “Banner Deduced” correctly specifies Apache/1.3.27, with an 84.34% confidence rating.
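Httprint's signature database and scoring algorithm are its own, but the matching idea can be illustrated with a toy sketch. Everything below is invented for the example (the trait names, the signature values, and the confidence formula); a real fingerprinter derives its traits from carefully chosen protocol probes.

```python
# Toy illustration of fingerprint matching, not Httprint's actual
# algorithm. Each known server is described by a set of protocol
# traits; the candidate sharing the largest fraction of traits with
# the observed behavior wins, and that fraction is the confidence.
SIGNATURES = {
    "Apache/1.3.27":     {"header_order": "date-first",
                          "lf_only_ok": True,
                          "options_allowed": True},
    "Microsoft-IIS/5.0": {"header_order": "server-first",
                          "lf_only_ok": False,
                          "options_allowed": False},
}

def best_match(observed):
    """Return (server_name, confidence) for the closest signature."""
    best, confidence = None, -1.0
    for server, traits in SIGNATURES.items():
        score = sum(observed.get(k) == v for k, v in traits.items()) / len(traits)
        if score > confidence:
            best, confidence = server, score
    return best, confidence
```

The crucial point, which the sketch preserves, is that the verdict rests on behavior rather than on the freely forgeable Server header.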

You will need a range of network-level tools for your day-to-day activities. These command-line tools are designed to monitor and analyze traffic or allow you to create new traffic (e.g., HTTP requests).

Using a simple Telnet client will work well for most manually executed HTTP requests, but it pays to learn the syntax of Netcat. Netcat is a TCP and UDP client and server combined in a single binary, designed to be scriptable and usable from the command line.
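When a manual request needs to be repeated or scripted, the same idea fits in a few lines of code. The sketch below uses Python sockets (my choice for illustration, not a requirement): it sends a raw request string and returns whatever the server sends back, exactly as a telnet session would show it.

```python
import socket

def raw_request(host, port, request):
    """Send a raw HTTP request string (with CRLF line endings) and
    return everything the server sends back until it closes the
    connection, which an HTTP/1.0 server does after responding."""
    chunks = []
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(request.encode())
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode(errors="replace")

# Example (host name is illustrative):
# print(raw_request("www.example.com", 80,
#                   "HEAD / HTTP/1.0\r\nHost: www.example.com\r\n\r\n"))
```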

Netcat is available in two versions:

To use it as a port scanner, invoke it with the -z switch (to initiate a scan) and -v to tell it to report its findings:

$ nc -v -z www.modsecurity.org 1-1023
Warning: inverse host lookup failed for 
         Host name lookup failure
www.modsecurity.org [] 995 (pop3s) open
www.modsecurity.org [] 993 (imaps) open
www.modsecurity.org [] 443 (https) open
www.modsecurity.org [] 143 (imap) open
www.modsecurity.org [] 110 (pop3) open
www.modsecurity.org [] 80 (http) open
www.modsecurity.org [] 53 (domain) open
www.modsecurity.org [] 25 (smtp) open
www.modsecurity.org [] 23 (telnet) open
www.modsecurity.org [] 22 (ssh) open
www.modsecurity.org [] 21 (ftp) open
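A scan like the one above is nothing more than a series of TCP connection attempts, so it is easy to reproduce when netcat is not available. Here is a minimal Python sketch of the same connect-scan technique (the timeout value is an arbitrary choice):

```python
import socket

def scan(host, ports, timeout=1.0):
    """Try a full TCP connection to each port, as nc -z does, and
    return the ports that accepted the connection."""
    open_ports = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered, or timed out
    return open_ports

# Example:
# print(scan("www.modsecurity.org", range(1, 1024)))
```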

To create a TCP server on port 8080 (as specified by the -p switch), use the -l switch:

$ nc -l -p 8080

To create a TCP proxy, forwarding requests from port 8080 to port 80, type the following. (We need the additional pipe to take care of the flow of data back from the web server.)

$ mknod ncpipe p
$ nc -l -p 8080 < ncpipe | nc localhost 80 > ncpipe
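The named-pipe trick works, but it is instructive to see the data flow spelled out. The following Python sketch forwards a single connection in both directions, one thread per direction, standing in for the two netcat processes joined by the pipe (the function name is mine, and real proxies would loop over accept and handle errors):

```python
import socket
import threading

def pump(src, dst):
    """Copy bytes from src to dst until src signals end-of-stream,
    then pass the half-close on to dst."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)
    except OSError:
        pass

def proxy_once(listen_port, target_host, target_port):
    """Accept a single connection on listen_port and forward it,
    both ways, to the target (like nc -l -p 8080 piped into nc)."""
    server = socket.socket()
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", listen_port))
    server.listen(1)
    client, _ = server.accept()
    upstream = socket.create_connection((target_host, target_port))
    back = threading.Thread(target=pump, args=(upstream, client))
    back.start()                # target -> client in the other thread
    pump(client, upstream)      # client -> target in this thread
    back.join()
    for sock in (client, upstream, server):
        sock.close()
```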

Stunnel (http://www.stunnel.org) is a universal SSL driver. It can wrap any TCP connection in an SSL channel, which is handy when you want to use your existing non-SSL tools to connect to an SSL-enabled server. If you are using Stunnel versions 3.x and older, all parameters can be specified on the command line. Here is an example:

$ stunnel -c -d 8080 -r www.amazon.com:443

By default, Stunnel stays permanently active in the background. This command line tells Stunnel to go into client mode (-c), listen locally on port 8080 (-d) and connect to the remote server www.amazon.com on port 443 (-r). You can now use any plaintext tool to connect to the SSL server through Stunnel running on port 8080. I will use telnet and perform a HEAD request to ensure it works:

$ telnet localhost 8080
Connected to debian.
Escape character is '^]'.
HEAD / HTTP/1.0

HTTP/1.1 302 Found
Date: Mon, 08 Nov 2004 11:45:15 GMT
Server: Stronghold/2.4.2 Apache/1.3.6 C2NetEU/2412 (Unix) amarewrite/0.1
Location: http://www.amazon.com/
Connection: close
Content-Type: text/html; charset=iso-8859-1

Connection closed by foreign host.

Stunnel Versions 4.x and above require all configuration options to be put in a configuration file. The configuration file equivalent to the pre-4.x syntax is:

# run as a client
client = yes

# begin new service definition (the name in brackets is arbitrary)
[https_client]

# accept plaintext connections on 8080
accept = 8080

# connect to a remote SSL-enabled server
connect = www.apachesecurity.net:443

Assuming you have put the configuration into a file called stunnel.conf, run Stunnel with:

$ stunnel stunnel.conf

Similar to how network security scanners operate, web security scanners try to analyze publicly available web resources and draw conclusions from the responses.

Web security scanners have a more difficult job to do. Traditional network security revolves around publicly known vulnerabilities in well-known applications providing services (it is rare to have custom applications on the TCP level). Though there are many off-the-shelf web applications in use, most web applications (or at least the interesting ones) are written for specific purposes, typically by in-house teams.

Web security tools provide four types of functionality, and there is a growing trend to integrate all the types into a single package. The four different types are:

Many free (and some open source) web security tools are available:

These tools are rich in functionality but lacking in documentation and quality control. Some functions in their user interfaces can be less than obvious (this is not to say commercial tools are always user friendly), so expect to spend some time figuring out how they work. The trend is to use Java on the client side, making the tools work on most desktop platforms.

Paros and WebScarab compete for the title of the most useful and complete free tool. The Burp tools show potential, but lack integration and polish.

Paros (see Figure A-9) will probably fill most of your web security assessment needs. It can be used to do the following:

  • Work as a proxy with support for HTTP and HTTPS

  • Crawl the site to discover links

  • Visualize the application

  • Intercept (and optionally modify) requests and responses

  • Run filters on requests and responses

  • Examine recorded traffic

  • Perform automated tests on dynamic pages

If you are more interested in commercial tools than in open source ones, many are available. Categorizing them is sometimes difficult because they often bundle every feature of interest to web security professionals in a single package. Most tools are a combination of scanner and proxy, with a number of utilities thrown in. So, unlike with the open source tools, where you have to combine applications from different authors, with a commercial tool you are likely to find everything you need in one place. Commercial web security tools offer many benefits:

One significant disadvantage is the cost. The area of web application security is still very young, so it is natural that tools are expensive. From looking at the benefits above, employees of larger companies and web security consultants are the most likely to buy commercial tools. Members of these groups are faced with the unknown, have limited time available, and must present themselves well. An expensive commercial tool often increases a consultant’s credibility in the eyes of a client.

Here are some of the well-known commercial tools:

When all else fails, you may have to resort to programming to perform a request or a series of requests that would be impossible otherwise. If you are familiar with shell scripting, then the combination of expect (a tool that can control interactive programs programmatically), netcat, curl, and stunnel may work well for you. (If you do not already have expect installed, download it from http://expect.nist.gov.)

For those of you who are more programming-oriented, turning to one of the available HTTP programming libraries will allow you to do what you need quickly:
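Whichever library you choose, the shape of the code is similar. As a small taste, here is a HEAD request using Python's standard http.client module (the example host name is illustrative):

```python
from http.client import HTTPConnection

def head(host, port=80, path="/"):
    """Perform a HEAD request and return the status code and a
    dictionary of response headers."""
    conn = HTTPConnection(host, port, timeout=10)
    try:
        conn.request("HEAD", path)
        response = conn.getresponse()
        return response.status, dict(response.getheaders())
    finally:
        conn.close()

# Example:
# status, headers = head("www.example.com")
# print(status, headers.get("Server"))
```

A few lines like these, wrapped in a loop, are often all it takes to automate a series of requests that would be tedious to perform by hand.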