When I was young, I had a lot of fun playing a game called Neuromancer, which takes place in the world created by William Gibson in his novel of the same name. The game was very good at conveying (I now know) the feeling a hacker gets when learning about and making his way through a system for the first time. The Internet was young at the time (1989), but the game had it all: email, newsgroups, servers, hacking, and artificial intelligence. (I am still waiting for that last one to appear in real life.) I was already interested in programming at that time, but I think the game pushed me somewhat toward computer security.
In the game, your success revolved around having the right tools at the right time. It did
not allow you to create your own tools, so the action was mostly in persuading shady
individuals to give, trade, or sell tools. In real life, these tools would be known under
the name exploits
. (It was acceptable to use them in the game because
the player was fighting the evil AI.) Now, many years later, it is funny to realize that
real life is much more interesting and creative than any game will ever be. Still, the
security business feels much the same as in that game I played ages ago. For both, it is
important to do the following:
Start with a solid understanding of the technology
Have and use the correct tools
Write your own tools
This appendix contains a list of tools you may find useful to perform the activities mentioned throughout the book. While some of these are not essential (meaning there are lower-level tools that would get the work done), they are great time-savers.
The best way to learn about web application security is to practice development and assessment. This may prove difficult as not everyone has a web application full of vulnerabilities lying around. (Assessing someone else’s application without her consent is unacceptable.) The answer is to use a controlled environment in which programming mistakes have been planted on purpose.
Two such environments are available:
WebMaven (http://www.mavensecurity.com/webmaven/)
WebGoat (http://www.owasp.org/software/webgoat.html)
WebMaven is a simple interactive learning environment for web application security. It was originally developed by David Rhoades from Maven Security and subsequently released as open source. Written in Perl, the application is easy to install on Unix and Windows computers.
WebMaven simulates an online banking system (“Buggy Bank”), which offers customers the ability to log in, log out, view account status, and transfer funds. As you can imagine, the application contains many (ten, according to the user manual) intentional errors. Your task is to find them. If you get stuck, you can find the list of vulnerabilities at the end of the user manual. Looking at the vulnerability list defeats the purpose of the learning environment, so I strongly encourage you to try it on your own for as long as you can. You can see the welcome page of the Buggy Bank in Figure A-1.
WebGoat (Figure A-2) is a Java-based web security environment for learning. The installation script is supposed to install Tomcat if it is not already installed, but as of this writing, it doesn’t work. (It attempts to download an older version of Tomcat that is not available for download any more.) You should install Tomcat manually first.
Unlike WebMaven, WebGoat does not attempt to emulate a real web site. Instead, it offers 12 lessons in web security:
HTTP Basics
Encoding Basics
Fail Open Authentication
HTML Clues
Parameter Injection
Unchecked Email
SQL Injection
Thread Safety
Weak Authentication Cookie
Database XSS
Hidden Field Tampering
Weak Access Control
Each lesson consists of a lesson plan, several hints, the application source code, and practical work with the ability to look into the data exchanged between the client and the server.
Working with WebGoat is great fun, and I recommend it even if you have web security experience. After you complete the lessons, you can take up the challenge, which is a simulated real-life problem where you can test your skills.
On Unix systems, most information-gathering tools are available straight from the command line. It is the same on Windows, provided Cygwin (http://www.cygwin.com) is installed.
If all you have is a browser, TechnicalInfo contains a set of links (http://www.technicalinfo.net/tools/) to various information-gathering tools hosted elsewhere. Using them can be cumbersome and slow, but they get the job done.
Netcraft (http://www.netcraft.co.uk) is famous for its “What is that site running?” service, which identifies web servers using the Server header. (This is not completely reliable since some sites hide or change this information, but many sites do not.) Netcraft is interesting not because it tells you which web server is running at the site, but because it keeps historical information around. In some cases, this information can reveal the real identity of the web server.
This is exactly what happened with the web server hosting my web site www.modsecurity.org. I changed the web server signature some time ago, but the old signature still shows in the Netcraft results.
Figure A-3 reveals another problem with changing server signatures. It lists my server as running Linux and Internet Information Server simultaneously, which is implausible. In this case, I am using the signature “Microsoft-IIS/5.0” as a bit of fun. If I were to use it seriously, I would need to pay more attention to what signature I was choosing.
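You can check what a site advertises in its Server header yourself; curl (covered later in this appendix) makes it a one-line job. The hostname below is only a placeholder:

```shell
# Send a HEAD request (-I) and print the response headers,
# including the Server signature; www.example.com is an example host
curl -I http://www.example.com/
```

Keep in mind that, as noted above, the signature you see is whatever the administrator chose to advertise, not necessarily the truth.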
Sam Spade (http://www.samspade.org/ssw/), a freeware network query tool from Steve Atkins, will probably provide you with all the network tools you need if your desktop is running Windows. Sam Spade includes all the passive tools you would expect, plus some advanced features on top of those:
Simple multiaddress port scanning.
Web site crawling, including the ability to apply a regular expression against the content of every page crawled.
Simple web site browsing. It does not do HTML rendering, but it does display headers.
Sam Spade’s biggest asset comes from integration. It parses query results and understands what the bits of information mean, allowing further actions to be performed quickly via a right-click context menu. Figure A-4 shows output from a whois query. Some queries are semi-automated; Sam Spade will automatically perform further queries as you would typically want them done anyway. To save time, queries are performed in parallel where possible.
Automatic activity logging is a big plus. Each query has its own window, but with a single click, you can choose whether to log its output.
The Sam Spade web site contains a large library of document links (http://www.samspade.org/d/). It can help you form a deeper understanding of the network and the way network query tools work.
SiteDigger (http://www.foundstone.com/resources/proddesc/sitedigger.htm and shown in Figure A-5) is a free tool from Foundstone (http://www.foundstone.com) that uses the Google API to automate search engine information gathering. (Refer to Chapter 11 for a discussion on the subject of using search engines for reconnaissance.) In its first release, it performs a set of searches using a predefined set of signatures (stored as XML, so you can create your own signatures if you want) and exports the results as an HTML page.
SSLDigger is another free utility from Foundstone (http://www.foundstone.com/resources/proddesc/ssldigger.htm). It performs automatic analysis of SSL-enabled web servers, testing them for a number of ciphers. Properly configured servers should not support weak ciphers. Figure A-6 shows results from an analysis of the Amazon web site. Amazon only got a B grade because it supports many weaker (40-bit) ciphers. In its case, the B grade is the best it can achieve, since it has to support the weaker ciphers for compatibility with older clients (Amazon does not want to turn customers away).
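You can perform the same kind of check by hand with the openssl command-line tool, forcing the handshake to offer a single cipher; if the handshake completes, the server supports that cipher. The hostname and cipher name below are only examples:

```shell
# Attempt an SSL handshake offering only one (weak, export-grade) cipher;
# a completed handshake means the server accepts it (example host and cipher)
openssl s_client -connect www.example.com:443 -cipher EXP-RC4-MD5 < /dev/null
```

Repeating this for each cipher of interest is tedious, which is exactly the gap SSLDigger fills.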
Httprint (http://net-square.com/httprint/) is a web server fingerprinting tool (not free for commercial use). Unlike other tools, it does not use the forgeable Server header. Instead, it relies on web server characteristics (subtle differences in the implementation of the HTTP protocol) to match the server being analyzed to the servers stored in its database. It calculates the likelihood of the target server being one of the servers it has seen previously, and gives the best match as the end result. When running Httprint against my own web server, I was impressed that it matched not only the brand but the minor release version, too. For the theory behind web server fingerprinting, see:
“An Introduction to HTTP fingerprinting” by Saumil Shah (http://net-square.com/httprint/httprint_paper.html)
In Figure A-7, you can see how I used Httprint to discover the real identity of the server running www.modsecurity.org. (I already knew this, of course, but it proves that Httprint works well.) As you can see, under “Banner Reported,” it shows what the Server header claims (in this case, the fake identity I gave it: Microsoft IIS), while “Banner Deduced” correctly specifies Apache/1.3.27, with an 84.34% confidence rating.
You will need a range of network-level tools for your day-to-day activities. These command-line tools are designed to monitor and analyze traffic or allow you to create new traffic (e.g., HTTP requests).
Using a simple Telnet client will work well for most manually executed HTTP requests, but it pays off to learn the syntax of Netcat. Netcat is a TCP and UDP client and server combined in a single binary, designed to be scriptable and used from a command line.
Netcat is available in two versions:
@stake Netcat (the original; http://www.securityfocus.com/tools/137)
GNU Netcat (http://netcat.sourceforge.net/)
To use it as a port scanner, invoke it with the -z switch (to initiate a scan) and -v to tell it to report its findings:
$ nc -v -z www.modsecurity.org 1-1023
Warning: inverse host lookup failed for 217.160.182.153:
Host name lookup failure
www.modsecurity.org [217.160.182.153] 995 (pop3s) open
www.modsecurity.org [217.160.182.153] 993 (imaps) open
www.modsecurity.org [217.160.182.153] 443 (https) open
www.modsecurity.org [217.160.182.153] 143 (imap) open
www.modsecurity.org [217.160.182.153] 110 (pop3) open
www.modsecurity.org [217.160.182.153] 80 (http) open
www.modsecurity.org [217.160.182.153] 53 (domain) open
www.modsecurity.org [217.160.182.153] 25 (smtp) open
www.modsecurity.org [217.160.182.153] 23 (telnet) open
www.modsecurity.org [217.160.182.153] 22 (ssh) open
www.modsecurity.org [217.160.182.153] 21 (ftp) open
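Netcat also replaces telnet for issuing manual HTTP requests; you can pipe a prepared request straight into it. The hostname below is only an example:

```shell
# Build a minimal HTTP request with printf (HTTP requires CRLF line endings)
# and send it to port 80 with Netcat; www.example.com is a placeholder host
printf 'HEAD / HTTP/1.0\r\nHost: www.example.com\r\n\r\n' | nc www.example.com 80
```

Because the request comes from a pipe rather than a keyboard, this form is easy to embed in scripts.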
To create a TCP server on port 8080 (as specified by the -p switch), use the -l switch:
$ nc -l -p 8080
To create a TCP proxy, forwarding requests from port 8080 to port 80, type the following. (We need the additional pipe to take care of the flow of data back from the web server.)
$ mknod ncpipe p
$ nc -l -p 8080 < ncpipe | nc localhost 80 > ncpipe
Stunnel (http://www.stunnel.org) is a universal SSL driver. It can wrap any TCP connection into an SSL channel. This is handy when you want to use your existing, non-SSL tools to connect to an SSL-enabled server. If you are using Stunnel Versions 3.x and older, all parameters can be specified on the command line. Here is an example:
$ stunnel -c -d 8080 -r www.amazon.com:443
By default, Stunnel stays permanently active in the background. This command line tells Stunnel to go into client mode (-c), listen locally on port 8080 (-d), and connect to the remote server www.amazon.com on port 443 (-r). You can now use any plaintext tool to connect to the SSL server through Stunnel running on port 8080. I will use telnet and perform a HEAD request to ensure it works:
$ telnet localhost 8080
Trying 127.0.0.1...
Connected to debian.
Escape character is '^]'.
HEAD / HTTP/1.0

HTTP/1.1 302 Found
Date: Mon, 08 Nov 2004 11:45:15 GMT
Server: Stronghold/2.4.2 Apache/1.3.6 C2NetEU/2412 (Unix) amarewrite/0.1 mod_fastcgi/2.2.12
Location: http://www.amazon.com/
Connection: close
Content-Type: text/html; charset=iso-8859-1

Connection closed by foreign host.
Stunnel Versions 4.x and above require all configuration options to be put in a configuration file. The configuration file equivalent to the pre-4.x syntax is:
# run as a client
client = yes
# begin new service definition
[https_client]
# accept plaintext connections on 8080
accept = 8080
# connect to a remote SSL-enabled server
connect = www.apachesecurity.net:443
Assuming you have put the configuration into a file called
stunnel.conf
, run Stunnel with:
$ stunnel stunnel.conf
Curl (http://curl.haxx.se) is a command-line tool that works with the HTTP and HTTPS protocols on a higher level. (It understands many other protocols, but they are not very interesting for what we are doing here.) You will want to use Curl for anything other than the most trivial HTTP requests. Things such as POST and PUT requests or file uploads are much simpler with Curl.
For example, uploading a file archive.tar.gz (assuming the file upload field is named filename) to a script upload.php is as simple as:
$ curl -F filename=@archive.tar.gz http://www.example.com/upload.php
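In the same spirit, here is what a simple POST request and a request with a custom header look like; the URL, field names, and header value below are made up for illustration:

```shell
# Submit form fields in the body of a POST request (-d);
# the URL and parameters are placeholders
curl -d 'username=admin&password=test' http://www.example.com/login.php

# Send an arbitrary request header (-H), here a spoofed User-Agent
curl -H 'User-Agent: Mozilla/4.0' http://www.example.com/index.php
```

These two switches alone cover a surprising share of day-to-day assessment work.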
The following is a brief but informative tutorial on HTTP scripting with Curl:

“The Art Of Scripting HTTP Requests Using Curl” by Daniel Stenberg (http://curl.haxx.se/docs/httpscripting.html)
When HTTP traffic flows over an unprotected channel, network-level traffic monitoring can be used for various purposes. Some of the possible uses are:
Monitoring who accesses what and when
Stealing authentication credentials
Stealing session tokens
It does not matter whether the network is switched or not: if data is traveling unprotected, it can be sniffed. Here are the most popular network-monitoring tools:
Tcpdump (http://www.tcpdump.org)
Ethereal (http://www.ethereal.com)
Ettercap (http://ettercap.sourceforge.net)
Dsniff (http://monkey.org/~dugsong/dsniff/)
Ngrep (http://ngrep.sourceforge.net)
The combination of Tcpdump plus Ethereal has worked well for me in the past, and I propose you try them first.
There are a few commercial Windows-based network-monitoring tools (designed to work with HTTP) available. They are inexpensive, so you may want to give them a try.
HTTP Sniffer (http://www.effetech.com/sniffer/)
HTTPLook (http://www.httpsniffer.com)
SSLDump (http://www.rtfm.com/ssldump/) is an SSL network protocol analyzer. It can be used where most other network sniffing tools cannot: to look into the SSL traffic:
# ssldump port 443
I did say look, but the previous command will only be able to examine the structure of the SSL traffic, not display the application data. (That would defeat the point of SSL.) However, ssldump can display the application data, too, but only if it is provided with the server’s private key:
# ssldump -d -k key.pem host www.apachesecurity.net port 443
Similar to how network security scanners operate, web security scanners try to analyze publicly available web resources and draw conclusions from the responses.
Web security scanners have a more difficult job to do. Traditional network security revolves around publicly known vulnerabilities in well-known applications providing services (it is rare to have custom applications on the TCP level). Though there are many off-the-shelf web applications in use, most web applications (or at least the interesting ones) are written for specific purposes, typically by in-house teams.
Nikto (http://www.cirt.net/code/nikto.shtml) is a free web security scanner. It is an open source tool, available under the GPL license. There is no support for GUI operation, but the command-line options work on Unix and Windows systems. Nikto focuses on four web-related issues:
Web server misconfiguration
Default files and scripts (which are sometimes insecure)
Outdated software
Known vulnerabilities
Nikto cannot be aware of vulnerabilities in custom applications, so you will have to look for them yourself. Looking at how it is built and what features it supports, Nikto is very interesting:
Written in Perl, uses libwhisker
Supports HTTP and HTTPS
Comes with a built-in signature database, showing patterns that suggest attacks; this database can be automatically updated
Allows the use of a custom signature database
Supports Perl-based plug-ins
Supports TXT, HTML, or CSV output
If Perl is your cup of tea, you will find Nikto very useful. With some knowledge of libwhisker and the internal workings of Nikto, you should be able to automate the boring parts of web security assessment by writing custom plug-ins.
Nikto’s greatest weakness is that it relies on the pre-built signature database to be effective. As is often the case with open source projects, this database does not seem to be frequently updated.
Nessus (http://www.nessus.org) is a well-known open source (GPL) security scanner. Scanning web servers is only one part of what it does, but it does it well. It consists of two parts: the server part performs the testing, and the client part is responsible for talking to the user. You can use the existing client applications, or you can automate scanning through the direct use of the communication protocol (documented in several documents available from the web site).
Nessus relies heavily on its plug-in architecture. Plug-ins can be written in C or in its custom NASL (short for Nessus Attack Scripting Language). A GUI-based client, NessusWX (http://nessuswx.nessus.org), is available for Nessus and makes it a bit easier to use. This client is shown in Figure A-8.
The problem with Nessus (from our web security point of view) is that it is designed as a generic security scanner; its test categorization does not allow us to turn off the tests that are not web-related.
Web security tools provide four types of functionality, and there is a growing trend to integrate all the types into a single package. The four different types are:

Scanners
Execute a predetermined set of requests, analyzing responses to detect configuration errors and known vulnerabilities. They can discover vulnerabilities in custom applications by mutating request parameters.

Crawlers
Map the web site and analyze the source code of every response to discover “invisible” information: links, email addresses, comments, hidden form fields, etc.

Assessment proxies
Standing in the middle, between a browser and the target, assessment proxies record the information that passes by and allow requests to be modified on the fly.

Utilities
Used for brute-force password attacks, DoS attacks, and encoding and decoding of data.
Many free (and some open source) web security tools are available:
Paros (http://www.parosproxy.org)
Burp proxy (http://www.portswigger.net/proxy/)
Brutus (password cracker; http://www.hoobie.net/brutus/)
Burp spider (http://portswigger.net/spider/)
Sock (http://portswigger.net/sock/)
WebScarab (http://www.owasp.org/software/webscarab.html)
These tools are rich in functionality but lacking in documentation and quality control. Some functions in their user interfaces can be less than obvious (this is not to say commercial tools are always user friendly), so expect to spend some time figuring out how they work. The trend is to use Java on the client side, making the tools work on most desktop platforms.
Paros and WebScarab compete for the title of the most useful and complete free tool. The Burp tools show potential, but lack integration and polish.
Paros (see Figure A-9) will probably fill most of your web security assessment needs. It can be used to do the following:
Work as a proxy with support for HTTP and HTTPS
Crawl the site to discover links
Visualize the application
Intercept (and optionally modify) requests and responses
Run filters on requests and responses
Examine recorded traffic
Perform automated tests on dynamic pages
If you are more interested in commercial tools than in open source ones, many are available. Categorizing them is sometimes difficult because they often include all features of interest to web security professionals in one single package. Most tools are a combination of scanner and proxy, with a bunch of utilities thrown in. So, unlike the open source tools where you have to use many applications from different authors, with a commercial tool you are likely to find all you need in one place. Commercial web security tools offer many benefits:
You get all the tools you need in a single, consistent, often easy-to-use package.
Base signatures cover common configuration problems and web security vulnerabilities. These signatures can be very important if you are just starting to do web security and you do not know where to look.
Having an up-to-date database of signatures, which covers web server vulnerabilities and vulnerabilities in dozens of publicly available software packages, is a big plus if you need to perform black-box assessment quickly.
With a good commercial tool, it is easy to create a comprehensive and good-looking report. If your time is limited and you need to please the customer (or the boss), a commercial tool is practically the only way to go.
One significant disadvantage is the cost. The area of web application security is still very young, so it is natural that tools are expensive. From looking at the benefits above, employees of larger companies and web security consultants are the most likely to buy commercial tools. Members of these groups are faced with the unknown, have limited time available, and must present themselves well. An expensive commercial tool often increases a consultant’s credibility in the eyes of a client.
Here are some of the well-known commercial tools:
When all else fails, you may have to resort to programming to perform a request or a series of requests that would be impossible otherwise. If you are familiar with shell scripting, then the combination of expect (a tool that can control interactive programs programmatically), netcat, curl, and stunnel may work well for you. (If you do not already have expect installed, download it from http://expect.nist.gov.)
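As a small sketch of what such shell scripting looks like, the loop below fetches a series of pages by varying a single request parameter and saves each response for later inspection; the URL and parameter name are hypothetical:

```shell
# Iterate over a query-string parameter, saving each response to its own
# file for offline analysis; www.example.com and "id" are placeholders
for id in 1 2 3 4 5; do
    curl -s "http://www.example.com/view.php?id=${id}" -o "response-${id}.html"
done
```

Comparing the saved responses (for example, with diff) is often enough to spot parameters that change application behavior.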
For those of you who are more programming-oriented, turning to one of the available HTTP programming libraries will allow you to do what you need fast:

LWP (http://lwp.linpro.no/lwp/)
A collection of Perl modules that provide the functionality needed to programmatically generate HTTP traffic.

libcurl (http://curl.haxx.se/libcurl/)
The core library used to implement curl. Bindings for 23 languages are available.

libwhisker (http://www.wiretrip.net/rfp/lw.asp)
A Perl library that automates many HTTP-related tasks. It even supports some IDS evasion techniques transparently. A SecurityFocus article on libwhisker, “Using Libwhisker” by Neil Desai (http://www.securityfocus.com/infocus/1798), provides useful information on the subject.

HttpClient (http://jakarta.apache.org/commons/httpclient/)
If you are a Java fan, you will want to go pure Java, and you can with HttpClient. Feature-wise, the library is very complete. Unfortunately, every release comes with an incompatible programming interface.