Performing Analysis on a Budget (Windows Forensic Analysis) Part 3

Packet Capture and Analysis

Another incident response activity that you may encounter is capturing and analyzing network traffic. Whether you perform the capture yourself, work with on-site IT staff to ensure that traffic is captured, or receive capture files as data from someone else, you should be prepared to analyze that data.

Two popular network packet capture and analysis tools for Windows are Wireshark (www.wireshark.org) and NetworkMiner (http://sourceforge.net/projects/networkminer/). Both tools are freely available and are extremely valuable additions to the responder’s toolkit.

At the time of this writing, Wireshark Version 1.0.3 was available for the Windows platform. Wireshark not only allows you to capture network traffic (based on the network interface you choose) but also allows you to analyze existing network traffic captures. Figure 9.5 illustrates an excerpt of the GUI for Wireshark.

Figure 9.5 Excerpt of Wireshark v1.0.3 GUI



One of the capabilities of Wireshark that I have found to be extremely useful is its ability to completely reassemble TCP streams. To do this, with a network capture loaded into Wireshark, click Analyze in the menu bar and choose Follow TCP Stream from the drop-down menu. Wireshark will follow the stream and reassemble the contents of the TCP conversation. This can be very useful in isolating a single communication, as well as in reconstructing a conversation; for example, you can reconstruct Web pages seen by a user, e-mails, botnet command-and-control communications, or unencrypted instant messaging exchanges. Wireshark provides the capability to perform similar analysis with UDP and SSL traffic.
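If you need to perform the same sort of reassembly outside the GUI (in a script, for instance), the general idea is straightforward: collect the payloads for one direction of a conversation and order them by sequence number. The following is a minimal sketch of that approach using the freely available dpkt Python module; the capture file name and endpoints are hypothetical, and the sketch ignores retransmissions, overlapping segments, and sequence-number wraparound.

    import socket
    import dpkt

    SRC, SPORT = '192.168.1.10', 1036      # hypothetical client IP and port
    DST, DPORT = '10.0.0.5', 80            # hypothetical server IP and port

    segments = {}                          # sequence number -> payload bytes

    with open('capture.pcap', 'rb') as f:  # hypothetical capture file
        for ts, buf in dpkt.pcap.Reader(f):
            eth = dpkt.ethernet.Ethernet(buf)
            if not isinstance(eth.data, dpkt.ip.IP):
                continue
            ip = eth.data
            if not isinstance(ip.data, dpkt.tcp.TCP):
                continue
            tcp = ip.data
            if (socket.inet_ntoa(ip.src), tcp.sport) != (SRC, SPORT):
                continue
            if (socket.inet_ntoa(ip.dst), tcp.dport) != (DST, DPORT):
                continue
            if tcp.data:
                segments.setdefault(tcp.seq, tcp.data)   # keep first copy of each segment

    # Concatenate payloads in sequence-number order; this naive approach does not
    # handle overlapping segments or sequence-number wraparound.
    stream = b''.join(segments[seq] for seq in sorted(segments))
    print(stream.decode('ascii', errors='replace'))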

Tools & Traps …

Network Traffic Captures

While we’re discussing network traffic captures, I’ll mention here where these captures fit into the overall incident response picture. Many incidents involve a network component of some kind: a system gets infected by something downloaded from the Internet and the infection then spreads to other systems on the network, an intruder gains access to a system and controls it remotely, or a bot gets onto a system and reaches out to a command-and-control server to await commands. That’s where network traffic captures can be extremely valuable sources of data. First, you will find information about the packets themselves, including source and destination IP addresses and ports. This information allows you to identify (1) the hosts involved (based on the IP addresses) and (2) which programs may be involved, if you are able to correlate the port information with volatile data (the output of tcpvcon.exe or netstat.exe, or data parsed from a memory dump) collected from at least one of the hosts involved. Second, the information within the packets themselves, often reassembled into TCP "conversations," can show a great deal about what data was exchanged. This is valuable information should the question of data exfiltration (i.e., what data was taken off of a system) arise.

Wireshark also includes a Statistics menu bar option, which provides tools to help you narrow your focus and filter through a vast amount of data to find that proverbial needle in a haystack. You can look at overall statistics for the packet capture, a detailed listing of network conversations within the capture, or simply a listing of endpoints. All of this can be very useful in helping you dig through kilobytes or even megabytes of data.
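The same sort of summary can be produced programmatically if you need to feed it into other analysis. The following is a rough approximation of Wireshark’s Conversations and Endpoints statistics, again using the dpkt Python module; the capture file name is hypothetical, and only IP endpoints and TCP conversations are counted.

    import socket
    from collections import Counter

    import dpkt

    conversations = Counter()   # (src, sport, dst, dport) -> packet count
    endpoints = Counter()       # IP address -> packet count

    with open('capture.pcap', 'rb') as f:   # hypothetical capture file
        for ts, buf in dpkt.pcap.Reader(f):
            eth = dpkt.ethernet.Ethernet(buf)
            if not isinstance(eth.data, dpkt.ip.IP):
                continue
            ip = eth.data
            src, dst = socket.inet_ntoa(ip.src), socket.inet_ntoa(ip.dst)
            endpoints[src] += 1
            endpoints[dst] += 1
            if isinstance(ip.data, dpkt.tcp.TCP):
                tcp = ip.data
                conversations[(src, tcp.sport, dst, tcp.dport)] += 1

    # Print the ten busiest TCP conversations, busiest first.
    for (src, sport, dst, dport), count in conversations.most_common(10):
        print(f'{src}:{sport} -> {dst}:{dport}  {count} packets')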

Sometimes a GUI may be a bit more than you’re interested in working with, and a CLI tool may be preferable. If that’s the case, Wireshark ships with several CLI tools, including tshark and dumpcap. According to the information available on the Wireshark Web site, like any tool, each of these has its strengths and weaknesses. The CLI tools are great for loading onto remote systems and launching to capture network traffic from an alternate location within the network. Another CLI tool is windump (www.winpcap.org/windump/), the Windows port of tcpdump, which not only captures network traffic in a manner similar to tshark and dumpcap but also can be used with the appropriate drivers to capture network traffic via wireless access points. Keep in mind that tcpdump and windump will, by default, capture only the first 68 bytes of each packet, truncating the information; use the -s switch to increase the snapshot length if you need the full packet contents.

NetworkMiner Version 0.85 (beta) is available for Windows. NetworkMiner is described on the Sourceforge project Web site as "a Network Forensic Analysis Tool (NFAT) for Windows that can detect the OS, hostname and open ports of network hosts through packet sniffing or by parsing a PCAP file." In addition, "NetworkMiner can also extract transmitted files from network traffic." These capabilities make NetworkMiner extremely valuable to an incident responder. As illustrated in Figure 9.6, NetworkMiner’s GUI has a number of tabs for displaying automatically parsed information from within network packet captures, including files, user credentials, images, etc.

Figure 9.6 Excerpt of NetworkMiner 0.85 (Beta) UI


There is a tool available for Linux systems called tcpxtract (http://tcpxtract.sourceforge.net/; by Nick Harbour) that is a file-carving utility intended for network traffic captures. Tcpxtract allows you to scan through a network traffic capture looking for files based on a library of file signatures. Although tcpxtract is not available for Windows systems, NetworkMiner provides similar functionality.
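If you want to experiment with this kind of signature-based carving yourself, the basic technique is to accumulate the TCP payloads for each conversation and then look for file header and footer signatures in the result. The following is a minimal, illustrative sketch using the dpkt Python module and JPEG signatures only; the file names are hypothetical, and the sketch makes no attempt to handle out-of-order or retransmitted segments the way a real carver would.

    import socket
    import dpkt

    JPEG_HEADER, JPEG_FOOTER = b'\xff\xd8\xff', b'\xff\xd9'

    streams = {}   # (src, sport, dst, dport) -> accumulated payload bytes

    with open('capture.pcap', 'rb') as f:   # hypothetical capture file
        for ts, buf in dpkt.pcap.Reader(f):
            eth = dpkt.ethernet.Ethernet(buf)
            if not isinstance(eth.data, dpkt.ip.IP) or not isinstance(eth.data.data, dpkt.tcp.TCP):
                continue
            ip, tcp = eth.data, eth.data.data
            key = (socket.inet_ntoa(ip.src), tcp.sport, socket.inet_ntoa(ip.dst), tcp.dport)
            streams[key] = streams.get(key, b'') + tcp.data

    # Scan each accumulated stream for JPEG header/footer pairs and write them out.
    count = 0
    for data in streams.values():
        start = data.find(JPEG_HEADER)
        while start != -1:
            end = data.find(JPEG_FOOTER, start)
            if end == -1:
                break
            with open(f'carved_{count}.jpg', 'wb') as out:
                out.write(data[start:end + 2])
            count += 1
            start = data.find(JPEG_HEADER, end)
    print(f'Carved {count} candidate JPEG files')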

The screenshot on the NetworkMiner project page on Sourceforge illustrates the tool’s capability to identify the host OS by analyzing the captured packets. NetworkMiner uses functionality derived from p0f (http://lcamtuf.coredump.cx/p0f.shtml) to determine the host OS passively from a network packet capture, rather than by actively scanning the system as Nmap does. Figure 9.7 illustrates NetworkMiner’s ability to identify the OS of a host from a packet capture provided as part of one of Lance Mueller’s forensic practical exercises.

Figure 9.7 Excerpt from NetworkMiner Showing Host OS Identification


Yet another graphical tool for capturing and analyzing network traffic is PacketMon (www.analogx.com/contents/download/network/pmon.htm). PacketMon did not appear to have been updated recently at the time of this writing, nor does it appear to be as feature rich as Wireshark or NetworkMiner, but it is a useful tool to get you started down the road to analyzing network traffic captures.

A command-line tool (for those who enjoy that sort of thing) that may be helpful with network traffic analysis is ngrep (http://ngrep.sourceforge.net/download.html), a version of the GNU grep utility applied to the network layer, allowing you to use extended regular expressions or hexadecimal expressions to search for patterns within a network traffic capture.
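Conceptually, this is nothing more than applying a regular expression to packet payloads rather than to files. As a short, purely illustrative sketch building on the stream-reassembly example shown earlier, assume the reassembled payload has been saved to a file (the file name and the clear-text FTP pattern below are hypothetical):

    import re

    # Payload reassembled by the earlier sketch, assumed here to have been saved to a
    # file; the pattern is purely illustrative (clear-text FTP USER/PASS commands).
    with open('reassembled_stream.bin', 'rb') as f:
        stream = f.read()

    pattern = re.compile(rb'(USER|PASS)\s+\S+')
    for match in pattern.finditer(stream):
        print(match.group().decode('ascii', errors='replace'))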

All four of these tools allow you to work with network traffic captures; the first three are GUIs that also allow you to capture network traffic, in addition to parsing and analyzing it. This means that if you’re responding to an incident and on-site staff or first responders have already captured network traffic, you can use these tools to analyze that data, provided that the capture is in an acceptable format. Most tools readily provide access to tcpdump-format captures, which, much like the dd format for image acquisition, can be considered a de facto standard for network traffic captures.

Tip:

In fall 2008, NetWitness released its Investigator product, available for free at http://download.netwitness.com/download.php?src=DIRECT. Investigator allows an analyst to quickly import a .pcap file containing a network traffic capture or simply to capture network traffic through the application itself. Investigator is described as an "interactive threat analysis application of the NetWitness NextGen product suite," allowing the analyst to perform "free-form contextual analysis of raw network data." This is an extremely powerful tool for the analyst, but be sure to read the end-user license agreement carefully before downloading and using the tool.

Tools & Traps …

Snort

One freeware tool that is often overlooked for use in forensic analysis of network traffic captures is Snort (www.snort.org). This application is widely known as a freely available intrusion detection system (IDS). When I first became familiar with Snort many years ago, it was strictly an IDS; in the ensuing years, a great deal of development effort has gone into the tool, and it is now also referred to as an intrusion prevention system. Snort can not only listen on a network interface (placed in promiscuous mode), capturing and filtering traffic from a live network, but also works equally well against network traffic capture files. By directing Snort to read the network traffic from a .pcap file and process it through its rule-sets, you achieve a capability similar to using ngrep with a prepackaged set of filters, some of which are much more complex than a regular expression. This capability can provide a great deal of data reduction. Consider an instance in which a network worm is proliferating within a network infrastructure. Network traffic captures can be used to determine which systems are communicating on the network, but in large networks there can be a great deal of "normal" network communication that can easily overwhelm the analyst. Using Snort (and assuming that there is a signature for the worm’s network communications), you can quickly separate the needle from the haystack, performing a great deal of data reduction with a level of granularity that exceeds that of other tools.
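To give a sense of what such a signature looks like, the following is a purely illustrative rule, not one taken from the official Snort rule-sets, that flags TCP traffic to port 445 whose payload contains the SMB protocol marker; the comments also show how a capture file might be read through the rules rather than sniffing live traffic (the file and directory names are hypothetical).

    # Illustrative rule only -- not taken from the official Snort rule-sets. It flags
    # TCP traffic to port 445 whose payload contains the SMB protocol marker.
    alert tcp any any -> any 445 (msg:"Possible worm propagation over SMB"; content:"|ff|SMB"; sid:1000001; rev:1;)

    # Reading a capture file through the rule-sets instead of sniffing live traffic
    # (file and directory names are hypothetical):
    #   snort -r capture.pcap -c snort.conf -l C:\snort_logs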

Search Utilities

Whenever I talk to fellow analysts about the core capabilities of forensic analysis and the applications that support such activity, one of the primary capabilities I hear cited as a requirement is the ability to run searches, including keyword searches and grep-style searches using regular expressions. Searches are generally used as a data reduction technique; using keywords or regular expressions, an analyst can comb through megabytes or even gigabytes of data, looking for items (files, log file entries, etc.) that are specific to his analysis goals.

Tools & Traps …

Searching the Registry

Searching most files on Windows systems for ASCII- or Unicode-formatted data is generally a straightforward process. However, Registry hive files can pose some interesting issues. For instance, as an analyst you need to pay very close attention to paths within the Registry. The path "SessionManager" is very different from "Session Manager". Similarly, if a key or value is in a path that includes "Windows NT", do not make the mistake of searching for "WindowsNT". As with most searches, spelling is very important. Aside from that, however, not all information is maintained in the Registry in a tidy ASCII or Unicode format. Some information is encoded as a DWORD (4-byte) value, and that value must be mapped to its meaning in order to be interpreted; in some cases a DWORD value of '0' means that a piece of functionality is enabled, whereas for other values a value of '1' means "enabled." In still other cases, certain functionality will be encoded within a binary value in some manner. So do not be surprised if a keyword or regular expression search does not provide indications of what you're looking for within the hive files. The key to searching the Registry is sometimes to know exactly what you're looking for and to focus the search through a manual process.
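As a small, purely illustrative example of why such values defeat text searches, a setting stored as a DWORD is simply four bytes in little-endian order, and its meaning has to be looked up rather than read; the value names and meanings below are hypothetical.

    import struct

    raw = b'\x01\x00\x00\x00'                  # four bytes as they might appear in a hive file
    value = struct.unpack('<I', raw)[0]        # little-endian unsigned DWORD -> 1

    # The mapping from number to meaning depends entirely on the specific value;
    # these names are hypothetical.
    meanings = {0: 'disabled', 1: 'enabled'}
    print(value, meanings.get(value, 'unknown'))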

The search utilities listed in this section are meant to be utilized against live files, meaning that they can be used in a live response situation, or after you’ve mounted an acquired image as a live file system. Most commercial forensic analysis applications provide a built-in search capability (several, such as FTK and X-Ways Forensics, also provide indexing capabilities), as well as some preconfigured regular expression search strings.

An excellent source of utilities for searching files and file systems is the GNU utilities for Win32 Web site (http://unxutils.sourceforge.net/). This site provides access to an archive of UNIX-style utilities that many UNIX administrators are already familiar with, albeit as native Windows ports. These tools can be easily added to batch files and scripts for use in performing searches and other data reduction activities.

In addition to these tools, there are several versions of the grep utility available for the Windows platform. Two such versions, both called "grep for Windows," are available from Sourceforge (http://gnuwin32.sourceforge.net/packages/grep.htm) and from InterLog (http://pages.interlog.com/~tcharron/grep.html); both provide similar functionality.

There are instances in which you may want to search for specific items or terms, such as Social Security numbers (SSNs) or credit card numbers (CCNs). These items fall within the definition of "sensitive data" under such things as California's state law SB-1386 and the Payment Card Industry (PCI) Data Security Standard (DSS), respectively. As such, there may be times when you will need to search for this kind of data as a specific function of your analysis. Fortunately, there are several tools available that can be used to meet these needs. One such tool is the Cornell Spider (www.cit.cornell.edu/security/tools/), which was designed to scan collections of files (files on a hard drive, Web sites, etc.) for sensitive data such as SSNs and CCNs. Running Spider produces a log file listing all files found to contain sensitive data.
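The underlying approach is simple enough that you can prototype it yourself when a specialized tool is not at hand: walk the files of interest and log anything matching a pattern. The following is a minimal sketch in Python; the mount point, log file name, and SSN pattern are illustrative only, and a real scan needs far more rigorous patterns and false-positive handling.

    import os
    import re

    ssn = re.compile(rb'\b\d{3}-\d{2}-\d{4}\b')    # naive SSN pattern, illustrative only

    with open('sensitive_data_hits.log', 'w') as log:          # hypothetical log file
        for root, dirs, files in os.walk(r'C:\mounted_image'): # hypothetical mount point
            for name in files:
                path = os.path.join(root, name)
                try:
                    with open(path, 'rb') as f:
                        data = f.read()
                except OSError:
                    continue
                if ssn.search(data):
                    log.write(path + '\n')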

Another useful tool for searching for CCNs is ccsrch (http://sourceforge.net/projects/ccsrch). Ccsrch is a Windows-based command-line utility that can search for contiguous, unencrypted CCNs, as well as track data. The formatting specifications for credit card track 1 and track 2 data include the CCN, or primary account number (PAN), as contiguous data; that is, a sequence of digits with no breaks, spaces, or dashes. Ccsrch results include the file name as well as the number found, sent to standard output (STDOUT), which allows the results to be easily redirected to a file.

The following are useful resources for assisting you in your searches:

■ Regular expression reference (www.regular-expressions.info/)

■ Credit card number formats (http://en.wikipedia.org/wiki/Credit_card_number)

■ Regular expressions for credit card numbers (www.regular-expressions.info/creditcard.html)

Tools & Traps …

Searching for Sensitive Data

When searching for sensitive data of any kind, you need to ensure that you fully understand the nature and format of that data, as well as what the results of your search really mean. Specifically, SSNs and CCNs pose some interesting challenges with regard to formatting. Most analysts recognize the formats of these numbers, but you need to ensure that a search for 16-digit CCNs (for example) checks the necessary criteria for a CCN (i.e., length, starting digits, and the Luhn formula/Modulus-10 check) and covers not only a straight sequence of digits with no breaks but also sequences of digits with spaces or dashes in the appropriate locations.
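As a minimal sketch of those checks, the following Python snippet normalizes a candidate string by stripping spaces and dashes, then applies a length test and the Modulus-10 (Luhn) check; the length range and test number are illustrative, and the starting-digit (issuer prefix) checks are left out for brevity.

    def luhn_valid(number: str) -> bool:
        digits = [int(d) for d in number]
        # Double every second digit from the right; subtract 9 from any result over 9.
        for i in range(len(digits) - 2, -1, -2):
            digits[i] *= 2
            if digits[i] > 9:
                digits[i] -= 9
        return sum(digits) % 10 == 0

    def looks_like_ccn(candidate: str) -> bool:
        # Strip the spaces or dashes that may appear in the formatted number.
        normalized = candidate.replace(' ', '').replace('-', '')
        # Common CCN lengths; issuer-prefix (starting digit) checks are omitted here.
        return normalized.isdigit() and 13 <= len(normalized) <= 16 and luhn_valid(normalized)

    print(looks_like_ccn('4111 1111 1111 1111'))   # a well-known test number; prints True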

Another aspect of searching for sensitive data is testing your tools and determining which formats of the data they search for. Some tools may search only for contiguous sequences of digits (as with CCNs or SSNs), whereas others also search for those numbers formatted with spaces or dashes.

Summary

The cornerstone of an examination is not the tool you're using; it's your methodology. A good methodology is independent of the tool used, whether it be a commercial forensic analysis suite, a freeware or open-source tool, or a custom-crafted Perl script. The keys are to know what questions you need to answer, where to go to get your data, and then how to correctly extract and interpret that data and present it in a report. Keeping that and your core principles in mind will lead you to reach for the right tool, whether it is to extract the data for analysis or to corroborate other findings.

Solutions Fast Track

Documenting Your Analysis

  • Documentation is an extremely important part of any examination. Documentation should be clear, concise, and thorough enough to allow for later replication and verification, particularly by another analyst.
  • Many IT folks and responders must take the data or "evidence" they acquire to court and must be concerned with the standards that need to be met in order to do so. The key differentiator between running around with a CD full of tools and taking your data to court is the documentation you maintain: Do you have a documented process, and did you document your actions to the point that they are understandable and repeatable?

Tools

  • A number of free or low-cost tools are available that can more than adequately replace or even extend the functionality inherent to many of the commercial application bundles. With some knowledge and forethought, you can augment, replace, or even surpass what’s available in those applications.
  • As with other aspects of incident response and computer forensics, the various available commercial applications have their strengths and weaknesses. There are times when it is important to use a commercial application for data analysis and presentation. However, there may also be times when a freely available tool provides a greater degree of depth or visibility into the data and provides answers much faster.

Frequently Asked Questions

Q: I need to perform analysis of the data I’ve collected; what tools do I use?

A: As with everything else, "it depends." Seriously. Before deciding which tool to use, you need to take a close look at and document the goals of your analysis because that’s where everything starts and ends for an examination. Are you looking for illicit images? Are you concerned about who (i.e., which user account) may have downloaded, accessed, or viewed those images? Do you have several megabytes or even gigabytes of IIS logs and you’re interested in determining whether or not a SQL injection attack occurred? There is no one tool that fits every situation, and in many cases, the tool used depends on the personal preference of the analyst; I’ve performed analysis on log files using Perl, whereas others have preferred to use Microsoft’s Log Parser.

Q: How is a network traffic capture useful to an examination?

A: A network traffic capture contains a good deal of useful information that you can tie to a system (via its IP address), as well as to a process running on that system, by correlating the port information in the packet headers with volatile data (network connections, processes, process-to-port mapping) collected from the system. The contents of the packets may tell you what data or information was transmitted to or from the system.

Q: During an examination, I have multiple sources of data that I need to correlate, and I need to maintain the association between them (i.e., network packet captures, network device logs, server logs, and system images). What’s the best way to do this?

A: At this point, documentation is the best approach. Aside from PyFlag, I am not aware of any complete analysis suite that allows you to pull in, analyze, and correlate multiple disparate sources of data.
