File Analysis (Windows Forensic Analysis) Part 4

Netsetup.log

The netsetup.log file is created during system setup; on Windows XP, you can find it in the Windows\Debug folder. The file records information about the system's workgroup and domain membership, maintaining time stamps on all the messages it records. The time stamps within the netsetup.log file fall within the same time frame as those within the setuplog.txt file. Additional entries are added to the file if the workgroup or domain of the system is changed. For example, I installed the Windows XP operating system on my personal laptop on August 7, 2006, as evidenced by the time stamps in the netsetup.log and setuplog.txt files. On November 19, 2006, I modified the system's workgroup membership (moving it from the workgroup WorkGroup to the workgroup Home) when I enabled file sharing. This change was recorded in the netsetup.log file, along with the appropriate time stamps. Log entries will also be added to the file if the system is added to or removed from a domain.

Task Scheduler Log

The Task Scheduler service on Windows systems can be accessed through at.exe or the Scheduled Tasks Wizard in the Control Panel. This service allows a user with Administrator privileges to schedule a task to be run at some point in the future or to be run repeatedly at specific times each day, week, or month. This is very beneficial for administering and managing a system or an entire network. The same feature is useful to intruders who want a piece of malware to run persistently on a compromised system; in fact, a number of examples of malware (e.g., Conficker/Downadup) use this very method as a means of remaining persistent on an infected system. Fortunately, this service keeps a log of the tasks that have been run in a file called schedlgu.txt. This filename is actually the default name associated with the LogFile value located in the following Registry key:


[Registry key containing the Task Scheduler LogFile value; image not reproduced in this excerpt]

On Windows XP, the schedlgu.txt log is located in the Windows directory by default (C:\Windows), whereas on Windows 2003 and Vista, the schedlgu.txt file is located (by default) in the Tasks directory (C:\Windows\Tasks).
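If you're examining a live system and want to confirm where the Task Scheduler log is actually being written, you can simply dump the values under the service's configuration key. The following is a minimal Python sketch; the key path SOFTWARE\Microsoft\SchedulingAgent is an assumption for illustration (the key itself is not reproduced in this excerpt), so verify it against the Registry key referenced above.

# A minimal sketch (Python 3, run on a live Windows system) that dumps the
# values under the Task Scheduler service's configuration key so the examiner
# can see where the schedlgu.txt log is actually written. The key path below
# is an assumption for illustration; confirm it against the key referenced
# in the text.
import winreg

KEY_PATH = r"SOFTWARE\Microsoft\SchedulingAgent"   # assumed key path

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    index = 0
    while True:
        try:
            name, data, _type = winreg.EnumValue(key, index)
        except OSError:
            break   # no more values under this key
        print(f"{name} = {data}")
        index += 1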

Tip:

Microsoft maintains a great deal of information in Knowledge Base articles. One article in particular relates to Scheduled Tasks: Knowledge Base article 169443 (http://support.microsoft.com/kb/169443) describes how to limit the size of the Scheduled Tasks log file.

Note that Administrator-level privileges are required to create a scheduled task; very often, malware infecting systems and using this (and other) persistence mechanisms is successful because ordinary users have Administrator-level privileges.

If the Task Scheduler isn’t used by the administrator, the investigator should expect to see entries stating that the Task Scheduler service started and exited on specific dates and times. Because the Task Scheduler service is usually set to start up along with the system, this information can give the investigator a view of when the system was started and shut down.

If a task was scheduled and executed, you will see entries in the schedlgu.txt file that look like the following (excerpted from a Windows XP schedlgu.txt file):

[Excerpt from a Windows XP schedlgu.txt file showing two executed .job entries; image not reproduced]

The first job was set up via at.exe, and the second job (pinball.job) was set up via the Scheduled Tasks Wizard. These .job files are kept in the Windows\Tasks directory.
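When reviewing a schedlgu.txt file exported from an image, it helps to pull out the service start/exit lines and the .job entries rather than scrolling the whole file. The sketch below is one way to do that in Python; the UTF-16 encoding check and the marker strings ("Started at", "Exited at", ".job") are assumptions about typical XP logs, so adjust them after looking at the file you actually have.

# A rough sketch for pulling service start/stop and job-execution lines out of
# a schedlgu.txt file copied from an image. The encoding check and the marker
# strings below are assumptions based on typical XP logs; adjust them to match
# the file you are examining.
from pathlib import Path

MARKERS = ("Started at", "Exited at", ".job")   # assumed substrings of interest

def scan_schedlgu(path):
    raw = Path(path).read_bytes()
    if raw.startswith(b"\xff\xfe"):       # UTF-16 LE byte-order mark
        text = raw.decode("utf-16")
    else:
        text = raw.decode("latin-1")      # fall back to a single-byte codepage
    for line in text.splitlines():
        if any(marker in line for marker in MARKERS):
            print(line.strip())

if __name__ == "__main__":
    scan_schedlgu(r"C:\Windows\schedlgu.txt")   # default XP location per the text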

Notes from the Underground…

Hiding Scheduled Tasks

There’s an effective method for hiding Scheduled Tasks. Create a Scheduled Task via either at.exe or the Scheduled Tasks Wizard. Go to the Control Panel, open the Scheduled Tasks applet, and see that the task you just created is listed. Now close the applet, open a command prompt, navigate to the Windows\Tasks directory, and use attrib.exe to set the hidden bit on the .job file. Once you’ve done this, go back to the Scheduled Tasks applet and you won’t see the task listed any longer. Of course, the usual caveats apply to the command prompt (you must use the /a switch with the dir command to list hidden files) and Windows Explorer (by default, it will not show files with the hidden attribute set). The task, however, will still run at its scheduled time.

I did some writing while on vacation and ran through the preceding procedure with the Solitaire card game. However, I never deleted the file, so when I got home I was working away one weekend and took a break. When I returned to my office, Solitaire was open on my desktop and at first I thought someone had been in my office! Then it struck me as to what happened, and I deleted the .job file.
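If you suspect this trick has been used, a quick scan of the Tasks directory for .job files carrying the hidden attribute will surface them. Here is a minimal Python sketch of that check, intended for a live Windows system; the Tasks path is the XP default mentioned earlier, so adjust it as needed.

# A minimal sketch that lists .job files in the Tasks directory and flags any
# with the hidden attribute set, which Windows Explorer and a plain "dir"
# will not show by default. Run on a live Windows system.
import stat
from pathlib import Path

TASKS_DIR = Path(r"C:\Windows\Tasks")   # default XP Tasks directory

for job in TASKS_DIR.glob("*.job"):
    attrs = job.stat().st_file_attributes            # Windows-only stat field
    hidden = bool(attrs & stat.FILE_ATTRIBUTE_HIDDEN)
    print(f"{job.name}{'  <HIDDEN>' if hidden else ''}")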

Unfortunately, the full path to the executable run by the task is not recorded in the log file, but an indication of when a program was run via the Task Scheduler service is provided.

XP Firewall Logs

Most of us are familiar with the firewall component shipped with Windows XP, perhaps from the news media coverage of the issues addressed in the release of Windows XP SP2. Most users never see or interact with the XP firewall, which is enabled by default as of SP2. The firewall can be disabled (some malware attempts to do this), and disabling it may be part of a corporate configuration scheme intended to ease management of those systems. The firewall can also be manually configured to allow specific applications to have network access.

The Windows XP firewall has a log file in which it records various activities that occur, but by default, no logging occurs. Figure 5.6 illustrates the default settings via the Log Settings dialog for the firewall.

Figure 5.6 Configure Windows XP Firewall Logging


As you can see, the logging options are pretty limited. Logging is not enabled by default, so you might not find the firewall log pfirewall.log on most systems. The lack of a log file does not mean the firewall was not enabled. However, should you find a copy of the log file on the system, the firewall log format is straightforward and easy to understand. An excerpt from an example firewall log appears as follows:

[Excerpt from a pfirewall.log file, including the header and several dropped-packet entries; images not reproduced]

The Fields tag in the firewall log header tells us what the various portions of the log entries refer to and how to interpret the information in the log file. We can see from the entries listed in the excerpt from pfirewall.log that several Internet Control Message Protocol (ICMP) packets (perhaps from the ping.exe application) were dropped, as were several attempts to connect to the computer on port 21, which is the default port for FTP servers.
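Because the log documents its own layout in the #Fields header line, a small script can turn each entry into a labeled record without hard-coding column positions. The following Python sketch reads the field names from the header and prints the dropped packets; the default log path shown is the one from the Log Settings dialog, so point it at wherever you exported the file.

# A small sketch that uses the #Fields header of a pfirewall.log file to turn
# each entry into a dictionary, then prints dropped packets. The field names
# (e.g., "action", "dst-port") are taken from the log's own header, so the
# script adapts to whatever fields the file declares.
def parse_pfirewall(path):
    fields = []
    with open(path, "r", errors="replace") as fh:
        for line in fh:
            line = line.strip()
            if line.startswith("#Fields:"):
                fields = line.split()[1:]      # field names follow the tag
                continue
            if not line or line.startswith("#"):
                continue                        # skip other header/comment lines
            yield dict(zip(fields, line.split()))

for entry in parse_pfirewall(r"C:\Windows\pfirewall.log"):
    if entry.get("action") == "DROP":
        print(entry.get("date"), entry.get("time"), entry.get("protocol"),
              entry.get("src-ip"), "->", entry.get("dst-ip"),
              "port", entry.get("dst-port"))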


Warning:

Often it might be difficult to interpret the activity in a pfirewall.log file without a more detailed understanding of the system and its environment. For example, when viewing other logs of network-based activity, such as corporate firewall or intrusion detection system (IDS) logs, I have been asked by the administrator what the activity represented. In the case of a single system, attempts to access well-known ports such as port 80 (Web server) or 21 (FTP server) do not necessarily indicate something running on that system, but rather that someone might have been trying to determine whether something was running on that port. This can indicate reconnaissance activity such as port scanning. If the logs show that similar activity was directed at several systems, all around the same time, this would indicate widespread port scanning. The point is that just because a log entry shows activity directed at a specific port, it does not necessarily mean the port was open (that a service was listening on that port) on the system. This is a commonly misunderstood phenomenon, particularly when it comes to widespread scanning activity directed toward ports used by Trojan backdoor applications.

For ease of viewing, a number of freely available utilities will parse this file and make it easier to interpret, even to the point of color coding certain entries. You can Google for various combinations of XP and firewall and viewer to locate one that will meet your needs.

Tip:

The accompanying DVD includes a subdirectory named samples within this topic’s directory. This subdirectory contains a file named nmap_xp_scan.txt, which contains the command line used to launch an Nmap scan against a Windows XP SP2 system (with the firewall enabled), as well as the results of the scan that were sent to STDOUT. Another file named pfirewall_nmap_scan.txt contains a portion of the packets logged on the target system. For reference, the Nmap scan was launched from 192.168.1.28, and the target system was 192.168.1.6.

Mrt.log

In addition to security software such as firewalls, Microsoft also deploys solutions to address issues with malware, one of which is the Malicious Software Removal Tool (http://support.microsoft.com/kb/890830), or MRT. Much like the Stinger tool from McAfee (http://vil.nai.com/vil/stinger/), MRT is not designed to detect and protect against all malware threats; rather, MRT is designed to scan for and address very specific threats, which are listed in Knowledge Base article 890830. You should note that every month or so, the tool is updated to address one or two additional threats, although in June 2008, Version 1.42 of the tool addressed a total of eight threats.

The log file for MRT is, oddly enough, mrt.log, located in the %WinDir%\Debug directory. This log file contains information about the version of the tool, when it was installed, and the results of the scan, as illustrated here:

[Excerpt from an mrt.log file showing the tool version, installation date, and scan results; image not reproduced]
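If you just need a quick summary of an mrt.log file, filtering for a few telltale lines is usually enough. The Python sketch below does exactly that; the marker strings are assumptions about typical mrt.log content (the version banner, a results section, and a return-code line), so confirm them against the log you are examining.

# A quick sketch for summarizing an mrt.log file by pulling out the lines that
# identify the tool version and the outcome of each scan. The marker strings
# below are assumptions about typical mrt.log content; adjust them after
# reviewing the log you are working with.
MARKERS = (
    "Malicious Software Removal Tool",   # version banner for each run (assumed)
    "Results Summary",                   # per-scan results section (assumed)
    "Return code",                       # scan outcome (assumed)
)

with open(r"C:\Windows\Debug\mrt.log", "r", errors="replace") as fh:
    for line in fh:
        if any(marker in line for marker in MARKERS):
            print(line.rstrip())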

This information can be useful to an examiner searching for malware, giving her some sense of what threats the system may have been susceptible to and what threats it may have been protected against.

You may also find a file named mrteng.log in the same directory that contains similar information, albeit without the scan results.

Dr. Watson Logs

The Dr. Watson tool (http://support.microsoft.com/kb/308538/) has shipped with versions of Windows for quite some time, but it generally doesn’t come up in conversation these days. When a program error occurs on a system, the Dr. Watson tool collects information about the system and the program error in a text log file that can then be sent to support personnel for troubleshooting and resolution. This information can also be useful when you’re investigating an issue on a system.

The text log file produced by Dr. Watson is named drwtsn32.log and is maintained in the following directory:

[Default path to the drwtsn32.log file; image not reproduced]

The configuration information for the Dr. Watson tool is maintained in the following Registry key:

[Registry key containing the Dr. Watson configuration values; image not reproduced]

This Registry key contains a number of values that are visible in the Dr. Watson GUI, which you can open by clicking Start | Run and typing drwtsn32. By default, the log file will maintain information from 10 program exceptions. These values indicate to the investigator what she should expect to see if any exceptions have occurred on the system.

When an error occurs, the information saved by the Dr. Watson tool is appended to the drwtsn32.log file. Dr. Watson first writes a section that begins with Application exception occurred: to the file. This section contains information about the program that caused the error, along with the date and time the error occurred:

[Excerpt from a drwtsn32.log file showing the beginning of an exception report; image not reproduced]

Notice that the name of the program that caused the error can include the full path to the executable image, along with a date/time stamp. As we’ve seen in previous topics, this information can be useful to an investigator, particularly in instances in which the program in question is malware or something placed on the system as a result of an intrusion or misuse. Dr. Watson then writes some system information, a list of running processes, a list of modules (DLLs) loaded by the program, and stack dumps to the log file, all of which can be used to troubleshoot the program exception. An investigator can use this information to determine which user was logged in to the system on a certain date, what processes were running (which could show what applications were installed), and what DLLs were loaded by the program that caused the exception (which might show browser helper objects [BHOs] installed via Internet Explorer, any DLLs that were injected into a process to subvert that process, etc.).
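Because each report is appended to the same file and begins with the same marker, splitting drwtsn32.log into individual exception reports is straightforward. The following Python sketch does so using the "Application exception occurred:" string described above and prints the first few lines of each report, which typically include the application path and the date/time of the error.

# A minimal sketch that splits a drwtsn32.log file into its individual
# exception reports using the "Application exception occurred:" marker the
# text describes, and prints the first few lines of each report.
MARKER = "Application exception occurred:"

def exception_reports(path):
    with open(path, "r", errors="replace") as fh:
        text = fh.read()
    # Everything before the first marker is ignored; each remaining chunk is
    # one appended exception report.
    for chunk in text.split(MARKER)[1:]:
        yield MARKER + chunk

for i, report in enumerate(exception_reports("drwtsn32.log"), start=1):
    header = "\n".join(report.splitlines()[:5])   # app name and time stamp lines
    print(f"--- Exception report #{i} ---\n{header}\n")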

Note

The Dr. Watson log can be extremely beneficial in demonstrating or corroborating a timeline of activity on a system. In one case, an individual who had accessed a system had uploaded tools to that system and, when attempting to run some of those tools, had generated application exceptions. We found logs of his access to the system, logs showing when he’d uploaded the tools (including the IP address from which his connection originated), Event Log entries showing the application exception pop-up message, and the Dr. Watson log that showed the application that had crashed. In addition to this information, we also had the user context for the application when it crashed, as well as a list of other applications that were running at the time of the crash. All of this information helped solidify our view of what applications were already in place prior to this person accessing the system, what applications he had added to the system, and when he had used them.

Dr. Watson also produces a crash dump file (user.dmp) that is located in the same directory as the text-based log file. This dump file contains private pages used by the process at the time of the exception; it does not contain code pages from executable files (EXE, DLL, or the like). The user.dmp file can be opened in the WinDbg tool, which is part of the Microsoft Debugging Tools. Note, however, that the user.dmp file is overwritten with each exception, so you will see only the file from the most recent exception. Even so, the available user.dmp file may contain extremely useful information, such as passwords, plain-text or unencrypted data, or indications of user activity.

Cbs.log

Windows Vista and 2008 systems include a Package Manager application used to install and uninstall various packages on those operating systems. The Package Manager maintains its logs in the file %WinDir%\Logs\Cbs\cbs.log. Microsoft provides some excellent information explaining how to analyze the entries in this file (http://support.microsoft.com/kb/928228), and an analyst may find something useful in the file to help explain an issue. For example, the Windows Resource Checker (sfc.exe) logs entries to this file as it verifies during a scan that non-configurable system files have not changed. Microsoft Knowledge Base article 928228 provides an example of a "clean" scan, as well as an example of a corrupted file being found and addressed. The information in this log can be a good source of information for an analyst, illustrating or ruling out issues with corrupted files. According to Microsoft Knowledge Base article 954402 (http://support.microsoft.com/kb/954402), you may also find in the cbs.log file on Windows 2008 systems that some files were not repaired, even though the scan is reported as completing successfully.
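If you only want the Windows Resource Checker entries out of a typically very large cbs.log, filtering on the [SR] tag used to mark those entries (the approach Knowledge Base article 928228 describes) is a quick way to do it. Here is a minimal Python sketch of that filter.

# A short sketch for pulling the Windows Resource Checker (sfc.exe) entries out
# of a large cbs.log by filtering on the "[SR]" tag used to mark them.
LOG = r"C:\Windows\Logs\CBS\CBS.log"

with open(LOG, "r", errors="replace") as fh:
    for line in fh:
        if "[SR]" in line:
            print(line.rstrip())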

Crash Dump Files

We discussed crash dump files in an earlier topic. I thought it would be a good idea to reference them in this topic, too, for the sake of completeness.

In that earlier topic, we discussed ways to configure and generate crash dump files, but in most cases I’ve found that the systems themselves haven’t been modified at all. During some incidents or investigations, if you do find a crash dump file, it might be a good idea to see what it holds. You can use tools such as dumpchk.exe (for Windows 2000/2003, see http://support.microsoft.com/kb/156280; for XP, see http://support.microsoft.com/kb/315271) to verify the dump file and ensure that it is valid. You can then load the file into a debugging tool (such as WinDbg) and use commands such as !process 0 0 to view the list of running processes at the time of the crash, or lm kv to view a list of loaded kernel-mode drivers. Further, you can use tools such as strings.exe, bintext.exe, and grep expressions to locate specific information.
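If the Debugging Tools aren't at hand, a simple strings-style pass over a dump file can still surface process names, file paths, URLs, and similar artifacts. The following Python sketch is a rough ASCII string extractor with an optional keyword filter; it is a triage aid, not a substitute for loading the dump into WinDbg.

# A minimal ASCII "strings"-style extractor with an optional keyword filter,
# useful for a quick pass over a crash dump (or any binary file). It reads the
# whole file into memory, which is fine for triage of modestly sized dumps.
import re
import sys

STRING_RE = re.compile(rb"[\x20-\x7e]{5,}")   # printable ASCII runs of 5+ chars

def dump_strings(path, keyword=None):
    with open(path, "rb") as fh:
        data = fh.read()
    for match in STRING_RE.finditer(data):
        text = match.group().decode("ascii")
        if keyword is None or keyword.lower() in text.lower():
            print(f"{match.start():#010x}  {text}")

if __name__ == "__main__":
    # Usage: python dump_strings.py MEMORY.DMP [keyword]
    dump_strings(sys.argv[1], sys.argv[2] if len(sys.argv) > 2 else None)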

Recycle Bin

Most forensic investigators are aware of the old adage that when a file is deleted, it isn’t really gone. This is even truer with the advent of the Recycle Bin on the Windows desktop. The Recycle Bin exists as a metaphor for throwing files away, as though you’re crumpling them up and tossing them into a wastebasket. The Recycle Bin also allows us to retrieve and restore files that we’ve "accidentally" thrown away. We can open the Recycle Bin, select files that we’ve previously thrown away, and restore them to their previous location.

So, when something is deleted through the shell—that is, when a user selects a file on the desktop or through Windows Explorer and "deletes" it—it isn’t really gone. The file is simply moved to the Recycle Bin, which appears by default in the file structure as the Recycler directory at the root of each drive. In many cases, this directory can provide a significant amount of information relevant to an investigation.

To better understand how information in this directory can be used as evidence, let’s take a look at what happens when a user deletes a file through the shell. Once each user on a system begins to delete files through the shell (as opposed to using the del or erase command at the command line), a subdirectory is created for that user within the Recycler directory; that subdirectory is named with the user’s SID. For example, from the command prompt, the subdirectory will look something like this:

[Command-prompt directory listing showing a Recycler subdirectory named with a user's SID; image not reproduced]

When you open the Recycle Bin from the desktop, the user’s subdirectory is automatically opened for his view. So, if you were to sit down at a user’s laptop with a user’s account logged in and you opened the Recycle Bin to view the contents, you would see the files that the user had "deleted." If you were to switch accounts and repeat the process, you would automatically see the files deleted within the active user account.

When viewing the Recycler directory via an image, you should expect to see a subdirectory for each active user on the system that has deleted files via the shell, as Figure 5.7 illustrates.

Figure 5.7 Example of a Recycle Bin Viewed via ProDiscover


Within each subdirectory, you might see a number of files, depending on the user’s activity and how often the user has emptied the Recycle Bin. Files sent to the Recycle Bin are renamed according to a specific naming convention (http://support.microsoft.com/kb/136517); once you understand that convention, it is relatively easy to identify certain types of files and determine which ones might be of interest. When a file is moved to the Recycle Bin, it is renamed using the following convention:

D<original drive letter><zero-based index>.<original extension>

The filename starts with the letter D and is followed by the letter of the original drive from which the file was deleted, then a zero-based index for the number of the file (i.e., the fifth file deleted will have the number 4). The file maintains the original extension. Further, a record is added to the INFO2 file within the directory, which is a log file of all files that are currently in the Recycle Bin. The index number of the deleted file serves as a reference to the original filename (and path) maintained in the INFO2 file.
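The convention is simple enough to take apart with a regular expression. The Python sketch below breaks a Recycle Bin filename into its drive letter, zero-based index, and original extension; the original path and filename still have to come from the INFO2 record with the matching index.

# A small sketch that decomposes a Recycle Bin filename of the form
# D<drive letter><index>.<extension> into its parts. The original filename and
# path still have to be recovered from the INFO2 record with the same index.
import re

NAME_RE = re.compile(r"^D(?P<drive>[A-Za-z])(?P<index>\d+)(?P<ext>\..+)?$")

def parse_recycle_name(name):
    match = NAME_RE.match(name)
    if not match:
        return None
    return {
        "drive": match.group("drive").upper() + ":",
        "index": int(match.group("index")),      # zero-based deletion index
        "extension": match.group("ext") or "",
    }

print(parse_recycle_name("Dc4.doc"))   # e.g., the fifth file deleted from C:\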

Fortunately, Keith Jones (formerly of Foundstone and Mandiant) was able to document the format of the INFO2 file so that this information would be more useful to forensic analysts. The INFO2 file contains records that correspond to each deleted file in the Recycle Bin; each record contains the record number, the drive designator, the time stamp of when the file was moved to the Recycle Bin, the file size, and the file’s original name and full path, in both ASCII and Unicode.

The INFO2 file begins with a 16-byte header, of which the final DWORD value is the size of each record. This value is 0x320 (stored little endian), which translates to 800 bytes. The first record begins immediately following the header and is a total of 800 bytes in length.

The first DWORD (four bytes) of the record can be disregarded. The file’s original full path and name, in ASCII format, is a null-terminated string beginning after the first DWORD and taking up the first 260 bytes of the record. Opening an INFO2 file, you’ll see that most of the space consumed by the ASCII format of the filename is zeros. These zeros can be stripped out to retrieve only the filename. The rest of the items within the record appear as follows:

■ The record number is the DWORD located at offset 264 within the record.

■ The drive designator is the DWORD located at offset 268 within the record. The drive designator is used to determine which drive the file was deleted from; 2 = C:\, 3 = D:\, and so on.

■ The time stamp for when the file was moved to the Recycle Bin is the 64-bit FILETIME object located at offset 272 within the record.

■ The size of the deleted file (in increments of a cluster size) is the DWORD located at offset 280 within the record.

The original filename in Unicode format consumes the rest of the record, from offset 284 within the record to the end (516 bytes). The Unicode name is stored two bytes per character; for English filenames the second byte of each character is zero, so simply stripping out the null bytes will leave you with the path and name of the file in ASCII. A minimal parsing sketch based on this layout appears below.
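The following Python sketch is a minimal INFO2 parser built directly from the record layout just described (it assumes the 800-byte Windows 2000/XP record size read from the header). It is offered as an illustration of the format, not as a replacement for the recbin.pl script on the DVD or for Rifiuti.

# A minimal INFO2 parser following the offsets described in the text: ASCII
# path after the first DWORD, record number at 264, drive designator at 268,
# deletion FILETIME at 272, file size at 280, Unicode path from 284 to the end.
import struct
from datetime import datetime, timedelta

EPOCH_1601 = datetime(1601, 1, 1)   # FILETIME counts 100-ns intervals from here

def parse_info2(path):
    with open(path, "rb") as fh:
        header = fh.read(16)
        record_size = struct.unpack("<I", header[12:16])[0]   # typically 0x320 (800)
        while True:
            record = fh.read(record_size)
            if len(record) < record_size:
                break
            # ASCII path: null-terminated string following the first DWORD
            ascii_name = record[4:264].split(b"\x00", 1)[0].decode("ascii", "replace")
            rec_num, drive = struct.unpack("<II", record[264:272])
            filetime = struct.unpack("<Q", record[272:280])[0]
            size = struct.unpack("<I", record[280:284])[0]
            deleted = EPOCH_1601 + timedelta(microseconds=filetime // 10)
            # Unicode path: UTF-16LE from offset 284 to the end of the record
            uni_name = record[284:].decode("utf-16-le", "replace").split("\x00", 1)[0]
            yield rec_num, drive, deleted, size, uni_name or ascii_name

for num, drive, deleted, size, name in parse_info2("INFO2"):
    print(f"#{num}  drive={drive}  deleted(UTC)={deleted}  size={size}  {name}")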

The recbin.pl Perl script located on the accompanying DVD will retrieve the various elements from each record, displaying the record number, the time stamp telling when the file was moved to the Recycle Bin (in UTC format; the time zone settings for the system are not taken into account), and the original name and path of the file. The script takes the path to an INFO2 file as its only argument, and the output can be easily manipulated to provide any structure and format that the investigator requires.

Keith Jones has also provided a tool called Rifiuti (the name means trash in Italian) for parsing the contents of an INFO2 file. Rifiuti.exe is freely available from Foundstone.com and will parse the INFO2 file in a format that is easily opened for viewing in spreadsheet format.

Notes from the Underground…

Looking Closely in the Recycle Bin

Investigators should also be on the lookout for files that have been added to the Recycler directory but are not stored within one of the user SID subdirectories, as well as files that do not meet the naming convention for files moved to the Recycle Bin. This could indicate malicious activity by a user or by malware, intending to purposely hide a file. Investigators should also be aware that applications such as Norton AntiVirus might use the Recycle Bin; Norton’s Recycle Bin Protector will place a file called nprotect.log in the directory. Datalifter, a company that produces forensic analysis tools, has an NProtect Viewer (www.datalifter.com/tutorial/bt/NProtect_Using_NProtect.htm) that will parse the contents of the nprotect.log file. The NProtect Viewer is part of the Datalifter .Net Bonus Tools pack.

One of the things I do when digging into an image is to check the last modification time on the INFO2 file. This will tell me when the last record was added to the INFO2 file, which approximates the time that the file in question was moved to the Recycle Bin. If the user’s subdirectory within the Recycler directory contains only the desktop.ini and INFO2 files and the INFO2 file is small, the last modification time refers to the time at which the user last emptied the Recycle Bin (i.e., right-clicked the Recycle Bin and chose Empty Recycle Bin from the context menu).
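Checking that time stamp takes only a couple of lines. The following Python sketch reads the last modification time of an INFO2 file copied out of an image and reports it in UTC; the SID shown in the path is hypothetical.

# A tiny sketch for the check described above: read the last modification time
# of an exported INFO2 file and report it in UTC. The path below is a
# hypothetical example of a file copied out of an image.
from datetime import datetime, timezone
from pathlib import Path

info2 = Path(r"export\RECYCLER\S-1-5-21-...-1004\INFO2")   # hypothetical exported path
mtime = datetime.fromtimestamp(info2.stat().st_mtime, tz=timezone.utc)
print("INFO2 last modified (UTC):", mtime)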
