Performing Analysis on a Budget (Windows Forensic Analysis) Part 1

Introduction

To a number of folks, performing incident response and computer forensic analysis appears to be simply out of reach due to the cost associated with the various commercially available tools. This affects more than just hobbyists and those interested in delving into this fascinating (to me, anyway) realm, however. This affects schools: Computer forensic courses are offered not only at major universities but also at community colleges. The cost of commercial tools affects law enforcement officers and even consultants (such as myself—hey, we all have budgets we have to adhere to). Would it be nice to have access to all of the commercial tools? Sure, it would be, but from a budgetary perspective it just isn’t practical.

It isn’t particularly necessary, either. Commercial applications (EnCase, FTK, ProDiscover, etc.) are just that—tools. Every tool or application has its strengths and its weaknesses, and trained, knowledgeable analysts understand what they need to do before selecting the tool or application to assist them in their analysis. The key to forensic analysis isn’t pushing the button on an application user interface. After all, as I’ve said time and time again, the age of Nintendo forensics is over! The key to forensic analysis is understanding what artifacts are available to you and having a logical, reasoned, and comprehensive plan or process for collecting and interpreting the data. From that perspective, you’re not tied to a particular commercial application (in the absence of some specific requirement that forces you to use it) and can instead explore the use of low-cost or freely available tools and applications that meet your analysis needs.


Tip:

Now is a good time to discuss the topic of anti-forensic tools. Anti-forensic tools are those tools (and in some cases, techniques) that a bad guy will use to make our jobs more difficult, such as modifying file MAC times or wiping data (or "evidence") from the system. Many people think that anti-forensic tools target a particular commercial application; this simply is not the case. Sure, there have been public presentations at popular computer security conferences discussing how to subvert an analyst who uses EnCase, but the fact is that anti-forensic tools and techniques target the analyst, not the tools. An analyst who realizes this is one step ahead of, not behind, the bad guy.

Throughout this book, each chapter has presented, described, and/or demonstrated tools used for particular purposes, but in each case that presentation was simply, "hey, look at what this tool does and see how it’s useful." My goal in this chapter is to fill in the gaps for many readers with a number of other tools that will get them started—tools such as hex editors, packet capture and analysis tools, etc. There are many tools that you can use, and many of them simply weren’t designed with analysis in mind; however, someone has found these tools useful due to some functionality they provide. You should not consider this chapter, or even this book, a comprehensive and complete guide to everything and every tool you could possibly want. At best, this chapter serves to open that door a bit—to show you that there are options beyond those that are out of reach due to the cost of the product or the cost of the training associated with that product.

Finally, if you know me or have read any of my previous books, you know I’m a fan of Perl. Some might even say that Perl is my hammer, and everything I see is a nail. And they’d be right. All humor aside, however, Perl can be an extremely powerful tool, such as when you have to parse several hundred megabytes of Web server logs for indications of a SQL injection attack and decode the hexadecimal or character encoding to decipher the actual injected commands and locate other affected systems. What might have taken you days now takes just minutes. I have seen and demonstrated this through scripts I’ve written, not the least of which ended up as RegRipper (www.regripper.net). I’ve heard others say the same thing—how a coworker’s abilities with Perl reduced days of data manipulation by hand to a couple of hours using a Perl script. This is not to say that Perl is the only programming language available; any programming language you’re comfortable with, such as Python, will work for you. Thanks to Dave Roth, I prefer Perl.
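As a quick sketch of the kind of thing I mean (the log file name is hypothetical, and this is an illustration rather than a production script), a Perl one-liner can pull the suspicious entries out of an IIS Web server log and decode the percent-encoded characters in one pass:

perl -ne "if (/xp_cmdshell/i) { s/%([0-9A-Fa-f]{2})/chr(hex($1))/eg; print }" ex081002.log

Each matching log entry is printed with sequences such as %20 converted back to their ASCII characters, making the injected commands far easier to read.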

Documenting Your Analysis

The subject of documentation is extremely important, particularly because it is something extremely technical people do not like to do. From my perspective, I never liked documenting anything, until I started to see what happened when I didn’t document my analysis. For example, I’d come across a great idea or find some great tool or technique for analysis, and three months later I wouldn’t remember what it was. And I hadn’t documented it! Documentation is a recurring theme throughout this book for the simple fact that it is so vital.

Another important theme throughout this book is the need for repeatability in work, be it data collection or analysis, which is achieved through documentation. Repeatability, which essentially means being able to take the same data, follow the same process using the same tools, and achieve the same results, is a fundamental principle of forensic science. One reason that documentation needs to lead to repeatability is that analysts are not always around. One analyst or examiner may perform work, and then several months later, when that analyst is on vacation or assigned to another task, someone may have a question about the results. Another analyst should be able to step in and, with the original data and the previous analyst’s notes, repeat the same process and (it is hoped) achieve the same results. The same thing applies to work that an analyst may need to revisit a year later; without proper documentation, it is unlikely that the analyst will be able to remember exactly what he or she did.

The first step in performing any forensic analysis is to have a method for documenting what you do. After all, if you perform some analysis but fail to document it, it didn’t happen. As much as technically oriented folks seem to hate to do this, documenting your analysis is paramount to what you do. Documentation must (not should) be detailed and clear enough to allow another analyst or examiner to understand what you did, as well as verify it. In addition, the documentation should be detailed and clear enough for you to pick up your own analysis notes from a year ago (or more) and be able to verify what you did. By verify, I mean using the same data and the same tools (because in your analysis notes, you listed the tools and versions used … right?), you or someone else should get the same results.

Think about that for a moment. Let’s say you perform an analysis, and when the examination is complete and the final report has been delivered, you lock the drive in your safe awaiting final disposition. Then, a month later, a question about something in your report arises, and you need to go back to that data and verify some aspect of your analysis. But it’s been a month (or six months or one year), and your operational tempo is such that not only have you performed several exams since then but also you’ve been assigned another examination, and your original data needs to be provided to another examiner. How embarrassing and difficult would it be if another examiner could not take the same data and, using the same tools, replicate your findings? How would you explain that? Most of us would probably say something like "You didn’t do it right" or "You didn’t use the right version of the tool", right? Well, what would it look like if you couldn’t replicate your own results?

Documenting what you do is important, but documenting what you do to the point where someone else can replicate your findings is even more important. So, how do you do that? I’ve always found being concise to be the best approach. I’ve seen many folks who have been way too verbose in their notes, and what they actually did simply got lost in the prose. Let’s say that you suspect that you may have a Web server that was subject to a SQL injection attack. The most logical place to look for indications of such an attack would be in the Web server logs. If the Web server is Microsoft’s Internet Information Server (IIS) and the backend database is MS SQL Server, then a logical place to start would be to look for the use of the SQL extended stored procedure xp_cmdshell in the Web server logs because that would not be something that you would normally see in the Web server logs. So let’s say that you created a new ProDiscover project, added the image of the Web server to it, and then ran a search across the Web server logs for "xp_cmdshell". Your case notes might look as follows:

- Created ProDiscover 5.0 project "intrusion_20081030". Added Web server image, saved project.

- Searched Web server logs (note full path) for "xp_cmdshell" using PD Search function; several hits found in ex081002.log and ex081003.log.

There you go. Simple, to the point, concise, yet clear and thorough. In the case of this search, you’ve listed what you searched for (xp_cmdshell), what you used to perform the search (ProDiscover 5.0 Search function), and what you found (hits in two log files). Contrast that with notes that say only "analyzed log files" or "searched log files"; entries like that say nothing about what you did. Which log files did you search? What did you search for? If you did a keyword search, what were your keywords? How did you conduct the search? Using grep, or using the Windows Search tool (which I’ve done: Export the log files from an image using FTK Imager and then click Start | Search | For Files and Folders)? What were your results? See how incomplete documentation generates questions rather than answering them? Also be sure to avoid being overly verbose; as I said, I’ve seen notes so verbose that the actual work done and the results were completely lost. Providing dates on each page or with each entry adds to the veracity of the notes, and if multiple team members are working on a major analysis, having each member initial or sign his or her own work can be a real advantage further down the road.
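Returning to the search example for a moment: if you’d exported the log files from the image, even the native Windows findstr command gives you a search that documents itself, because the exact command line (file names hypothetical here) makes a concise case-note entry:

findstr /i /c:"xp_cmdshell" ex081002.log ex081003.log > xp_cmdshell_hits.txt

That one line answers what you searched for, how you searched, which files you searched, and where the results went.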

Including the tools you used and their versions in your documentation makes it easier to re-create and verify the results. The version of the tool you use may make a difference, particularly if there are major updates between versions of the tool. This can be particularly important if you’re using an antivirus scanning application to scan either a mounted image or just a few exported files. Noting the version of the scanning engine as well as the virus definitions file(s) can make a world of difference, particularly when the files you scanned are found two weeks later to be malware.

Another important aspect of documenting your analysis includes substantiating your determinations. Analysts very rarely make a determination of something based on one piece of data alone; in most cases, there are multiple supporting pieces of data that are correlated to develop a determination. For example, if I need to determine when a user was logged into a Windows system, one of the first places I would look is the Security hive file. Parsing a key in this hive file will tell me whether auditing for logins was enabled or not. If it was, I will then look for the appropriate event records in the Security Event Log. I would also check the SAM hive file for indications of the last time the user logged in, as well as the NTUSER.DAT hive file in the user’s profile directory for indications of activity during the time frame in question. All of these can be used to make a determination of when the user was logged in. Furthermore, referencing outside sources, such as Microsoft Knowledge Base articles, is a great way to substantiate the findings of your analysis.
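For example, to check the audit policy, I can export the Security hive file from the image and run it through rip.pl, the command-line component of RegRipper (this sketch assumes the auditpol plugin included with the version of RegRipper you’re using; the file path is illustrative):

rip.pl -r D:\case\Security -p auditpol

The output tells me whether auditing of logon events was enabled, which in turn tells me whether I should even expect to find the corresponding event records in the Security Event Log.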

One tool I’ve found to be extremely useful for documentation is Forensic CaseNotes from QCC Information Security in the United Kingdom (www.qccis.com/?section=casenotes). Forensic CaseNotes is a great tool for keeping examination case notes. It’s free, configurable, and pretty versatile. Generally, the first thing I do after downloading and installing CaseNotes on a new system is to set up the configurable tabs to suit my needs, as illustrated in Figure 9.1.

Figure 9.1 Excerpt from Forensic CaseNotes GUI, after Configuration


As you can see from Figure 9.1, there are a number of tabs available. I keep one for Exhibits: This is all of the media I have, along with any notes with respect to acquisitions. The information for this tab comes from my acquisition worksheets. I also have one for Hours: I use this to record hours spent on work that is directly attributable to the engagement and billed to the customer. This is extremely important for consultants. Perhaps the most important tab is Analysis: This is where I keep track of the actual work I do on a daily basis. Sometimes I’m working analysis on my own, whereas other times I’m working as a team lead and managing work performed not only by me but also by other team members, or perhaps another organization internal to the company. Having everything visible in one place lets me see what’s been done and if additional hours need to be requested.

One beneficial aspect of CaseNotes is that you can add images (screen captures, digital pictures, etc.) to the text fields beneath each tab. When performing analysis, I will often cut and paste the command lines used with various command-line interface (CLI) tools, as well as excerpts from their output, directly into the contents of the tabs; if I have something special to add, such as pin-out diagrams for adapters, I can add those as well. Having these available for another analyst to review can be beneficial for future reference and can be extremely valuable when it comes to report writing. Often, adding a picture or diagram to a report can save a great deal of explanation and confusion.

What else goes into my case notes? Everything. Seriously. This includes URLs or links to information from the Internet that I used as part of my analysis, such as MS TechNet Knowledge Base articles, and even links to "hacking" sites if they were appropriate to the examination (and supported by corroborating data); screen captures, images of anything pertinent to the examination, etc.; and specific versions of tools used, as well as mention of and references to specific tools/scripts I’ve created to assist me in managing the available data. In some cases, if a number of tools or scripts are used, I will archive them in another location for later reference.

One caveat about the use of CaseNotes is that I know of cases in which analysts have been unable to access their CaseNotes file after putting a password on it and closing it. Also, keep in mind that CaseNotes does not store the contents of your notes in a format that can necessarily be opened by other applications; that is, if you use CaseNotes, don’t expect to open the file in Notepad and get something that’s easy to read.

Another tool that is available is NoteCase note manager (http://notecase.sourceforge.net/). I have not tried using NoteCase or, for that matter, other tools for maintaining my case notes. However, if you’re looking for something that allows your case notes to be more accessible, one simple way to do this is to set up a format or checklist in MS Word. Most commercial organizations have MS Word, you can save the files in a number of formats, and a variety of commercial (Adobe) and freeware (PDFCreator) tools will allow you to print the files to a PDF format. The tabs I used in CaseNotes can easily be redone as headers in an MS Word document, and you can even use tables or an embedded Excel spreadsheet to manage and record your hours.

The overall idea here is that you do have something, some method for documenting your work in a consistent, verifiable manner. I have not yet mentioned another very good reason for doing this: What happens if you are called to testify about any of your work? Are you going to remember the specifics and the nuances of an examination or engagement six months or one year after the fact? There have been a number of occasions when I’ve been asked a question (not questioned in court or during a deposition) about some work I did a while ago (four, six, or nine months), and I have needed to refer to my case notes to make sure I got the correct examination and the correct information.

Tools

Incident response and forensic analysis each entail a wide variety of activities. As such, you’ll need the right tool for the job—but which tool works best for you? The purpose of this section is to present a range of tools that are useful for a variety of purposes so that you can get started on your road to discovery.

The tools listed in this chapter are freely available, and most are free for your use within the terms of the license agreement (of course). Some tools are evaluation versions, and the full version may require a nominal fee if you want to continue using the program.

Acquiring Images

Acquiring data is a major part of what any incident response analyst does, and acquiring images of systems is just part of that.

DD

dd is the utility most folks think of when it comes to acquiring images. dd, or "data definition," is a native Linux/UNIX command that allows the user to "convert and copy a file" (according to man pages such as http://linuxreviews.org/man/dd/), and it is very often seen as the "standard" or "granddaddy" utility for this purpose. There are a number of variations of this utility available, some with slightly different capabilities; however, they all perform essentially the same basic function—they allow you to acquire images of drives or volumes.

One version of dd that is available for Windows is from George M. Garner, Jr., and is part of the Forensic Acquisition Utilities he makes available (http://gmgsystemsinc.com/fau/). This package of utilities allows you to acquire images of systems, pipe them over the network (if you do not have local storage available), use compression, and generate and verify computational hashes to ensure the integrity of the acquired data.
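As a sketch of what an acquisition with the FAU version of dd might look like (the option names here reflect the FAU documentation as I recall it, so verify them against the version you download; the device and paths are illustrative):

dd if=\\.\PhysicalDrive0 of=F:\case\drive0.img --md5sum --verifymd5 --md5out=F:\case\drive0.img.md5

The hashing options compute an MD5 hash of the data as it is read and then verify the written image against that hash, giving you integrity documentation for your case notes in the same step as the acquisition.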

Tools & Traps …

Using dd for Live Images

Most folks think that tools such as dd are meant only for acquiring images of drives removed from systems—hook the drive up to a write-blocker and use dd to acquire your image. This is, of course, the preferred methodology, but in a number of cases it may not be possible. On one engagement, due to the nature of the customer’s network infrastructure and the impact that taking a system down and offline in order to acquire an image of the hard drive would have had, we opted to use the native dd command (on SUSE Linux 9) to acquire a live image of the physical hard drive. We made sure that we thoroughly documented the reason and process for this approach, including documenting the versions of the utilities used (dd, split, etc.) in our case notes and report.
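A live acquisition of that sort might look something like the following sketch (device name and segment size illustrative; this is not the actual command line from that engagement):

dd if=/dev/sda bs=4096 conv=noerror,sync | split -b 2000m - image.

The conv=noerror,sync options keep the read going past bad sectors, padding short reads with nulls, and split breaks the image into manageable segments (image.aa, image.ab, and so on)—all of which is exactly the kind of detail that belongs in your case notes.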

Dcfldd (http://dcfldd.sourceforge.net/) is another freely available version of the dd tool that also runs on Windows. Dcfldd was written by Nick Harbour. The Sourceforge Web page for dcfldd describes it as an "enhanced version of the GNU dd," with additional features such as imaging/wiping verification, hashing, piping of logs, etc. All of these functions are extremely useful not only for ensuring the integrity of the image and allowing you to pipe the image off of the system (when acquiring an image of a live system, you do not want to write that file to the actual hard drive you are acquiring) but also for removing or wiping the image file when you’ve completed your analysis.
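A dcfldd command line for acquiring a live image might look something like the following sketch (device path and file names illustrative; the hash=, hashlog=, and conv= options are standard dcfldd features, but check the documentation for your version):

dcfldd if=\\.\PhysicalDrive0 of=F:\case\image.dd hash=md5 hashlog=F:\case\image.md5.txt conv=noerror,sync

The hash is computed while the image is being written, so there is no need for a second pass over the (live) source drive.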

Tools & Traps …

dd Format Images

More and more often, forensic examiners are seeing the need for some kind of standardization in all aspects of what we do. This applies to image acquisition as well. A responder’s toolkit or "fly-away" kit should include a number of methods for acquiring images, such as hardware write-blockers (i.e., those that provide for straight drive-to-drive imaging, as well as those that allow the responder to connect the drive to be imaged to a write-blocker and acquire the image using software), as well as means (tools and processes) for acquiring live images. In addition, response teams should include in their standard operating procedures a standardized format to which images should be (if possible) acquired.

Why is this important? Prior to the recent release of updated versions of forensic analysis applications, I had an opportunity to assist with an examination in which one drive from a system was acquired using the dd format and the other drive from the same system was acquired using a proprietary format specific to one forensic analysis application. At the time, this situation could have restricted me to using one specific forensic analysis application in my analysis.

Using a consistent format for acquiring images is important for other reasons as well. First, it adds an air of professionalism in the eyes of the customer, as well as your peers. Believe me, when I first sat down to look at the specifics of the examination I was going to assist with and saw two hard drives from the same system acquired in different formats, my first thought was "did these guys even have a process?"

Also, don’t think for a minute that you’re the only one who’s ever going to see these images. I’ve performed a number of examinations where after everything was complete and the final report was delivered, the customer wanted the images rather than just having me wipe the drives and ship them back. Always be prepared to return the images to the customer or to turn the images over to someone else for analysis; having consistent image formats (along with your documentation) is simply more professional.

Second, requiring a consistent image format naturally leads to documentation that addresses not only the process used to acquire the image but also the justification for any deviation from the standard. Overall, this is simply more professional and thorough.

FTK Imager

Both FTK Imager and FTK Imager Lite are available for free from AccessData (www.accessdata.com/downloads.html). FTK Imager Lite is a "light" version of the FTK Imager tool that can be unzipped and burned to a CD or copied to a thumb drive for use. There is also a support article on the AccessData Web site that lists which files you will need from the FTK Imager archive if you want to run that tool from a CD or thumb drive, in order to have one consistent tool and version at your disposal.

I’ve found FTK Imager to be extremely valuable for a number of uses. When I’ve had to perform examinations of images acquired using EnCase and not had EnCase readily available (or simply not wanted to use it), I’ve opened the .E0x files in FTK Imager and either extracted specific files or acquired an image to dd format. I’ve also used FTK Imager to verify file systems of acquired images, including an image of a SUSE Linux 9 system running the ReiserFS. Of course, I’ve also used FTK Imager as an image acquisition tool, either running it in conjunction with a properly employed write-blocker or running it from a CD and acquiring a live image of a Windows system to an external USB-connected hard drive (or some other media/location).

FTK Imager can also be used to open VMware .vmdk files. I have responded to engagements where systems running within the VMware virtual environment were part of the network infrastructure, and were even among the systems we needed to acquire and analyze. In such cases, perhaps the easiest way to "acquire" those systems is to simply copy the .vmdk (and .vmem, if available) files off of the host system. With FTK Imager, you can choose to Add an Evidence Item to view the file system and extract specific files, or choose Create Disk Image to acquire the .vmdk (or .E0x image) file to raw dd, SMART, or .E0x format. This can be extremely useful when using commercial analysis tools that may not recognize the .vmdk format or that may be more cumbersome than is necessary for the work you intend to perform.

If you don’t want to acquire your own images, or simply don’t have the facilities to do so, there are places you can go on the Internet to download images provided for tool testing or as part of challenges. This is a great thing that some very smart folks have been providing. After all, how better to communicate an idea, concept, or process of analysis than to describe it and then provide some facility for folks to try the "hands-on" approach to learning? Most of the images are posted with some kind of challenge or series of questions involved to guide the participant’s examination. As a consultant, I am acutely aware of where direction such as "find all suspicious/malicious activity" can lead—to lots of billable hours that you can’t recover. The posted challenges provide not only an excellent resource for honing your analysis skills but also a great example of what an examination should look like from the beginning.

One of the first locations I found for freely available image files was the CFReDS (Computer Forensics Reference Data Sets) Project at NIST. The "Hacking Case" (www.cfreds.nist.gov/Hacking_Case.html) image files include not only a dd format split image but also an EnCase/EWF (Expert Witness Format; Expert Witness was the precursor to EnCase) image for those who want to practice with other tools.

Another site that includes some specific images and test scenarios is the Digital Forensics Tool Testing (http://dftt.sourceforge.net/) site, set up by Dr. Brian Carrier. This site provides some very specific test images for testing forensic analysis tools, but as with other sites, the provided images can also be used as a basis for developing and honing analysis skills, as well as providing familiarization with various forensic analysis applications.

Lance Mueller provides two interesting practical application challenges via his ForensicKB.com blog site (www.forensickb.com/search?q=practical). Lance has graciously provided the practical scenarios, along with small (~400 MB) images acquired from a Windows XP system in compressed .E0x/EWF format. If you do not have a valid EnCase dongle, not to worry: FTK Imager will open these files easily, allowing you to export files from the image or simply generate a dd format image from the EWF format file. The comments on those blog posts also provide insight into what other examiners are looking for and have found.

Image Analysis

Once you have an acquired image, have verified your image hashes and the file system, and have documented your entire process, you will need some means of opening the image and performing the basic analysis functions necessary for the work you’re doing. Throughout this book, we’ve discussed various tools for doing this—for opening the entire image file and viewing the file system structure as a whole, running searches, etc., or simply extracting specific files for analysis (Registry hive files, Event Log files, etc.).

The SleuthKit

The SleuthKit (TSK; www.sleuthkit.org/) tools were written by Dr. Brian Carrier and provide the backend components for the Autopsy Forensic Browser. TSK is a set of command-line tools that allow you to examine and analyze file and volume systems within image files. The TSK tools are also available for Windows systems; however, at the time of this writing, the Autopsy Forensic Browser has not been ported to a native Windows format (although all of the tools can be compiled on Windows using the Cygwin subsystem).

The TSK tools can be used on a Windows system in much the same manner as they are on Linux systems; however, there are a couple of caveats. First, according to Dr. Carrier, there is an issue with "globbing" at the command line, which requires you to list, in order, all of the component files of a split image. This means that if you are analyzing a split image that consists of multiple segment files, you will need to list each of them as follows:

command [options] image1 image2 image3 …

This is where tools such as FTK Imager come in handy, in that you can reassemble the split image files into a single unified image file. FTK Imager is adept at reassembling some split image file formats, such as its own as well as those produced by tools such as Guidance Software’s EnCase. You can also use the native Windows type command to reassemble split image files that were acquired in the raw format; for example (segment names illustrative):

type image.001 > image.dd
type image.002 >> image.dd
type image.003 >> image.dd

TSK can open raw (i.e., dd), Expert Witness (i.e., EnCase, referred to as EWF), and AFF file system and disk images (www.sleuthkit.org/sleuthkit/desc.php). The TSK tool fls.exe (Version 3) for the Windows platform reports at the command line that it can parse raw (dd) image files, EWF image files, and split raw image files; if your build supports it, the -i list option will print the supported image format types:

fls -i list

There are a number of documents available at the SleuthKit Web site, as well as other locations online, that describe how to use the various command-line tools in combination to perform analysis of an image. For example, the File System Analysis Techniques (http://wiki.sleuthkit.org/index.php?title=FS_Analysis) and File Activity Timelines (http://wiki.sleuthkit.org/index.php?title=Timeline) references provide a great deal of extremely useful information about the TSK tools. Perhaps the best source of information about the tools is the TSK Wiki (http://wiki.sleuthkit.org/index.php?title=Main_Page).

Some straightforward usage examples of the TSK tools include using dls to collect unallocated space from an image file. A command along the following lines (image and output file names illustrative) can be used to extract the unallocated space from an acquired image file:

dls image.dd > image.unalloc

Extracting the unallocated space from the acquired image file can be useful for performing string/grep searches on just that data, such as when searching for credit card numbers or IP or email addresses, or for file carving.

The following command (again, the file name is illustrative) will give you information about the file system within the image file:

fsstat image.dd

The fsstat command provides file system, metadata, and content information about the image file. For example, running the command against an acquired image of a Windows XP system returns file system information, including the file system type (NTFS), the OEM name, the version, and the volume serial number:

[fsstat output not reproduced here]

On a live system, you can obtain much of this same information using fsutil.exe. For example, the following commands (drive letter illustrative) return information similar to that of fsstat.exe, albeit from a live system, including the volume serial number:

fsutil fsinfo volumeinfo C:\
fsutil fsinfo ntfsinfo C:

The volume serial number is generated when the volume is formatted (it is derived in part from the format time) and can be used, in association with other documentation, to help identify the acquired image.

Perhaps the most useful tool for analysts that is part of the TSK tools is fls.exe (http://wiki.sleuthkit.org/index.php?title=Fls), which lists the file and directory names in a file system in a pipe-separated format that allows timeline information to be generated using mactime.pl. For example, a command along the following lines (file names illustrative) will run through the entire image file, recursing through directories and subdirectories:

fls -f ntfs -m C:/ -r image.dd > bodyfile.txt

The -m option allows you to prepend the file and directory listings with the name of the mount point used (in this case, C:\). Most often, the output of the command will be redirected to an output file and then run through a tool such as mactime.pl or ex-tip (a timeline creation tool created by Mike Cloppert; https://www2.sans.org/reading_room/whitepapers/forensics/32767.php) to sort the file system information into a more readable and understandable timeline format.
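From there, generating the timeline is a single additional step (file names as in the sketch above; the -b option tells mactime.pl which body file to read, and -d produces comma-delimited output that can be opened in a spreadsheet application for review):

mactime.pl -b bodyfile.txt -d > timeline.csv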

Tools & Traps …

Timelines

Tools such as TSK’s fls.exe, mactime.pl, and Mike Cloppert’s ex-tip provide extremely useful open source functionality for creating timelines of file system activity. And because they are open source, they are easily extensible. For example, any other time-stamped source of information from a Windows system can be included in the timeline as well; all that needs to happen is that the proper format (as illustrated on the TSK Wiki page for fls.exe) be followed, and Registry keys (as well as values whose data includes time stamps), Event Log entries, and even the contents of other files can be added (ex-tip includes a filter for McAfee OnAccessScan log files, and filters can be written for other AV log files, the setupapi.log file, etc.). In addition, data can be manually added to the "body file" prior to filtering and sorting with a tool such as ex-tip, allowing the analyst to include additional data in the timeline. Normalizing entries to a common format also allows them to be imported into other tools, such as Zeitline (http://projects.cerias.purdue.edu/forensics/timeline.php).
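For reference, the TSK 3.x body file format documented on the TSK Wiki is a pipe-delimited line of MD5|name|inode|mode_as_string|UID|GID|size|atime|mtime|ctime|crtime, with the times as Unix epoch values. A hand-built entry (all values illustrative) might look like this:

0|C:/WINDOWS/system32/svchost.exe|3456-128-1|r/rrwxrwxrwx|0|0|14336|1225283288|1225283288|1225283288|1225283288

Any time-stamped data you can reduce to this form, with a zero in any field you don’t have, will sort into the timeline alongside the file system entries.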

For other useful TSK tool commands, CyberGuardians has a two-page PDF "cheat sheet" available (www.cyberguardians.org/docs/ForensicsSheet.pdf).

There is also a Windows version of the Selective File Dumper tool, called FUNDL (for "File Undeleter"), which uses the TSK tools; it is available on SourceForge (http://sfdumper.sourceforge.net/fundl.htm).

Tools & Traps …

Image Formats

Earlier in this chapter, I mentioned the need for standardization in image formats. The purpose of this is to achieve consistency and professionalism through a standardized process. Some organizations may rely solely on one commercial forensic analysis application and may have an excellent reason for standardizing on a proprietary image format. Other organizations, such as consulting firms, may opt to standardize on a more accessible format (i.e., the dd format) to allow a wider range of access to forensic analysis applications, which in turn allows for verification, etc. In 2008, Technology Pathways released ProDiscover Version 5.0, which included the ability to open EWF format images. Following the DFRWS 2008 conference, Dr. Michael Cohen released a version of his PyFlag application that runs natively on Windows systems. In April 2008, Dr. Brian Carrier released versions of the SleuthKit tools that were compiled to run natively on Windows. These tools, although (at the time of this writing) they cannot be used with the Autopsy Forensic Browser (the Cygwin versions of the tools must be used for that), provide command-line access to dd, EWF (via libewf), and Advanced Forensic Format (AFF; via afflib, www.afflib.org/) images.
