Sunday, November 25, 2012

No Partition Table? No Problem

Warning: Veteran forensics professionals probably already know this stuff. Posting for those newer to the field. All are welcome to read and comment, however.

So you've got a disk, or maybe just an image of a disk, and your forensic tools say there are no partitions on it. What now? Windows isn't going to mount it, so how do we see whether any files are on the disk? Some Windows-based forensic tools (X-Ways Forensics, for example) will probably be able to identify any lost partitions on the disk, but you still can't mount it natively. Fortunately, there are still ways to get at the data, if it's there at all. Using skills I learned in SANS Forensics 508, I was able to do everything I needed to do.

The hard drive I'm working with is from a friend's computer. He asked me to troubleshoot it, or perhaps just shoot it, as it had suddenly stopped booting. On Patch Tuesday, the installed Avast antivirus flagged some files as rootkits and asked to remove them, which my friend allowed. Avast notified him to reboot in order to finish the disinfection. When he rebooted, the computer stopped after the POST screen and reported there was no boot device.

Windows Disk Management in the management console showed the entire disk as being unallocated. I made a raw image of the drive and am investigating the cause. As part of that, I wanted to mount the image in order to conduct antivirus scans, since Avast had reported rootkits.

Since my friend's computer was unable to find the operating system when trying to boot, I loaded the image into WinHex and navigated to byte offset 446 to look at the partition table. I found the entire table was gone, right down to the 55AA signature normally found at offsets 510 and 511, just past the end of the table.
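
If you want to make the same check without a hex editor, a couple of dd commands on a Linux box will do it. This is just a quick sketch and assumes your raw image is named image.001:

dd if=image.001 bs=1 skip=446 count=64 2>/dev/null | xxd
dd if=image.001 bs=1 skip=510 count=2 2>/dev/null | xxd

The first command dumps the 64 bytes of the partition table starting at offset 446 and the second dumps the two-byte signature at offsets 510 and 511. On a healthy MBR, that second command ends in 55 aa.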

So, is all hope lost? Nope. I knew this to be a fairly new computer with Windows XP as the only operating system. In WinHex, I searched for the text "NTFS" to try to find the start of a partition. As expected, I found the boot sector of the formerly active partition at sector 63.
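
By the way, if you'd rather do that signature hunt from the Linux side, plain old grep will find it in a raw image (the image name here is just an example):

grep -a -b -o NTFS image.001 | head

The -a switch treats the binary image as text, -b prints the byte offset of each match and -o prints only the matched string. The string "NTFS" sits three bytes into an NTFS boot sector, so a hit at byte 32259 points to a boot sector at byte 32256, which is 32256 / 512 = sector 63. You'll get plenty of other hits deeper in the image, which is why I pipe it to head; it's the low offsets that land a few bytes past a sector boundary you care about. The Sleuth Kit's sigfind tool can do a similar search if you prefer.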

I opened the SANS Forensics SIFT Workstation (ver. 2.14) in VMware Workstation and connected the disk holding my image to it through the Shared Folders option. The SIFT Workstation, for those who don't know, is a ready-made forensic work environment in a VMware virtual machine. An .iso file for creating a live CD is also available.

Since the partition started at sector 63, I figured it should be easy to mount in Linux, and it was. At the command line, I navigated to the directory where my image was located like this:

cd /home/sansforensics/Desktop/VMware-Shared-Drive/disk

There I found my image, named image.001. Before attempting to mount, I decided to use the mmls tool from the Sleuth Kit to see if it could recognize a partition in the image. Even using the sector offset option, mmls still couldn't find any partition.
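
For anyone following along, the mmls attempts looked roughly like this (treat the image name as an example):

mmls -t dos image.001
mmls -t dos -o 63 image.001

The -t dos switch tells mmls to look for a DOS-style (MBR) partition table and -o gives it a sector offset to start from. With the table completely wiped, neither form turned anything up.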

So, I decided it was time to attempt to mount the partition I knew was in the image. At the command line, I typed the following:

mount -t ntfs-3g -o loop,ro,show_sys_files,streams_interface=windows,offset=32256 ./image.001 /mnt/windows_mount

A few explanations of that command line are probably in order. For those unfamiliar with the mount command, it may seem odd that I specified an offset of 32256 instead of 63. The mount command does not work with sectors; it works with bytes. Since each sector was almost certainly 512 bytes and the partition started at sector 63, you multiply the number of sectors by the bytes per sector: 63 x 512 = 32256.
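
If you'd rather let the shell do the math, or set up the loop device yourself before mounting, something along these lines should also work. This is just a sketch; the loop device name will be whatever losetup reports on your system:

echo $((63 * 512))
losetup -r -o 32256 -f --show ./image.001
mount -t ntfs-3g -o ro /dev/loop0 /mnt/windows_mount

The first line prints 32256, the second attaches the image read-only as a loop device starting at that byte offset (-f grabs the first free loop device and --show prints its name) and the third mounts whatever device losetup gave you.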

Also, show_sys_files allows you to see all the Windows "meta" files.

The streams_interface=windows option allows you to view Alternate Data Streams in Linux using the ntfs-3g driver just as you would in Windows.

The mount location I specified, /mnt/windows_mount, is a directory that already exists in the SIFT Workstation for mounting images. One of the great things about the SIFT is that the /mnt folders and the /cases folder are shared over the network, so you can access them from your host machine as network locations.

Anyway, the procedure outlined above worked like a charm. The partition in the image mounted immediately and I found the filesystem within it was intact. I was able to navigate it in Linux, as well as access it over the network from my host Windows machine.

I mapped a drive letter on my host to the /mnt share so I could easily access the partition from the SIFT or run Windows-based tools against it from the host machine. The great thing about this is that I was able to run log2timeline-sift against my image in the SIFT environment while scanning the mounted image with antivirus from the host Windows machine, all at the same time.

Now, as I said, there is no way to mount this disk (or image) natively in Windows, at least as far as I know. It can be done with other tools, though. I tried FTK Imager first, since it is normally able to mount partitions within image files, but it was unable to mount this one without a partition table being present. ImDisk, a fantastic free tool, lets you mount images like this and specify the byte offset of the partition, just like the Linux mount command. It has a Windows GUI and is extremely easy to use. I tried it and it successfully mounted the image to an accessible drive letter. There are probably other tools that would work, but I didn't have any others to try.
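
For what it's worth, ImDisk also ships with a command line tool, so a mount like this one can be scripted. If I remember the switches correctly, it would look something like the line below, but double-check the syntax against the ImDisk documentation before relying on it (the path is just an example):

imdisk -a -f C:\images\image.001 -b 32256 -o ro -m X:

Here -a attaches a new virtual disk, -f points to the image file, -b is the byte offset of the partition within the image, -o ro makes it read-only and -m assigns the drive letter.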

My investigation into this computer is still ongoing. I will post later what, if anything, I find out. I hope I'm able to determine the cause of this incident and get it fixed. I hope this post made a little sense. I'm always happy to respond to questions, so feel free to comment below.

Thursday, November 15, 2012

Malware Forensics Field Guide for Windows

I don't write many book reviews because I don't feel like I'm very good at doing them. However, I've been fortunate to read some very good books lately and wanted to tell you about them. First, I posted my review of Practical Packet Analysis a few days ago and now I want to tell you about another excellent book: Malware Forensics Field Guide for Windows Systems.

For those who like to cut to the chase, my recommendation is you should buy this book. Now. It's that good.

As readers of this blog know, I enjoy learning about malware investigation and forensics, and I get excited anytime I hear a new book on the subject is coming out. I almost never pre-order books, preferring to wait until they're actually available, but this was one of those books I just knew was going to be good and I wanted it as soon as possible, so for a change I actually pre-ordered.

Malware Forensics Field Guide for Windows was written by the authors of Malware Forensics: Investigating and Analyzing Malicious Code, which came out in 2008. In both cases, the publisher is Syngress, one of my favorite publishers for tech books. As with the first book, this one is written by Cameron H. Malin, Eoghan Casey and James M. Aquilina. Curtis W. Rose served as the technical editor.

This is not a "second edition" of the previous book. While it occasionally makes reference to the first book, it is its own separate work. As its name implies, this book is meant to be taken with you when you go out on the job and includes checklists, sample field notes and more. I'll say more about them later, but the checklists, field notes and guides are just outstanding.

The book has six chapters, which doesn't sound like much, but each chapter is good-sized and chock-full of great information. Chapter One is Malware Incident Response. The authors do a great job of covering the collection of volatile data, process information and non-volatile data. What's nice here is they don't just tell you what you should do, they also tell you how to do it. In many cases, step-by-step guides walk the reader through important investigative processes. The authors are careful to steer you toward scientifically sound means of investigation instead of just turning you loose with the tools.

My personal background is in dead disk forensics, not incident response, although I'd love to be involved in IR work. I found Chapter One very valuable as a relative newcomer to incident response. The included field notes and interview questions are a huge help to newcomer and veteran alike, helping you make sure you've "covered all the bases" in your response. The checklists are just great, reminding you of things you should always check during your assessment of a system, such as collecting volatile data, checking Windows Prefetch files and so on.

Also included is a chapter on memory forensics. Many exciting things have happened in memory forensics lately, led by the great Volatility Framework team, as well as Matthieu Suiche and his MoonSols company (win32dd, etc.). Other cool software has come from the likes of Mandiant with its Redline tool and HBGary with its Responder software. These days, a book like this couldn't be written without talking about memory forensics, and the authors do an excellent job of covering the material. Use of the tools mentioned above, and others, is covered in great detail. As with most of the other chapters, this one ends with a Pitfalls to Avoid section, a checklist and interview questions section, a toolbox section (extra detail on the tools mentioned in the chapter) and a selected reading section directing the reader to more information on the topic.

A chapter devoted to "traditional" post-mortem forensics is next. This chapter takes you through the investigation of a suspected victim computer, concentrating on disk-based artifacts. Web history, OS and application logs, the Windows Registry and Prefetch files are among the sources of possible evidence discussed, along with things like autostart locations and keyword searching. Something I really liked about this chapter, and the whole book, is the way the authors continue to stress the need to use a repeatable, scientific process and to document it. They also talk throughout about the importance of validating your results.

Legal considerations when conducting forensic investigations are covered in Chapter Four. I was glad to see this chapter included, as I believe it's far too easy in the heat of the moment to jump straight to the fun part (the investigation) without giving any thought to how the law views what you're doing. Federal wiretap laws, HIPAA, PCI, state laws and much more are covered. It's nicely done and helps the reader appreciate the potential legal pitfalls of this work. A very large book could be dedicated to the legal concerns we face alone, so obviously not every possible legal topic is covered here, but the authors do a great job of getting the point across, giving the reader a good basic grounding in the law and a better idea of what questions to answer before proceeding with an investigation.

Next is File Identification and Profiling. I enjoyed the entire book, but this may have been my favorite chapter. Again, extensive note-taking and correlation of findings are stressed. The focus of this chapter is on studying a suspect file: finding out what it is, what it does and so on. Hashing, file headers and file metadata are discussed, along with much more. The section on file obfuscation was very helpful to me, as it covered the various ways the functionality of a file can be hidden or obfuscated through packing and encryption. The chapter wraps up with tips on profiling PDF, Microsoft Office and Windows .chm files.

Analysis of a malware specimen is the focus of the final chapter and wow, it is awesome. The authors provide a huge amount of great information on methods for performing static and dynamic analysis of specimen files. As in the other chapters, tools are suggested and step-by-step guidance is given for some of them. Automated "sandbox"-style testing is also covered, using Buster Sandbox Analyzer, ZeroWine and online sandboxes like the ones available from GFI and Norman, along with means of defeating file obfuscation. There is far more to this chapter than I could possibly tell you about here.

As previously mentioned, each chapter (except the legal chapter) ends with sample field notes, interview questions, a toolbox (details about tools discussed in the chapter) and suggested reading. The notes and checklists are great, but there isn't much you can do with them in a book. Fortunately, you can go to the book's website and request electronic copies by clicking the Field Notes link at the top of the page. I received the five PDF files of notes and checklists by email after requesting them. The PDFs are in full color and very readable. They're an excellent resource and I know I'll use them.

In conclusion, I truly enjoyed reading this book and learned a lot from it. I've only touched on the highlights in this review; there is so much more to this book than I've mentioned. I strongly recommend it to anyone whose job entails responding to malware-related incidents, as well as to anyone who simply has an interest in the subject. It is well written, easy to follow and chock-full of information I know I'll refer back to many times. I see the authors have another book, Malware Forensics Field Guide for Linux Systems, scheduled to come out soon. I guarantee I'll be buying that one too.

Sunday, November 11, 2012

Book review: Practical Packet Analysis

This is a copy of the review I put on Amazon tonight for the book Practical Packet Analysis (2nd edition). I didn't say this on Amazon, but I would recommend you buy the book straight from the publisher, No Starch Press, because you get the electronic version for free with your purchase of the physical book. Anyway, here's my review:

After reading this book, I have a much better understanding of the capabilities of Wireshark, but I really learned so much more. Chris Sanders does a great job introducing the reader to basic networking concepts, such as the OSI model, data encapsulation, ports, MAC and IP addresses and so on. He teaches the basics and builds from there in a way that lets even those very new to the material keep up.

Networking has always been something I've known just a little about, but I've never been anywhere close to an expert. I knew how to set up a basic Windows network, but that was about it. I took SANS Network Forensics (FOR 558) last year, which uses Wireshark quite a bit, and learned a lot. Looking back, I can see how much better off I would have been had I read Practical Packet Analysis before the class. So much of what was discussed in class is covered in PPA in clear, concise explanations that would have made the forensics course easier for me.

This really is one of the best tech books I've ever read. I don't say that lightly, as I've read many good IT and computer forensics books. It is well written and easy to follow. The author has .pcap files available for download from the publisher's website so the reader can follow along with the examples in the book. To me, this made learning the material that much easier, letting me see firsthand what was being taught.

Another thing I like about this and other books from the publisher, No Starch Press, is the graphics. Screenshots are often very difficult to make out in other publishers' books, but in all of my No Starch books they are easy to see.

Practical Packet Analysis is a must-read for anyone wanting to learn how to sniff and analyze packets. Highly recommended!

I am currently working on a review of Malware Forensics Field Guide for Windows Systems. I hope to finish and post it sometime next week.

Wednesday, October 10, 2012

Quick Check-in

Hello everyone. I'm back for yet another blog post after a long absence. I wanted to post a great big giant THANK YOU to all the great open source forensic projects out there. You all are my heroes and I truly appreciate all that you make available to the digital forensics community.

I've been very busy as of late working on a case I was hired for and recently concluded. I wish I could tell you about it, as it's kind of interesting. Unfortunately, the most interesting parts are the things I really can't talk about. I'll just call it an employee misuse of a company computer situation and leave it at that.

Among other things, the employee in question was using his company-owned laptop to surf various types of porn, as well as using MS Word to do a little amateur porn story authoring. There were allegations of some financial misdeeds as well, and I recovered a large number of files to help the company conduct its investigation.

I relied in no small part on the knowledge I've gained from books like the Windows Forensic Analysis series and Digital Forensics with Open Source Tools in working this case. Furthermore, I stand grateful to all the free and open source tool authors out there whose work benefited me greatly. Such awesome programs as Log2Timeline, RegRipper, the Sleuth Kit and the SANS SIFT Workstation virtual machine were a huge help to me in this and most all my other cases.

The super awesome Volatility Framework crew has been rocking the proverbial house this month with their Month of Volatility Plugins. First, Volatility 2.2 was released at the beginning of this month and they're releasing tons of new plugins all month long. The blog posts at the official Volatility Labs blog accompanying these releases are just incredible. A great thank you and salute in no particular order to AAron, Jamie (Gleeda), MHL, Andrew and everyone else involved for using your talents to produce one of the greatest software projects ever and an amazing blog.

I've had the opportunity to do some public speaking lately and I find I'm really enjoying it. Public speaking used to make me quite nervous, but I'm pretty comfortable with it these days. I spoke to one group about protecting your home computer from malware, and this week I did a training session on identity theft for a local bank. In both cases, I decided not to use PowerPoint or other visual aids. I believe it was Harlan Carvey at the WACCI conference a couple years ago who called the PowerPoint-free presentation "going commando." I liked that term and I enjoyed his presentation that day. It's fun to speak and interact with the crowd, and I find I do a better job of that when I'm not using visual aids to distract me. Besides, I'm terrible at making PowerPoint slides anyway, so I'm better off going commando for that reason as well.

That's all I've got for now. Now that I have a little more free time, I have a couple of forensics-related projects I hope to get started on. I hope to be back with new blog posts about them "soon."

Monday, July 9, 2012

Knowing Normal

I heard talk at the SANS DFIR Summit a couple weeks ago about "knowing normal." What does that mean? Knowing what your systems and networks are doing each day and what their stats should look like. That way, even if you don't really know how to recognize something bad, you'll still know when you're not seeing what you expect to see. This will (hopefully) lead you to investigate the oddity and find the cause. You don't have to be an expert on what you're looking at; you only need to know it doesn't look like it should.

I had the opportunity last week to help a local business track down strange issues on its network. It's a small business with no actual IT staff. Rather, some of the employees try to manage things as best they can and call in help when a problem is beyond their ability to fix. The business runs an Untangle router/firewall, which automatically emails an employee each day with stats for the previous day. On this day, the employee noticed that the network traffic was more than double what he expected to see on a "normal" day.

The week of Patch Tuesday is one of those times the employee expects to see a rise in network traffic when the 15 computers in the business receive their updates. On a typical workday, the network normally sees about 1.0 to 1.1 gigabytes of traffic. On this particular workday, the traffic rose to 2.47 gigabytes without any obvious reason. The employee knew something was wrong, but he didn't know what. Things just weren't "normal."

I was given access to the Untangle control panel and began looking at the event logs for each section. I found nothing remarkable until I opened the panel for Application Control Lite. This section monitors various network protocols for applications like chat programs, peer-to-peer networking and so on. I was sure I had found the problem as soon as I looked at the protocol logs: one workstation on the network was making repeated attempts to contact a UK IP address via the Soulseek peer-to-peer protocol. This was definitely not normal.

A look at other system reports showed the second most popular destination port through the Untangle gateway was port 16464 (port 80, naturally, was number 1). Once again, not normal.

I went to the troublesome computer and found a fake antivirus program on it. I decided to create an agent using Mandiant Redline to collect volatile data and a memory image prior to beginning cleanup. My plan was to use Redline to examine the data it collected and then later use the awesome Volatility Framework to continue studying the data.
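
For the curious, that later Volatility pass will probably start out something like this. The file name is just an example and the profile is an assumption I'd confirm with imageinfo first:

vol.py -f memory.img imageinfo
vol.py -f memory.img --profile=WinXPSP3x86 psscan
vol.py -f memory.img --profile=WinXPSP3x86 connscan
vol.py -f memory.img --profile=WinXPSP3x86 malfind -D dump/

The imageinfo plugin suggests the right profile, psscan and connscan carve for process and network connection objects (including ones a rootkit may be hiding) and malfind looks for injected code, dumping anything suspicious to an existing output directory.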

While waiting for the Redline agent to finish, I posted a tweet that I was dealing with an apparent malware infection using a peer-to-peer protocol to communicate out of the network. One of my friends, @dfirn00b, asked if a port in the area of 16464 was part of the picture and I said it was. He told me it was very likely the ZeroAccess rootkit and said he'd created an Indicator of Compromise (IOC) for use with Redline to detect it. He pointed me to a location on disk where I would likely find files related to the infection, and they were there. I happily accepted an offer to try out the IOC and returned the test results when Redline finished creating the report. The IOC hit on several items in this collection, so I'd call it a success. He has a blog post up on the DFIR Journal blog talking about this test and how the IOC was put together.

I collected some network traffic using Wireshark on a Linux laptop I connected to the network with a hub. I haven't had time to review that traffic yet, but plan to later this week. I then rebooted the machine and loaded SMART for Linux from a live Ubuntu Linux CD-ROM. I imaged the hard drive and then rebooted once more, this time to a BitDefender antivirus live CD. It found quite a few trojans and deleted them for me. I tried booting to a Microsoft Security Essentials live CD, but it would never load.

I finally rebooted back to the installed Windows XP OS and ran GMER to look for any further signs of rootkits. Finding none, I ran ComboFix and, later, Malwarebytes. I removed the installed antivirus (a 10-year-old copy of Norton Corporate) and installed Microsoft Security Essentials. The cleanup had left a few malware-related files behind and I removed them manually.

Since that day, a close eye has been kept on the network logs and no further sign of malware phoning home has been seen. Through further scans with the ESET Online Scanner and others, the system does seem to be clean. I do plan to make a forensic timeline and further investigate the memory image, hard drive image and network traffic as time allows. The business is in no hurry for results and I have other "paying" cases to get done first. I'll add a new post here if and when I find something of value.
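
When I do get to the network capture, the first pass will probably be to carve out just the traffic on that suspicious port and see who the workstation was talking to. Something like this should do it (the capture file name is just an example):

tcpdump -nn -r capture.pcap -w port16464.pcap 'port 16464'
tcpdump -nn -r port16464.pcap | head

The first command reads the saved pcap and writes only the port 16464 traffic to its own file; the second gives a quick look at the endpoints, with -nn keeping everything as raw IP addresses and port numbers.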

So, all this started with someone knowing what "normal" was and, even better, knowing when they weren't seeing it. The simple act of reviewing boring logs every day helped find and fix the problem. Kinda cool, don't ya think?

Sunday, July 1, 2012

Back from the Summit

What a week this has been! I attended the SANS Digital Forensics and Incident Response Summit in Austin, Texas this week and had an amazing time. I told Rob Lee I didn't think he and the SANS team could top last year's Summit, but somehow they managed to do it. That doesn't diminish last year's outstanding conference by any means; they just raised the bar for such gatherings. Kudos to Rob and everyone at SANS for truly doing a fantastic job. Also, thank you very much to my friend Andrew Case for making it possible for me to be there.

The quality of the presentations was very good. I was familiar with most of the presenters and expected no disappointment. I won't review them all, as that has been done on other blogs. The only downside, if you will, was that with simultaneous presentations going on in two tracks, it was hard to attend every talk you wanted to see. That's not really a complaint, as having two tracks certainly gave a lot more people the opportunity to present and allowed for a wide range of topics. Fortunately, SANS maintains Summit archives.

One of the talks I did want to mention was the opening day keynote by Cindy Murphy, the Forensic 4cast Digital Forensic Examiner of the Year. I've never heard more post-talk discussion about a keynote speech at any conference, and I heard high praise for it from everyone I talked to. It was obvious she had put a lot of thought and heart into it. I want to congratulate her on an awesome speech and her well-deserved award. Congrats Cindy, I'm really proud of you!

I also wanted to mention that the talks by Andrew Case on Mac memory analysis and by Sarah Edwards, entitled "When Macs Get Hacked," were excellent. I know little about the Macintosh platform, but they both did a great job relating the material to those of us without a heavy Apple background.

Finally, Melia Kelley and Alissa Torres both rocked the place in their respective talks. They are awesome presenters and I hope to have the opportunity to see them present again.

It was cool seeing so many women on the stage this year. I respect and admire each of them so much and am happy to see them stepping to the forefront in this field. They've always been there, they just haven't always gotten the recognition they deserve.

Something I really wanted to talk about, though, has nothing to do with computers or forensics. I want to tell you what an awesome group of people I was privileged to spend time with while at the Summit. As Cindy mentioned, it's so great to be around people who "get" you. The camaraderie I experienced this week with people I truly respect and look up to was amazing. Some I'd met face to face previously, while some I "knew" online only. I was also happy to meet some people I hadn't known previously and I hope to maintain a lasting friendship with them as well.

As you may have guessed, my time in Austin was greatly enjoyed. It was fantastic seeing my close friends Joe Garcia and Brad Garnett again. The three of us have spent a lot of time together over the last couple years, both online and off and I really enjoyed "getting the band back together".

The opportunities for networking are always one of the best things about a conference. I'm not talking about "looking for a job" networking necessarily, although that can come about as well. The networking I'm talking about is the kind where you gain good friends: people you can count on when you need help, and who give you the chance to be there for them when they need it. At the SANS events I've been to over the last few years, that type of networking has never been in short supply. Kudos to the people at SANS for knowing how to balance the program with the networking to make an event that results in both learning and friendship.

Tuesday, March 20, 2012

From the All's Well That Ends Well department

Hello all, it's me with another drive-by blog post. Yes, it really has been seven months since my last post here. I won't be surprised if no one ever reads this. Time for writing, or for that matter almost anything else, has been in short supply. Between my regular job and teaching at Lincoln Trail College, I really haven't had the time to devote to testing and writing.

I took on a data recovery job recently that allowed me to practice my forensics skills a bit, and the end result was rather humorous to me. I was asked to recover two missing Excel spreadsheets from a Windows 7 laptop. The owner told me the two spreadsheets had been saved to both the user's desktop and a USB jump drive. These spreadsheets held important data that would take a very long time to recreate, so of course the owner was very interested in recovering them. He gave me the laptop, but didn't have the USB drive with him, so he said he would bring it to me the next day. I assumed this was going to be an easy job, and it was ... sort of.

I took the laptop to my home office and hooked its hard drive up to my analysis system via a Wiebetech UltraDock v4. I started X-Ways Forensics, created a new case and brought the drive into the case, then created a forensic image and replaced the drive in the case with the newly created image. I selected Refine Volume Snapshot in X-Ways and set it up to do a particularly thorough file system data structure search and a file header signature search, in hopes of carving the missing files out of unallocated space. Upon completion, I found no sign of the missing files.

Upon receipt of the 124 MB Lexar-brand USB drive, I imaged it and added it to my X-Ways case. Following the same procedures as above, I expected I would find the missing files. No dice. Nada. Nothing.

The owner of the files had been quite certain the files had been in the two locations previously described, so I decided to look at the volume shadow copies from around the time the files went missing and see if I could locate anything. I used the VHD method of accessing the VSCs presented by Harlan Carvey in Windows Forensic Analysis 3E, and also in the DFIR Online session back in December. The method worked like a charm, but once again the data failed to be where I expected it.
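
For anyone who hasn't seen the VHD method, the rough outline (from memory, so check Harlan's write-up for the real procedure) is to wrap the raw image in a VHD footer with Microsoft's free VhdTool, attach the resulting VHD read-only through Disk Management and then expose the shadow copies with vssadmin and mklink:

VhdTool.exe /convert image-copy.001
vssadmin list shadows /for=F:
mklink /d C:\vsc19 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy19\

Note that /convert modifies the file in place by appending a footer, so run it against a working copy of your image. The F: drive letter and the shadow copy number 19 are just placeholders; use whatever letter Windows assigns when you attach the VHD and whatever shadow copy device names vssadmin lists. The trailing backslash on the mklink target matters.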

About to give up, I realized I had failed to do something fairly obvious. I checked the Recent folder under Users\<user>\AppData\Roaming\Microsoft\Windows\ to see if I could determine where the spreadsheets had been opened from in the first place. I found lnk files for both spreadsheets in the Recent folder, and both had been opened from the "G:" drive, which I at first assumed to be the Lexar USB drive. I looked further into the lnk files and found the volume name of the G: drive had been Cruzer, which I knew to be the name used on various SanDisk drives, not Lexar drives. I checked the volume name on the Lexar drive and it was not Cruzer.
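
As an aside, you don't have to dig through lnk files in a hex editor to get at that kind of detail. exiftool will parse Windows shortcut files and print their metadata, including the drive type, volume label and target path. With a made-up file name, it's as simple as:

exiftool budget.xlsx.lnk

Tools written specifically for lnk parsing will give you even more, but for a quick look at where a shortcut pointed, that output is usually enough.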

I called the owner and asked if there was any chance at all the files had been saved to a different USB drive, possibly a SanDisk Cruzer. He didn't think so at first, but he had such a drive in his pocket. He inserted it into a computer at his office and guess what ... there were the two files!

This certainly wasn't your traditional data recovery, but in the end it worked out. I never did find any evidence the files had been saved on the laptop hard drive, but at least the owner got his files "back." Even though the files were never really gone, they were gone as far as the owner knew, and a lot of work would have been done to recreate them if not for a little digital forensic analysis. So, as the title of this post says, all's well that ends well.

Finally, I wanted to mention two awesome books I got recently. As previously mentioned, Windows Forensic Analysis 3E came out and I have been reading it. This version is a companion to WFA 2E and has lots of great info on volume shadow copies, file analysis, malware detection and much more. Harlan has never written a bad book as far as I'm concerned. Each one has excellent, well-researched info of use to the forensics/IR crowd.

Also, another great book I got recently is Practical Malware Analysis. I've only had time to read the first couple of chapters, but I love this book. It is well written and easy to follow. Given the potential difficulty of the material, Michael Sikorski and Andrew Honig did a fantastic job writing it in a way that noobs and veterans alike can learn from.

I'm hoping it won't be another 7 months before I post again. I'm going to try to make time to write now and then, but we'll see.