Sunday, April 10, 2011

IEF - great new updates including Gigatribe Chat!

Being a user of IEF since near its inception, I have seen the product grow from an "add-on" tool to one that should almost be considered one of our primary forensic toolsets.  Jad has added some new features, one of which will be critical in our own lab - a tool which Jad claims is the first to extract deleted Gigatribe chat messages (although I know there is a parser tool called Gigaview over at QCC).  We have noticed a significant increase in the use of Gigatribe, as evidenced by many CP investigations, and welcome any additional tools to assist in investigating it.  I also note that he has expanded the capability of recovering deleted chat messages from RAM dumps, unallocated space and file slack.  Despite a lot of work being done in the last few years in the area of Windows RAM acquisition and analysis, why do I feel there is still much more to be discovered?

Other improvements to IEF (from his site) include:
  • Gigatribe chat now supported (in Standard and Portable editions)
  • Facebook Email search improved
  • Firefox formhistory.sqlite search improved
  • Unicode support added for Facebook Email, Snippets, and Wall Posts / Status Updates (Unicode is converted to the appropriate HTML code)
  • Minor user interface bugs in IEF/Report Viewer fixed
For those still using the v3.x versions, it's definitely worth considering the jump to v4.1.  From my point of view, the practice of digital forensics is very time-consuming and methodical.  If there is a tool which is reliable and can speed up several facets of my investigation, it is worthy of consideration.  My thoughts...

Sunday, November 14, 2010

Windows Event Logs and F-Response

I have been looking to better define my ability to identify, preview and analyze Windows logs.  When looking at identifying key information/evidence to acquire from a target system, I found that the area of Windows Event logs was not as well defined as I would like, in terms of methodology.  I set out to find a solution for analyzing Windows Event logs on both remote systems, as well as over a write-protected connection (such as F-Response).  I am aware that Windows has its built-in Event Viewer; however, my experiences with this application have been less than satisfying for the purpose of forensic documentation.

Here's what I found...

Tools I used:
1.  F-Response (Tactical Edition).
2.  Windows 7 Ultimate (Examiner's machine).
3.  Windows 7 Starter Edition (Target machine).
4.  Cross-over cable.
5.  Event Log Explorer v3.3  (by FSPro Labs).

a.  I connected my Examiner laptop and the Target laptop via a cross-over cable, and allowed each laptop to self-assign an IP address.
b.  Started F-Response on each machine with the write-protected Subject and Examiner's dongles.  Connection established:

c.  Started Event Log Explorer on the Examiner's machine.  When the program starts, it automatically enumerates the local (Examiner's) machine. 

d.  To avoid any confusion/contamination, I removed my local machine from the Computer Tree.  This leaves the column blank, ready to enumerate your remote machines, or other locally identified event log files.

e.    I saved the workspace with an identifiable name.

f.  In  Event Log Explorer, I selected   File > Open Log File > Direct  and pointed to the TARGET system that I had mounted using F-Response. 

**  The Windows Event Logs were found at \Windows\System32\winevt\Logs **

g.  I then loaded one of several different Event Log files - in this case, I loaded "system.evtx".  As with any log, I was presented with thousands of entries - 25,385 to be precise.
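
Before loading a log into any viewer, it can be worth confirming the file really is in the newer EVTX format (older .evt files are structured differently).  A minimal Python sketch, demonstrated here against a synthetic file rather than a live target:

```python
# Sketch: confirm a file is a Windows Vista+ event log (.evtx) before
# loading it. EVTX files begin with the 8-byte signature b"ElfFile\x00".
import os
import tempfile

EVTX_MAGIC = b"ElfFile\x00"

def is_evtx(path):
    """Return True if the file starts with the EVTX header signature."""
    with open(path, "rb") as f:
        return f.read(8) == EVTX_MAGIC

# Demo against a synthetic file (a real check would point at
# \Windows\System32\winevt\Logs\system.evtx on the mounted target).
demo = os.path.join(tempfile.mkdtemp(), "system.evtx")
with open(demo, "wb") as f:
    f.write(EVTX_MAGIC + b"\x00" * 56)   # real header chunks are far larger

print(is_evtx(demo))  # True for a well-formed .evtx
```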

h.  And this is where I found Event Log Explorer really takes off in terms of features.  I am familiar with using a more robust log analysis tool such as Splunk, so I am aware of the importance of learning about filters.  Filtering is straightforward, and I'd recommend it. 

And this leads to filtered results:

i.  I also used color coding to better identify my filtered Event Logs, based on any of the criteria present in the Event Log.  The coding is completely configurable.

j.  Reporting - you can export your Event Log in one of many formats including HTML, tab separated and Excel formats.

While Event Log Explorer excels at offering flexible filtering and reporting, I personally found the most useful feature to be the ability to merge logs.  When logs are merged, and then properly filtered, I found that I could create a very good timeline of a system's event logging.  In combination with tools such as RegRipper, these tools are invaluable when attempting to conduct (in Chris Pogue's terms) your "Sniper Forensics".
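
As a rough illustration of the merge-then-filter idea, here is a small Python sketch that combines several tab-separated log exports (such as those Event Log Explorer can produce) into a single time-sorted timeline.  The column names ("Time", "Event", "Source") and the timestamp format are hypothetical - adjust them to whatever headers your export actually contains:

```python
# Sketch: merge tab-separated event log exports into one timeline,
# sorted by timestamp. Column names and timestamp format are assumed.
import csv
import os
import tempfile
from datetime import datetime

def load_export(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f, delimiter="\t"))

def merge_timeline(paths, time_field="Time", fmt="%Y-%m-%d %H:%M:%S"):
    rows = []
    for p in paths:
        for row in load_export(p):
            row["_source_file"] = p          # remember which log it came from
            rows.append(row)
    return sorted(rows, key=lambda r: datetime.strptime(r[time_field], fmt))

# Demo on two tiny synthetic exports.
d = tempfile.mkdtemp()
for name, lines in {
    "system.tsv":   ["Time\tEvent\tSource",
                     "2010-11-14 09:15:00\t6005\tEventLog"],
    "security.tsv": ["Time\tEvent\tSource",
                     "2010-11-14 08:02:11\t4624\tSecurity"],
}.items():
    with open(os.path.join(d, name), "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

timeline = merge_timeline([os.path.join(d, n) for n in ("system.tsv", "security.tsv")])
print([r["Event"] for r in timeline])  # earliest first: ['4624', '6005']
```

The same sorted list can then be filtered by Event ID, Source, or time window before reporting.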

I believe that the noted programs offer a useful solution for previewing and collecting evidence from Windows Event Logs in a live-analysis situation.  My preference will always be to conduct a full analysis at the lab; in circumstances where one needs to conduct an at-scene triage, this combination of tools may assist.

ps.. I was wondering how this tool (Event Log Explorer) would work within WinFE.  I had just burned off my most recent version, so this may have to wait.  I can tell you that I tested the installation directory of Event Log Explorer by copying it to a USB drive and trying the tool on a different computer.  It worked very well; however, it prompted for my registration key (I entered the 30-day demo and continued on). 

Some references on Windows Event Logs
Event Logs (TechNet)
Event Viewer  (TechNet)

Monday, October 25, 2010

Blackberry IPD files and FTK 3.2

I was curious as to FTK's ability to analyze RIM Blackberry IPD files.  I imported 7 backup files - some were "Autobackups" and others were manually created backup files.  The process of importing them was as simple as pointing to a live directory, and you are given the option of creating an image of the files or working from the "live files".

After the processing was complete (took about 1 minute), here's what the files looked like within FTK:

When you open up each of the IPD archives, you will note 89 different fields; some are populated with information and some are empty.  The fields which have data within them will most often produce HTML files which start with rows_0000000_0000xxx.html.  The .html report can be viewed and read quite readily in the "filtered" or "natural" mode.  There are some other formats as well.  The image below shows the database-type format in which the data appears.

An exception to the "row_000..." format is noted below.  You will note that the directory structure indicates that the parsed data is stored within "blobs".  Each of the 89 fields had a folder titled "blob", although many did not contain any data.  In the absence of any notable file structure, I would think that this refers to Binary Large Objects (strings of binary data with no associated structure).  In the example below, the "Content Store" folder had several files.  Several of the files were images, with file names blob_Data_00000xx, where xx was a number; the numbers appeared to be incremental.  (The image was purposely blurred for privacy.)  Other files within the folder include EXIF data for the images - FTK "Properties" show that these files were created by the FTK carving process.  FTK was able to carve images from the IPD files, although the small number of images I retrieved (4) would suggest that the images were not truly "carved" from any deleted area.  The properties of the carved images show that they were carved from "Blackberry backup files/blackberrybackup.ipd>>tables>>Content Store>>blobs>>blob_Data00000xx.>>Carved [120].jpeg" (where xx is an incremental number).  The path references a full-size photo, whereas the carved image appears to be thumbnail size.
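
For anyone wanting to validate FTK's parsing against the raw data, the IPD container itself is fairly simple at the top.  Per publicly circulated format notes - which you should verify against a known-good backup - the file opens with an ASCII signature line ending in 0x0A, then a one-byte version and a two-byte database count.  The endianness of that count is an assumption in this sketch, and the demo runs against synthetic bytes, not a real backup:

```python
# Sketch: read the header of a BlackBerry .ipd backup. Layout per
# public format notes; verify against a known-good file before relying
# on it. The big-endian database count is an assumption.
import struct

IPD_MAGIC = b"Inter@ctive Pager Backup/Restore File\x0a"

def parse_ipd_header(data):
    if not data.startswith(IPD_MAGIC):
        raise ValueError("not an IPD file")
    offset = len(IPD_MAGIC)
    version = data[offset]                                     # one-byte version
    (db_count,) = struct.unpack_from(">H", data, offset + 1)   # assumed big-endian
    return {"version": version, "databases": db_count}

# Demo on synthetic header bytes: version 2, 89 databases -- matching
# the 89 fields FTK showed per archive.
synthetic = IPD_MAGIC + bytes([2]) + struct.pack(">H", 89)
print(parse_ipd_header(synthetic))  # {'version': 2, 'databases': 89}
```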

When you open up the Case Overview tab, you'll see that many of the files have been categorized into the noted fields.

Email was nicely extracted, and displayed in traditional FTK format.  The fields appeared to parse out quite nicely.

And lastly, FTK has always been known for its indexed search capability.  The following two images reinforce the power of FTK in finding results within the compound IPD file.  I would suspect that use of the indexing feature will make it easier to identify areas where evidentiary information may be stored, whether the information was parsed out or not.

When you compare the output with that of ABC (Amber's Blackberry Converter), FTK does not parse out nearly the amount of information available from ABC; but it does appear to provide a more "forensic" approach, whereby the data can be more easily validated against the raw data.

I'm guessing that more fields will be retrievable as versions develop.  Overall, another great improvement in FTK 3.2.

Friday, October 22, 2010

Updated Windows Registry and Mac resources & Jad's Software....updated

As several sites have rightfully pointed out, AccessData has made a huge jump ahead with their recent release of FTK Imager v3.0 (not to mention FTK 3.2 and their most recent "Volatile" tab).  Just finished testing it today by mounting physical images and using VFC to virtually boot XP and Win7 systems.  Flawless!  While wandering around their site (actually looking for updated RSR files to add to their most recent Registry Viewer version), I stumbled across two additional documents that I believe are very worthy of a good read - or at least printing out as a permanent reference.

Registry Quick Find Chart - a very recently updated 34-page reference documenting Registry locations for the standard 5 Registry files.  The document has a few new columns - one which lists which versions of Windows the reference pertains to (ie: XP, Vista or Win7) and a second which states when the Registry reference is updated (immediately, when document opened, at logon...).  This document would also be a great starting reference to initiate further research on Registry locations and extractable artifacts.  D/L it....know it....print it and keep it handy!

Mac System Artifacts - another reference document which provides 7 pages of Mac Artifact locations.  With FTK's amazing ability to parse out the Mac OS (including Plists), this document is another one to print off.  Updated in 2010.

Jad has also updated three of his applications:
Internet Evidence Finder (IEF) - updated to v3.6 to handle recent updates to Facebook Live chat.  Commercial - Cdn $49.00; Free for Law Enforcement.
FChat - updated to v1.20.    Commercial - Cdn $29.99
FJF - Facebook JPG finder - updated to v1.2.1.  Currently free for use.

Sunday, October 3, 2010

Kindle 3G Wireless Reading Device - forensically speaking

Having just acquired the new model of Kindle, I got to wondering what kind of information was stored on the device and if necessary, how would I go about accessing this information in the most forensically-sound manner possible.  Here's what I found.

1.  Using a Digital Intelligence Tableau Ultrablock USB write-blocker, I connected my forensic computer to the device through the micro-USB cable that was provided with the Kindle. 
2.  Realizing that it was necessary to power on the device, I did so.  I noted the date/time to compare this with the date/time stamps that were likely to change upon boot. 
3.  When powered on, I immediately checked to ensure the 3G/Wireless was turned off.  Select "Menu", toggle the five-way controller up to "Turn Wireless Off" and select the five-way controller (center button).  Alternatively, I could conduct the acquisition within our Faraday tent.
4.  Using FTK Imager v2.9.0.5, I identified the physical drive attributed to the Kindle.

5.  As noted, the drive was recognized as "Kindle Internal Storage" with a size of 3240MB.  I noted that this differs from the stated size of the device (4GB).  Specifically, Amazon states the device has "Storage 4GB internal (approximately 3GB available for user content)."  I then acquired the physical drive in RAW (DD) format to allow a more robust selection of analysis tools.
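
Once the RAW image is written out, hashing it in chunks keeps memory use flat even on multi-gigabyte images, and the digest can go straight into the acquisition notes.  A quick Python sketch (the file here is a tiny stand-in for the real image):

```python
# Sketch: hash a raw (dd) image in fixed-size chunks so multi-GB files
# never have to fit in memory; record the digest with acquisition notes.
import hashlib
import os
import tempfile

def hash_image(path, algo="md5", chunk=1024 * 1024):
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

# Demo on a small synthetic "image" of zero bytes.
demo = os.path.join(tempfile.mkdtemp(), "kindle.dd")
with open(demo, "wb") as f:
    f.write(b"\x00" * 4096)

print(hash_image(demo))          # 32-hex-character MD5 digest
```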

Here's what the partition looks like:

And contents of the "documents" directory:

6.  Made note of the filesystem and VBR header, as shown in the following screenshot.

The filesystem is FAT32, formatted with mkdosfs - DOS formatting within a Linux environment.  From looking at the USER partition which was available, I'm wondering whether the SYSTEM partition is an ARM Linux kernel(?).
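
The mkdosfs fingerprint comes straight out of the VBR's OEM name field.  A small Python sketch that pulls a few of the standard FAT32 boot sector fields from the first 512 bytes of an image (demonstrated on a synthetic sector rather than the Kindle image itself):

```python
# Sketch: parse a few standard fields from a FAT32 Volume Boot Record
# (first 512 bytes of the partition), e.g. the OEM name that betrays
# mkdosfs formatting.
import struct

def parse_fat32_vbr(sector):
    assert len(sector) >= 512 and sector[510:512] == b"\x55\xaa"  # boot signature
    return {
        "oem_name":      sector[3:11].decode("ascii", "replace").rstrip(),
        "bytes_per_sec": struct.unpack_from("<H", sector, 11)[0],
        "sec_per_clus":  sector[13],
        "fs_type":       sector[82:90].decode("ascii", "replace").rstrip(),
    }

# Demo on a synthetic boot sector.
vbr = bytearray(512)
vbr[3:11]    = b"mkdosfs "
vbr[11:13]   = struct.pack("<H", 512)
vbr[13]      = 8
vbr[82:90]   = b"FAT32   "
vbr[510:512] = b"\x55\xaa"

info = parse_fat32_vbr(bytes(vbr))
print(info)  # OEM name shows 'mkdosfs'
```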

While admittedly, my Kindle had not been populated with a lot of user interaction, the Kindle definitely does not appear to readily give up information.  It was obvious what books and documents were on my Kindle, and what the last document I accessed was, but as far as other artifacts,  my brief analysis was not overly productive.  I have surfed the Internet, opened several websites and likely populated the device with considerable Internet History.  I could not readily locate any of this history.

Just for the heck of it, I threw Jad's Internet Evidence Finder at it - nothing.  I'm thinking that a GREP search for Internet history might have more success.  I'm also interested in running a search for my wireless access point SSID to see what other artifacts might show up.
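
A starting point for that kind of GREP search might look like the following Python sketch, which scans a raw image in overlapping chunks for URL-shaped strings (the overlap guards against a hit spanning a chunk boundary).  The sample image here is synthetic:

```python
# Sketch: a crude "GREP" for web history across a raw image -- scan in
# overlapping chunks so URLs straddling a chunk boundary aren't missed.
import os
import re
import tempfile

URL_RE = re.compile(rb"https?://[\x21-\x7e]{4,200}")

def carve_urls(path, chunk=4 * 1024 * 1024, overlap=256):
    hits = set()
    with open(path, "rb") as f:
        tail = b""
        while True:
            block = f.read(chunk)
            if not block:
                break
            for m in URL_RE.finditer(tail + block):
                hits.add(m.group().decode("ascii", "replace"))
            tail = block[-overlap:]   # carry the chunk boundary forward
    return sorted(hits)

# Demo on a synthetic image with one URL buried in filler bytes.
img = os.path.join(tempfile.mkdtemp(), "user.dd")
with open(img, "wb") as f:
    f.write(b"\x00" * 100 + b"http://www.example.com/page" + b"\xff" * 100)

print(carve_urls(img))  # ['http://www.example.com/page']
```

The same pattern of chunked scanning would work for an SSID search - just swap the regex.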

Other things I found:
- IMEI (3G) information on the device.
-  lots of deleted information.
- a significant number of dictionary terms (including in the unallocated space).

Eric Huber has a posting on the Kindle at A Fistful of Dongles - more great information.

More to come....perhaps I'll see how a boot CD such as Caine interacts with the device.  I'm going to continue to see what the imaged USER partition is willing to give up in terms of forensic artifacts.  ps..EnCase will also be involved.

Any thoughts or ideas are welcome.

Thursday, September 23, 2010

Caine v2.0 - Newlight released !

The newest version of Caine, a forensics live Linux distro, has been released.  Some of the 20 new tools include MountManager, SSDeep, Air v2.0.0, Log2Timeline and a whole pile of scripts which are accessed from the file browser.  A full list of tools is available on their site.  If you use WinTaylor, the version's been updated to v2.1.  Downloading the new version as I type.  For the price (open source), it's a "must have" for your forensic arsenal.  It was less than two months ago that Caine was the only toolset I could get to recognize a significantly corrupted 500GB portable USB drive, and then carve out images, WordPerfect files, raw images, etc.  Directions are available on the site for creating a USB version for a netbook.  More to come as I try out the new features.  Download the ISO (Caine and NBCaine v2.0) here.

Wednesday, September 8, 2010

Google Voice - Call phones - lovin' the log!

I decided to give Google Voice a try - was kind of difficult to ignore the "reminder" that popped up each time I logged into a GMail account.  Here's what I learned:
- CallerID shows the originating number coming from (760) 705-8888.
- Voice quality was good.  During my tests, I spoke with a colleague and we estimated the lag as 1 second. 
- very easy to use.

Now, in Googling the phone number from the call display, I noted that the prank/harassing phone calls are starting already.  So I decided to see what I could find in terms of call history on the originating (source) computer.  Like so many programs, Google Voice leaves a log - and a nicely detailed log at that!

Location/Path:  (Copied from EnCase-USER Acct edited for privacy)    
GMail Phone\C\Users\USER\AppData\Local\Google\Google Talk Plugin\gtalkplugin-c1598929683.log.bz2

Call History from within Google Account (required to be logged in).

Inside the bz2 archive is single log file containing a wealth of information including:
- IP address of the computer used (including port). Also includes NAT'ed IP address.
- full information on the computer used, including CPU details, OS, GPU details, etc.
- date/time stamps (GMT)
- associated GMail address.
- list of all network adapters on computer and their associated IP addresses.
- reference to address "+1XXX" (XXXX - numbers from the 10 digit phone# removed for privacy)
- log is fully timestamped and appears to contain a lot more information.
- each call generated an individual log file within its own bz2 archive.
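
As a rough sketch of pulling those artifacts out programmatically, the following Python decompresses a log archive and harvests IPv4 addresses and e-mail addresses with generic regexes.  The actual gtalkplugin log layout should be confirmed against your own captures; the demo below runs on a synthetic archive:

```python
# Sketch: open a gtalkplugin-*.log.bz2 archive and pull out IPv4
# addresses (with optional port) and e-mail addresses. The regexes are
# generic, not tied to any documented Google Talk log format.
import bz2
import os
import re
import tempfile

IP_RE    = re.compile(rb"\b(?:\d{1,3}\.){3}\d{1,3}(?::\d{1,5})?")
EMAIL_RE = re.compile(rb"[\w.+-]+@[\w.-]+\.\w+")

def scan_gtalk_log(path):
    data = bz2.BZ2File(path).read()
    return {
        "ips":    sorted({m.group().decode() for m in IP_RE.finditer(data)}),
        "emails": sorted({m.group().decode() for m in EMAIL_RE.finditer(data)}),
    }

# Demo on a synthetic archive with one invented log line.
log = os.path.join(tempfile.mkdtemp(), "gtalkplugin-demo.log.bz2")
sample = b"[0412/091500] account: someone@gmail.com local addr 192.168.1.23:5222\n"
with bz2.BZ2File(log, "wb") as f:
    f.write(sample)

result = scan_gtalk_log(log)
print(result)  # one IP:port, one e-mail address
```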

I found the log file quite detailed.  Activating the phone feature, making a 1-1/2 minute call and disconnecting generated approximately 247 log entries.  As much of the information was new to me, I imported the log file into Splunk on my MacBook Pro.  The log was parsed seamlessly (with the exception of a few stray left-over lines, which appear to have been caused by a right square bracket).  This is the 3rd time I've used Splunk this last week - absolutely invaluable. 

Definitely more to look through......

Wednesday, August 18, 2010

Want to learn Python....for free?

It's been more than a few years since I took my programming classes and, to be honest, it's difficult to keep a skill unless you use it often.  So I decided I'd like to take a programming class and took a close look at Python and Perl.  Has anyone tried to find a good post-secondary class in either language?  It didn't go well.

I persisted in my search and believe I've found a fantastic opportunity...and it's free!!!  You may ask "Yeah, but how good can a free class in programming really be??".  Well, I'll answer that in three letters - M.I.T.

That's right, MIT offers several courses under their MITOPENCOURSEWARE program, from Aeronautics and Astronautics to Writing and Humanistic Studies.  Of course, they have several courses under the area "Electrical Engineering and Computer Science."  Take, for instance, the course "Introduction to Computer Science and Programming - Course #6.00".  The course includes full video of classroom lectures, assignments, exams and solutions - they even have transcripts of the classroom lectures.  You can also download everything so you can study offline if necessary.  Did I mention that it's free?  How about a course specifically in Python, as taught in January 2010 - A Gentle Introduction to Programming Using Python - Course #6.189.  There are classes in Java, C++, and numerous other areas.

Their Privacy and Terms of Use and information about the Creative Commons licence can be found here.

Monday, August 2, 2010

"The Missing Link" in my computer forensic training.....Network Forensics!

Over the years, I've taken several classes in computer forensics (vendor specific and neutral), information security and networks.  Back in April, I realized what was missing - specific training in acquiring and analyzing network-based evidence in a methodical and reproducible format.  Oh sure, I've used many of the current network tools, but I've always wondered if there was a better way to collect the evidence.  That's when a colleague of mine pointed out a new training course, specifically aimed at meeting this need; and perhaps completing my "Circle of Forensics".

In July 2010 I had the privilege of attending the new Network Forensics course - Forensics 558 - offered by the SANS Institute (Washington, DC).  The instructors were Jonathan Ham (co-author of the course) and Alan Ptak, who provided outstanding training over the 5-day course.  I can tell you, it was very different from my previous training in "traditional forensics".  For over a year now, we've been preparing, training and researching various techniques for acquiring targeted data in scenarios which require us to specifically target and forensically acquire the data which will form our evidence (and ensure we do not miss anything, overtly or covertly).  In the day and age of TB-sized hard drives, FDE, volatile data, etc., the move to identify and target key evidentiary information is needed.  Jonathan and Alan's training not only identified key areas to focus on, but also several techniques for acquiring and analyzing this information in a more forensically-sound manner.  The course requires a "moderate" degree of Linux familiarity; the instructors' exceptional knowledge and instructional technique more than made up for it if someone was a little weak in any area.

One point worth mentioning: while SANS appears to offer laptops on some courses, I have to say that the pre-loaded laptop provided on this course was a good move.  The SNIFT kit had been preloaded onto the laptop, and for five straight days I never heard a single complaint of "something not working", "hardware issues", "I don't have that version", etc.  The laptop had a 250GB hard drive, with VMware Workstation and several pre-configured VM sessions (which were essential to the course).  If anyone from SANS reads this - good call!  The hardware (Lenovo S10-3) and software (SNIFT kit) worked, each and every time.  (If it didn't, it was likely my fault :)

From hearing the various instructors and others speaking during the Summit and the Network Forensics course, it appears that SANS has found another niche with huge demand.  It was an expensive trip, but all in all very well worth the high-quality training.

ps..the presentation from Chris Pogue on Sniper Forensics was awesome and coincidentally, complemented the training provided by Jonathan.  If you ever get to attend this presentation by Chris, don't pass on the opportunity.

In terms of identifying information "to be sniped" from within a larger system, I see a huge need for someone to lead discussion in this area.  It will be necessary to qualify the type of investigation, the type of guest/target systems and OS, the type of forensic systems/tools available, criminal/non-criminal, etc.  The biggest problem I see is that discussions in this area focus on EITHER network/volatile-based evidence OR that normally acquired through "traditional" hard-drive forensics.  We need a discussion which includes all areas of forensics.

Sunday, July 11, 2010

What's next in Volume Shadow Copies...?

Having just attended a presentation by Mark McKinnon (RedWolf Computer Forensics) and Lee Whitfield (Disklabs and Forensic4cast) at the SANS What Works in Forensics and Incident Response Summit 2010, I'd like to make a few comments on their excellent talk. 

The presentation was on Volume Shadow Copies and started with a detailed description of what is currently known about this relatively new avenue in digital forensics.  From the information presented in this 1-hour talk: if you have not started to think about Volume Shadow Copies, you had better start paying attention in the months to come.  While the process of getting at the information is moderately complex, the amount of information produced by the Volume Shadow Service means this will eventually be one of those tasks we must not ignore.  The presentation included a live demo of the product, and gave us an idea of just what we're missing.  While the state of a recovered file may depend on the quality/existence of the previous snapshots in place, Mark and Lee have developed a tool which intends to automate the Volume Shadow Copy recovery process.  Timeline to release, you ask...?  The authors indicated a few months.  They appear very committed to releasing a stable and tested product.

From their new site for Shadow Analyzer, here's a small idea of what the product can recover:
Shadow Analyser eliminates the hassle of analysing Microsoft volume shadow files. It allows a digital forensic investigator to take a disk image and, using that image:
■view the contents of the hard disk drive at a point in time
■recover deleted and erased files
■extract older versions of current files
■view historic date and time information for all files, both live and deleted
■view changes to files across days, weeks, or even months
■extract complete files from volume shadow files
The authors are putting a lot of time into the development of this tool; the sheer volume of information that they are finding within the Volume Shadow Copies seems to be a driving force and motivation behind getting this tool to the end stages of development.  You can follow their development on Twitter as well at @ShadowAnalyzer.

Keep up the progress guys.....great presentation and even greater tool !!  Oh yeah....forgot to mention, the tool will be "tri-platform" - Win, Linux and Mac.

For more information on Volume Shadow Copies, take a look here:
Into the Shadows
Reliably recovering evidential data from Volume Shadow Copies in Windows Vista and Windows 7 (pdf)
Volume Shadow Copy Forensics - the Robocopy method Part 1
Volume Shadow Copy Forensics - the Robocopy method Part 2

More to come on this...it'll take days to digest!