Evidence handling procedures are evolving
Evidence handling is one of the most important aspects of the expanding field of computer forensics. Never-ending innovation in technology keeps best practices in constant flux as they evolve to meet industry needs. One of the more recent changes in evidence handling has been the move away from simply "pulling the plug" as a first step in evidence collection toward methodologies that acquire evidence "live" from a suspect computer.
The need for changes in digital evidence collection is being driven by a rapidly changing computing environment:
- Applications are installed from removable media such as a USB stick and then run virtualized in RAM, leaving no trace on the hard disk
- Rootkits hide processes from the underlying operating system and from its local tools (binaries), so memory must be analyzed with trusted binaries (see the cross-view sketch after this list)
- Malware can be fully RAM-resident, with no trace of its existence on the hard disk
- Users regularly hide evidence in covert or hidden encrypted files or partitions on the hard drive
- Popular web browsers offer the user the ability to cover their tracks: log files of user activity are created but deleted when the browser is closed
- Web 2.0 continues to change the landscape, with web-based email, blogs, wikis and Twitter extending the storage of user actions and communications beyond the traditional hard disk found on the user's machine.
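To make the rootkit point concrete, the following is a minimal Python sketch (not from the original text) of a cross-view comparison: it takes two independent listings of the running processes and flags any PID that appears in only one of them. It assumes a Windows host with the third-party psutil package installed and uses the built-in tasklist utility; in a real examination, trusted binaries run from read-only media would replace the suspect machine's own tools, and transient processes can cause benign differences between the two snapshots.

```python
# Hypothetical cross-view sketch: compare two independent views of the
# running process list and flag PIDs that appear in only one of them.
# Assumes a Windows host and the third-party "psutil" package; a real
# examination would use trusted binaries from read-only response media.
import csv
import subprocess

import psutil

# View 1: PIDs reported through the Windows API via psutil
api_pids = set(psutil.pids())

# View 2: PIDs reported by tasklist.exe in CSV format (no header row)
output = subprocess.run(
    ["tasklist", "/FO", "CSV", "/NH"],
    capture_output=True, text=True, check=True
).stdout
tasklist_pids = {int(row[1]) for row in csv.reader(output.splitlines()) if row}

# Processes visible in one view but not the other merit closer inspection;
# short-lived processes started between the two snapshots are a benign cause.
for pid in sorted(api_pids ^ tasklist_pids):
    print(f"PID {pid} appears in only one process listing - investigate")
```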
An Introduction to Live Digital Evidence Collection
Live forensics collects digital evidence in an order based on the life expectancy of the evidence in question. Simply put, the most important evidence to be gathered today, and for the foreseeable future, often exists only as the volatile data contained within the computer's RAM.
Order of volatility of digital evidence (most volatile first)
- CPU, cache and register content
- Routing table, ARP cache, process table, kernel statistics
- Memory
- Temporary file system / swap space
- Data on hard disk
- Remotely logged data
- Data contained on archival media
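As an illustration of how this ordering drives collection, the short Python sketch below (an assumption-laden placeholder, not a real acquisition script) encodes the list above as a collection plan and walks it from most to least volatile, so the most perishable evidence is captured first.

```python
# Illustrative sketch: encode the order of volatility as a list and collect
# in that order, most volatile first. Step names are placeholders; collect()
# would wrap whatever trusted acquisition tools the examiner actually uses.
from datetime import datetime, timezone

COLLECTION_PLAN = [
    "CPU, cache and register content",
    "Routing table, ARP cache, process table, kernel statistics",
    "Memory (RAM image)",
    "Temporary file system / swap space",
    "Data on hard disk",
    "Remotely logged data",
    "Data contained on archival media",
]

def collect(step: str) -> None:
    # Placeholder: in practice this would invoke a trusted acquisition tool
    # and record its output and hashes in the examination log.
    timestamp = datetime.now(timezone.utc).isoformat()
    print(f"{timestamp}  collecting: {step}")

for step in COLLECTION_PLAN:   # the most volatile evidence is captured first
    collect(step)
```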
An accepted best practice in digital evidence collection - modified to incorporate live volatile data collection
Stand Alone Home Computer
For proper evidence preservation, follow these procedures in order (do not use the computer or search for evidence on it):
- Photograph the computer and scene
- If the computer is off, do not turn it on
- If the computer is on, photograph the screen
- Collect live data: start with a RAM image (Live Response locally, or remotely via F-Response), then collect other live data "as required", such as network connection state, logged-on users, currently executing processes, etc.
- If hard disk encryption is detected (using a tool such as Zero-View), for example full disk encryption like PGP Disk, collect a "logical image" of the hard disk using dd.exe or Helix, locally or remotely via F-Response
- Unplug the power cord from the back of the tower; if the computer is a laptop and does not shut down when the cord is removed, remove the battery
- Diagram and label all cords
- Document all device model numbers and serial numbers
- Disconnect all cords and devices
- Check for an HPA (host protected area), then image hard drives using a write blocker with Helix or a hardware imager (a hashing sketch for verifying acquired images follows this checklist)
- Package all components (using anti-static evidence bags)
- Seize all additional storage media (create respective images and place original devices in anti-static evidence bags)
- Keep all media away from magnets, radio transmitters and other potentially damaging elements
- Collect instruction manuals, documentation and notes
- Document all steps used in the seizure
- Note: if the computer is x64, the author recommends collecting the RAM image using HBGary FastDump Pro
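For the imaging steps in the checklist above, forensic soundness depends on being able to demonstrate later that an acquired image has not changed since collection. The following is a minimal Python sketch of that verification step; the file names are hypothetical placeholders. It hashes the acquired image in chunks with MD5 and SHA-256 and appends the digests, with a UTC timestamp, to a simple acquisition log.

```python
# Minimal sketch: hash an acquired image in chunks and record the digests
# with a timestamp in a plain-text acquisition log. File names below are
# placeholders, not references to any specific case or tool output.
import hashlib
from datetime import datetime, timezone

IMAGE_PATH = "ram_capture.img"      # hypothetical acquired image
LOG_PATH = "acquisition_log.txt"

md5 = hashlib.md5()
sha256 = hashlib.sha256()

with open(IMAGE_PATH, "rb") as image:
    for chunk in iter(lambda: image.read(1024 * 1024), b""):
        md5.update(chunk)
        sha256.update(chunk)

timestamp = datetime.now(timezone.utc).isoformat()
with open(LOG_PATH, "a") as log:
    log.write(f"{timestamp}  {IMAGE_PATH}\n")
    log.write(f"  MD5:    {md5.hexdigest()}\n")
    log.write(f"  SHA256: {sha256.hexdigest()}\n")
```

Recomputing the same hashes later and comparing them against the logged values demonstrates that the image has not been altered since acquisition.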
Live forensics of volatile computer evidence is not necessarily a new or recent development
The author's first exposure to live forensics in digital evidence collection was nearly 10 years ago, during his initial SANS GIAC Certified Forensic Analyst (GCFA) training. The course included several hands-on labs that allowed students to become familiar with tools such as the Windows Forensic Toolchest (WFT), which automated the collection of volatile data from the subject PC in a forensically sound manner:
- Only utilizes trusted / known binaries
- Minimizes impact on the subject PC, with any impact documented
- Performs extensive logging
- Creates hashes for all tools utilized as well as all data collected
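As a rough illustration of those properties, the Python sketch below runs a single collection command, saves its output, and records SHA-256 hashes of both the tool binary and the collected output along with a UTC timestamp; WFT automates this pattern across a whole suite of tools. The tool path and file names here are assumptions for the example, not WFT's actual configuration.

```python
# Illustrative sketch of the pattern WFT automates: run a trusted binary,
# save its output, and log hashes of both the tool and the collected data.
# Paths are placeholders; in practice the binaries come from vetted,
# read-only response media rather than the suspect machine.
import hashlib
import subprocess
from datetime import datetime, timezone

TOOL = r"E:\trusted_tools\netstat.exe"   # hypothetical trusted binary on response media
ARGS = ["-ano"]
OUTPUT_FILE = "netstat_output.txt"
LOG_FILE = "collection_log.txt"

def sha256_of_file(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

result = subprocess.run([TOOL] + ARGS, capture_output=True, text=True, check=True)
with open(OUTPUT_FILE, "w") as out:
    out.write(result.stdout)

with open(LOG_FILE, "a") as log:
    log.write(f"{datetime.now(timezone.utc).isoformat()}  ran {TOOL} {' '.join(ARGS)}\n")
    log.write(f"  tool sha256:   {sha256_of_file(TOOL)}\n")
    log.write(f"  output sha256: {sha256_of_file(OUTPUT_FILE)}\n")
```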
Hence, even a decade ago, computer forensics evidence collection training went well beyond simply imaging a hard drive; it included the training necessary to collect "live" evidence, such as that found in RAM, in a forensically sound manner. The methodologies taught at SANS as part of the GCFA training, for example, had the investigator weigh the volatility of all data when planning the evidence collection process. An investigator trained this way was effectively enabled to collect all available and relevant evidence, starting with the most volatile data first, rather than focusing only on the limited evidence available on the computer's hard drive.
Live forensics resources
Several other options that the author has become familiar with are available for acquiring volatile digital evidence, including creating an image of RAM in a forensically sound manner (in no specific order):
- Nigilant32
- Live Response
- Prodiscover IR
- Mandiant Intelligent Response
- KntDD
- HBGary Responder
- HBGary FastDump Pro
- F-Response
In closing
In digital evidence collection today, live forensics has become a necessity. Many forensic professionals, both in law enforcement and in private practice, have long recognized that the tradition of first pulling the plug on a PC under examination is an outdated and overly conservative approach that can destroy valuable evidence. Our ability to reliably collect volatile evidence in a forensically sound manner has effectively rendered the legacy best practice of "pulling the plug" obsolete.