= Forensic Disk Differencing =
Forensic Disk Differencing is the process of taking two or more disk images from the same computer and determining how the later images differ from the earlier ones. One common use of differencing is to determine what an attacker did during a break-in; for this purpose it is necessary to have a forensic disk image of the computer from before the break-in and another from after it.

{{expand}}

==Differencing Tools==

===idifference.py===

idifference.py is part of the [[Digital Forensics XML]] Python Toolkit distributed with [[fiwalk]]. The tool compares two disk images and reports the changes in files between the first and the second. It also produces a timeline of the changes.

For example, using the '''nps-2009-canon2''' series of disk images:

<pre>
$ python idifference.py /nps-2009-canon2-gen2.raw nps-2009-canon2-gen3.raw
>>> Reading nps-2009-canon2-gen2.raw
>>> Reading nps-2009-canon2-gen3.raw

Disk image:/corp/drives/nps/nps-2009-canon2/nps-2009-canon2-gen3.raw

New Files:
2008-12-23 14:26:12 1315993 DCIM/100CANON/IMG_0041.JPG
Deleted Files:  
2008-12-23 14:12:38 855935 DCIM/100CANON/IMG_0001.JPG
2008-12-23 14:22:38 1347778 DCIM/100CANON/IMG_0037.JPG

Files with modified content (but size unchanged):  
Files with changed file properties:
DCIM/CANONMSC/M0100.CTG SHA1 changed 69b30c352ee802f49b1ea25325af9fa05c3ffca1 -> baa42c03a917b01b212fb7e538e5deb525995f31
DCIM/CANONMSC/M0100.CTG crtime changed to 1230070924 -> 1230071142
DCIM/CANONMSC/M0100.CTG mtime changed to 1230070924 -> 1230071142
DCIM/CANONMSC/M0100.CTG resized 180 -> 188

Timeline
2008-12-23 14:25:42 DCIM/CANONMSC/M0100.CTG SHA1 changed 69b30c352ee802f49b1ea25325af9fa05c3ffca1 -> baa42c03a917b01b212fb7e538e5deb525995f31
2008-12-23 14:25:42 DCIM/CANONMSC/M0100.CTG crtime changed 1230070924 -> 1230071142
2008-12-23 14:25:42 DCIM/CANONMSC/M0100.CTG mtime changed 1230070924 -> 1230071142
2008-12-23 14:25:42 DCIM/CANONMSC/M0100.CTG resized 180 -> 188
2008-12-23 14:26:12 DCIM/100CANON/IMG_0041.JPG created
$
</pre>
  
Here are some more examples:
* [[File:Idifference-demo1.txt]] --- idifference.py run on two disks from the 2009-M57 Patents scenario (Jo's November 23 vs. November 24th disk)
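
At its core, the differencing step is a comparison of per-file metadata extracted from each image. The following minimal Python sketch is hypothetical code, not part of the DFXML toolkit, and the file entries and hashes are illustrative; it assumes each image has already been reduced to a mapping from path to (size, mtime, SHA-1), e.g. by walking its file system with fiwalk:

<pre>
# Minimal sketch of file-level differencing between two disk images,
# assuming each image is already reduced to {path: (size, mtime, sha1)}.

def diff_images(before, after):
    """Return the new, deleted, and changed files between two images."""
    new_files = sorted(set(after) - set(before))
    deleted_files = sorted(set(before) - set(after))
    changed = sorted(path for path in set(before) & set(after)
                     if before[path] != after[path])
    return new_files, deleted_files, changed

# Illustrative entries modeled on the example output above.
before = {"DCIM/100CANON/IMG_0001.JPG": (855935, 1230041558, "ab12"),
          "DCIM/CANONMSC/M0100.CTG": (180, 1230070924, "69b3")}
after = {"DCIM/100CANON/IMG_0041.JPG": (1315993, 1230042372, "cd34"),
         "DCIM/CANONMSC/M0100.CTG": (188, 1230071142, "baa4")}

new, deleted, changed = diff_images(before, after)
print("New Files:", new)
print("Deleted Files:", deleted)
print("Changed Files:", changed)
</pre>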

= Disk Imaging =

{{expand}}

Disk imaging is the process of making a bit-by-bit copy of a disk. In more general terms, imaging can apply to anything that can be treated as a bit-stream, e.g. physical or logical volumes or network streams.

The most straightforward disk imaging method is to read a disk from start to end and write the data to a [[:Category:Forensics_File_Formats|Forensics image format]]. This can be a time-consuming process, especially for disks with a large capacity. The process of disk imaging is also referred to as disk duplication.
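
As a rough illustration of the start-to-end approach, the following Python sketch reads a source device in fixed-size chunks, writes a raw image, and hashes the stream in the same pass. The device path, output file name, and chunk size are placeholders; a real imaging tool adds error handling, bad-sector recovery, and logging:

<pre>
import hashlib

CHUNK = 1024 * 1024  # read size; a placeholder, tune to the hardware

sha256 = hashlib.sha256()
# "/dev/sdb" and "evidence.raw" are illustrative names.
with open("/dev/sdb", "rb") as source, open("evidence.raw", "wb") as image:
    while True:
        data = source.read(CHUNK)
        if not data:          # end of device reached
            break
        image.write(data)
        sha256.update(data)   # hash while imaging, in one pass

print("SHA-256:", sha256.hexdigest())
</pre>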

== Disk Imaging Solutions ==

See: [[:Category:Disk Imaging|Disk Imaging Solutions]]

== Common practice ==

It is common practice to use a [[Write Blockers|Write Blocker]] when imaging a physical disk. The write blocker is an additional measure to prevent write access to the disk.

Also see: [[DCO and HPA|Device Configuration Overlay (DCO) and Host Protected Area (HPA)]]

== Integrity ==

Often when creating a disk image a [http://en.wikipedia.org/wiki/Cryptographic_hash_function cryptographic hash] is calculated of the entire disk. Commonly used cryptographic hashes are MD5, SHA-1 and/or SHA-256.

By recalculating the integrity hash at a later time, one can determine whether the data in the disk image has changed. This by itself provides no protection against intentional tampering, but it can indicate that the data was altered, e.g. due to corruption. The integrity hash does not indicate where in the data the alteration occurred. Therefore some imaging tools and/or formats provide additional integrity checks, such as:

* A checksum
* Parity data
* [[Piecewise hashing]]
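
To illustrate, re-verifying an image against its acquisition hash only requires re-reading it, and the piecewise variant mentioned above hashes fixed-size blocks so that a later mismatch can at least be localized. This is a minimal sketch; the file names, block size, and recorded hash are placeholders:

<pre>
import hashlib

def image_digest(path, algorithm="sha256", chunk=1024 * 1024):
    """Recompute the integrity hash over the entire image."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for data in iter(lambda: f.read(chunk), b""):
            h.update(data)
    return h.hexdigest()

def piecewise_digests(path, block=16 * 1024 * 1024):
    """Hash fixed-size blocks so an alteration can be localized."""
    with open(path, "rb") as f:
        return [hashlib.sha256(data).hexdigest()
                for data in iter(lambda: f.read(block), b"")]

recorded = "placeholder-acquisition-hash"  # noted at imaging time
if image_digest("evidence.raw") != recorded:
    print("Image no longer matches its acquisition hash")
</pre>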

== Smart imaging ==

Smart imaging is a combination of techniques that make the imaging process more intelligent:

* Compressed storage
* Deduplication
* Selective imaging
* Decryption while imaging

=== Compressed storage ===

A common technique to reduce the size of an image file is to compress the data, where the compression method should be [http://en.wikipedia.org/wiki/Lossless_data_compression lossless]. On modern computers with multiple cores, the compression can be done in parallel, reducing the output size without prolonging the imaging process. Since the write speed of the target disk can be a bottleneck, parallel compression can also reduce the total time of the imaging process. [[Guymager]] was one of the first imaging tools to implement multi-process compression for the [[Encase image file format]]; the technique is now used by various imaging tools, including the [http://www.tableau.com/index.php?pageid=products&model=TSW-TIM Tableau Imager (TIM)].

Other techniques, such as storing the data sparse or applying '''empty-block compression''' or '''pattern fill''', can reduce both the total imaging time and the resulting image size for new, non-encrypted (0-byte filled) disks.
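
The following sketch illustrates the parallel-compression idea with Python's standard library: chunks read from the source disk are compressed losslessly on all cores while, in a real imager, the reader keeps streaming. The chunk data here is illustrative; production tools pipeline reading, compression, and writing:

<pre>
import zlib
from multiprocessing import Pool

def compress_chunk(chunk):
    # zlib is lossless; level 6 trades speed against ratio.
    return zlib.compress(chunk, 6)

if __name__ == "__main__":
    # Stand-in for chunks read from the source disk; note how the
    # 0-byte filled chunks (as on a new, non-encrypted disk) collapse
    # to almost nothing.
    chunks = [b"\x00" * (1024 * 1024)] * 7 + [b"sector data " * 80000]
    with Pool() as workers:
        compressed = workers.map(compress_chunk, chunks)
    print(sum(len(c) for c in compressed), "bytes after compression")
</pre>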

=== Deduplication ===

Deduplication is the process of storing data that occurs more than once on the disk only once in the image. It is even possible to store the data only once for an entire corpus of images, using techniques like hash based imaging.
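
A block-based sketch of the idea in Python: each distinct block is stored once, keyed by its hash, and the image itself becomes an ordered list of block hashes. The block contents are illustrative; hash based imaging applies the same principle across a whole corpus:

<pre>
import hashlib

def dedupe(blocks):
    """Store each distinct block once; the image is a list of hashes."""
    store = {}   # block hash -> block data, shareable across images
    image = []   # ordered hashes that reconstruct the original disk
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        image.append(digest)
    return store, image

blocks = [b"A" * 4096, b"B" * 4096, b"A" * 4096]  # illustrative blocks
store, image = dedupe(blocks)
print(len(blocks), "blocks on disk,", len(store), "stored uniquely")
</pre>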

=== Selective imaging ===

Selective imaging is a technique that copies only certain information from a disk, such as the $MFT on an [[NTFS]] volume, together with the necessary contextual information.

The [[EnCase]] Logical Evidence Format (LEF) is an example of a selective image, although [[EnCase]] stores only file-related contextual information in the format.
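
As a sketch of the technique, assuming the pytsk3 bindings for The Sleuth Kit are available, the following extracts only the $MFT from an NTFS volume and records minimal context about its origin. The image path and the partition offset are assumptions made for the example:

<pre>
import pytsk3

img = pytsk3.Img_Info("evidence.raw")      # illustrative image path
fs = pytsk3.FS_Info(img, offset=63 * 512)  # assumed partition offset

mft = fs.open("/$MFT")
size = mft.info.meta.size
with open("mft.bin", "wb") as out:
    offset = 0
    while offset < size:
        data = mft.read_random(offset, min(1024 * 1024, size - offset))
        out.write(data)
        offset += len(data)

# Contextual information that keeps the extract interpretable later.
print({"source": "evidence.raw", "fs_offset": 63 * 512,
       "path": "/$MFT", "size": size})
</pre>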

=== Decryption while imaging ===

Encrypted data is a worst-case scenario for compression. Because the encryption process should be deterministic, a way to reduce the size of an encrypted image is to store the data in non-encrypted, compressed form and re-encrypt it on the fly when required. Re-encryption should rarely be necessary, since the non-encrypted data is what undergoes analysis.

== Also see ==

* [[:Category:Forensics_File_Formats|Forensics File Formats]]
* [[Write Blockers]]
* [[Piecewise hashing]]
* [[Memory Imaging]]

== External Links ==

* [http://www.tableau.com/pdf/en/Tableau_Forensic_Disk_Perf.pdf Benchmarking Hard Disk Duplication Performance in Forensic Applications], by [[Robert Botchek]]

=== Hash based imaging ===

* [http://www.dfrws.org/2010/proceedings/2010-314.pdf Hash based disk imaging using AFF4], by [[Michael Cohen]], [[Bradley Schatz]]

[[Category:Disk Imaging]]
