{{Expand}}

Disk imaging is the process of making a bit-by-bit copy of a disk. More generally, imaging can apply to anything that can be treated as a bit-stream, e.g. physical or logical volumes, network streams, etc.

The most straightforward disk imaging method is to read a disk from start to end and write the data to a [[:Category:Forensics_File_Formats|Forensics image format]]. This can be a time-consuming process, especially for disks with a large capacity; a minimal sketch of the approach follows.
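
As an illustration only, here is a minimal sketch in Python of this straightforward read-and-write loop. The device path and chunk size are assumptions for the example; a real imaging tool would also have to handle read errors and bad sectors.

<pre>
import sys

CHUNK_SIZE = 32 * 1024 * 1024  # read 32 MiB at a time; an arbitrary choice

def image_disk(source_path, image_path):
    """Copy a source device bit-by-bit into a raw image file."""
    with open(source_path, "rb") as source, open(image_path, "wb") as image:
        while True:
            chunk = source.read(CHUNK_SIZE)
            if not chunk:  # end of the source device
                break
            image.write(chunk)

if __name__ == "__main__":
    # example invocation: python image.py /dev/sdX evidence.raw
    image_disk(sys.argv[1], sys.argv[2])
</pre>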
 
The process of disk imaging is also referred to as disk duplication.

== Disk Imaging Solutions ==
See: [[:Category:Disk Imaging|Disk Imaging Solutions]]

== Common practice ==
It is common practice to use a [[Write Blockers|Write Blocker]] when imaging a physical disk. The write blocker is an additional measure to prevent write access to the disk.

Also see: [[DCO and HPA|Device Configuration Overlay (DCO) and Host Protected Area (HPA)]]

== Integrity ==
Often when creating a disk image a [http://en.wikipedia.org/wiki/Cryptographic_hash_function cryptographic hash] is calculated of the entire disk. Commonly used cryptographic hashes are MD5, SHA-1 and/or SHA-256.

By recalculating the integrity hash at a later time, one can determine whether the data in the disk image has changed. This is by no means protection against intentional tampering, but it can indicate data corruption. The integrity hash does not, however, indicate where an alteration occurred. Some imaging tools and/or formats therefore provide additional integrity checks, such as the following; a sketch of whole-image and piecewise hashing follows the list:
* A checksum
* Parity data
* [[Piecewise hashing]]
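
A minimal sketch of how an integrity hash and piecewise hashes can be calculated in a single pass over the data. The choice of SHA-256 and the chunk size are assumptions for the example.

<pre>
import hashlib

CHUNK_SIZE = 32 * 1024 * 1024

def hash_image(image_path):
    """Compute a whole-image hash plus piecewise hashes per chunk.

    The whole-image hash detects that something changed; the
    piecewise hashes narrow the change down to a particular chunk.
    """
    whole = hashlib.sha256()
    piecewise = []
    with open(image_path, "rb") as image:
        while True:
            chunk = image.read(CHUNK_SIZE)
            if not chunk:
                break
            whole.update(chunk)
            piecewise.append(hashlib.sha256(chunk).hexdigest())
    return whole.hexdigest(), piecewise
</pre>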
 
== Smart imaging ==
Smart imaging is a combination of techniques that make the imaging process more intelligent:
* Compressed storage
* Deduplication
* Selective imaging
* Decryption while imaging

=== Compressed storage ===
A common technique to reduce the size of an image file is to compress the data, where the compression method should be [http://en.wikipedia.org/wiki/Lossless_data_compression lossless].
On modern computers with multiple cores, the compression can be done in parallel, reducing the output size without prolonging the imaging process.
Since the write speed of the target disk can be a bottleneck in the imaging process, parallel compression can also reduce the total time of the imaging process; a sketch of the idea follows.
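
A minimal sketch of parallel compression, assuming zlib as the lossless compressor and a simple length-prefixed framing of the output (invented for the example, not any particular forensic image format). Python's zlib releases the GIL on large buffers, so a thread pool is enough to keep several cores busy.

<pre>
import collections
import concurrent.futures
import zlib

CHUNK_SIZE = 32 * 1024 * 1024
IN_FLIGHT = 8  # chunks compressing at once; bounds memory use

def image_compressed(source_path, image_path):
    """Image a source device, compressing chunks on multiple cores."""
    with open(source_path, "rb") as source, \
         open(image_path, "wb") as image, \
         concurrent.futures.ThreadPoolExecutor(IN_FLIGHT) as pool:
        pending = collections.deque()
        while True:
            # keep a bounded window of chunks compressing in the background
            while len(pending) < IN_FLIGHT:
                chunk = source.read(CHUNK_SIZE)
                if not chunk:
                    break
                pending.append(pool.submit(zlib.compress, chunk))
            if not pending:
                break
            compressed = pending.popleft().result()  # preserves on-disk order
            image.write(len(compressed).to_bytes(8, "big"))  # length prefix
            image.write(compressed)
</pre>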

[[Guymager]] was one of the first imaging tools to implement multi-process compression for the [[Encase image file format]]. This technique is now used by various imaging tools, including [http://www.tableau.com/index.php?pageid=products&model=TSW-TIM Tableau Imager (TIM)].

Other techniques, like storing the data sparse or using '''empty-block compression''' or '''pattern fill''', can reduce both the total time of the imaging process and the resulting image size for new, non-encrypted (0-byte filled) disks; a sketch follows.
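
A minimal sketch of the empty-block / pattern fill idea: instead of compressing, a chunk consisting of one repeated byte value is recorded as just the byte and a length. The record layout is invented for the example.

<pre>
def classify_chunk(chunk):
    """Classify one chunk of disk data for compact storage.

    A chunk that is a single repeated byte (e.g. 0x00 on a
    factory-new disk) is reduced to (pattern byte, length);
    anything else is kept as literal data.
    """
    if chunk and chunk.count(chunk[0]) == len(chunk):
        return ("pattern", chunk[0], len(chunk))
    return ("data", chunk)
</pre>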
 
=== Deduplication ===
Deduplication is the process of storing data that occurs more than once on the disk only once in the image.
It is even possible to store the data only once for a whole corpus of images, using techniques like hash based imaging; a minimal sketch follows.
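
A minimal sketch of block-level deduplication, assuming SHA-256 block hashes and an in-memory block store. A real tool would persist the store and address hash collisions according to its threat model.

<pre>
import hashlib

BLOCK_SIZE = 4096  # deduplicate at block granularity

def deduplicate(source_path):
    """Store each distinct block once, plus an ordered reference list.

    Returns a block store keyed by SHA-256 and the ordered list of
    digests needed to reconstruct the original byte stream.
    """
    store = {}       # digest -> block data, shared across duplicates
    references = []  # ordered digests describing the original disk
    with open(source_path, "rb") as source:
        while True:
            block = source.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)  # first occurrence stores data
            references.append(digest)
    return store, references
</pre>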
 
=== Selective imaging ===
Selective imaging is a technique to copy only certain information from a disk, such as the $MFT on an [[NTFS]] volume, together with the necessary contextual information.

The [[EnCase]] Logical Evidence Format (LEF) is an example of a selective image, although only file-related contextual information is stored in the format by [[EnCase]].

=== Decryption while imaging ===
Encrypted data is the worst-case scenario for compression. Because the encryption process should be deterministic, a solution to reduce the size of an image of an encrypted disk is to store the data non-encrypted and compressed, and to re-encrypt it on-the-fly when required. The need to re-encrypt should be rare, since the non-encrypted data is what undergoes analysis.

== Also see ==
* [[:Category:Forensics_File_Formats|Forensics File Formats]]
* [[Write Blockers]]
* [[Piecewise hashing]]
* [[Memory Imaging]]

== External Links ==
* [http://www.tableau.com/pdf/en/Tableau_Forensic_Disk_Perf.pdf Benchmarking Hard Disk Duplication Performance in Forensic Applications], by [[Robert Botchek]]

=== Hash based imaging ===
* [http://www.dfrws.org/2010/proceedings/2010-314.pdf Hash based disk imaging using AFF4], by [[Michael Cohen]], [[Bradley Schatz]]

[[Category:Disk Imaging]]