{{expand}}
 
  
Disk imaging is the process of making a bit-by-bit copy of a disk. Imaging (in more general terms) can apply to anything that can be considered a bit-stream, e.g. physical or logical volumes, network streams, etc.

The most straightforward disk imaging method is reading a disk from beginning to end and writing the data to a [[:Category:Forensics_File_Formats|forensic image format]].

This can be a time-consuming process, especially for disks with a large capacity.
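
The following is a minimal Python sketch of this straightforward approach; the device path, output file name and block size are examples only, and a real imaging tool would add error handling and write to a proper forensic format rather than a raw copy:

<pre>
import hashlib

SOURCE = "/dev/sdb"           # example source device (assumption)
TARGET = "evidence.raw"       # raw, dd-style output image (assumption)
CHUNK = 4 * 1024 * 1024       # read in 4 MiB blocks

md5 = hashlib.md5()
with open(SOURCE, "rb") as src, open(TARGET, "wb") as dst:
    while True:
        data = src.read(CHUNK)
        if not data:
            break
        dst.write(data)
        md5.update(data)      # hash while copying, for later verification

print("MD5:", md5.hexdigest())
</pre>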
 
 
== Disk Imaging Solutions ==
 
See: [[:Category:Disk Imaging|Disk Imaging Solutions]]
 
 
== Common practice ==
 
It is common practice to use a [[Write Blockers|write blocker]] when imaging a physical disk. The write blocker is an additional measure to prevent write access to the disk.
 
 
Also see: [[DCO and HPA|Device Configuration Overlay (DCO) and Host Protected Area (HPA)]]
 
 
== Error tolerance and recovery ==
 
...
 
 
== Smart imaging ==
 
Smart imaging is a combination of techniques that make the imaging process more intelligent:
 
* Compressed storage
 
* Deduplication
 
* Selective imaging
 
* Decryption while imaging
 
 
=== Compressed storage ===
 
 
A common technique to reduce the size of an image file is to compress the data, using a [http://en.wikipedia.org/wiki/Lossless_data_compression lossless] compression method.

On modern computers with multiple cores, the compression can be done in parallel, reducing the output size without prolonging the imaging process.

Since the write speed of the target disk can be a bottleneck in the imaging process, compressing the data before writing it reduces the amount written and can therefore shorten the total imaging time.
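
A minimal Python sketch of parallel (multi-process) compression is shown below; it is not the implementation used by any particular tool, and the chunked, length-prefixed output format is an illustration rather than a standard forensic format:

<pre>
import zlib
from multiprocessing import Pool

CHUNK = 32 * 1024 * 1024      # compress large, independent chunks

def read_chunks(path):
    with open(path, "rb") as src:
        while True:
            data = src.read(CHUNK)
            if not data:
                break
            yield data

def compress(data):
    return zlib.compress(data, 6)             # lossless DEFLATE compression

if __name__ == "__main__":
    with Pool() as pool, open("evidence.zchunks", "wb") as dst:
        # imap() preserves chunk order while worker processes compress in parallel
        for block in pool.imap(compress, read_chunks("/dev/sdb")):
            dst.write(len(block).to_bytes(8, "little"))   # simple length-prefixed framing
            dst.write(block)
</pre>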
 
[[Guymager]] was one of the first imaging tools to implement multi-process compression for the [[Encase image file format]]. This technique is now used by various imaging tools, including [http://www.tableau.com/index.php?pageid=products&model=TSW-TIM Tableau Imager (TIM)].
 
 
Other techniques, such as storing the data sparsely or using '''empty-block compression''' or '''pattern fill''', can reduce both the total imaging time and the resulting image size, particularly for new, non-encrypted (zero-filled) disks.
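
A minimal Python sketch of empty-block detection, writing a sparse raw image on a filesystem that supports sparse files (the paths are examples only):

<pre>
CHUNK = 1024 * 1024
ZERO = b"\x00" * CHUNK

with open("/dev/sdb", "rb") as src, open("evidence.raw", "wb") as dst:
    while True:
        data = src.read(CHUNK)
        if not data:
            break
        if data == ZERO[:len(data)]:
            dst.seek(len(data), 1)    # skip: leaves a hole in the sparse output file
        else:
            dst.write(data)
    dst.truncate()                    # extend the file size to cover a trailing hole
</pre>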
 
 
=== Deduplication ===
 
Deduplication is the process of detecting data that occurs more than once on a disk and storing it only once in the image.

It is even possible to store such data only once for an entire corpus of images, using techniques like hash-based imaging.
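
A minimal Python sketch of block-level deduplication based on cryptographic hashes; the in-memory dictionary stands in for a real block store, and for a corpus of images the same store would simply be shared between images:

<pre>
import hashlib

CHUNK = 64 * 1024
block_store = {}    # hash -> block data; in practice a database or on-disk store
block_map = []      # ordered list of block hashes describing the original disk

with open("/dev/sdb", "rb") as src:
    while True:
        data = src.read(CHUNK)
        if not data:
            break
        digest = hashlib.sha256(data).hexdigest()
        block_store.setdefault(digest, data)    # each unique block is stored only once
        block_map.append(digest)

stored = sum(len(b) for b in block_store.values())
referenced = sum(len(block_store[h]) for h in block_map)
print(referenced, "bytes referenced,", stored, "bytes stored")
</pre>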
 
 
=== Selective imaging ===
 
Selective imaging is a technique that copies only certain information from a disk, such as the $MFT on an [[NTFS]] volume, together with the necessary contextual information.
 
 
The [[EnCase]] Logical Evidence Format (LEF) is an example of a selective image format, although [[EnCase]] stores only file-related contextual information in it.
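
A minimal sketch of selectively acquiring the $MFT using the pytsk3 bindings for [[The Sleuth Kit]]; the library choice, device path, volume offset and output name are assumptions, and a real selective image format would store the contextual information in a structured container rather than printing it:

<pre>
import pytsk3    # Python bindings for The Sleuth Kit (assumed to be installed)

img = pytsk3.Img_Info("/dev/sdb")      # source device or raw image (assumption)
fs = pytsk3.FS_Info(img, offset=0)     # byte offset of the NTFS volume (assumption)

mft = fs.open("/$MFT")                 # the NTFS Master File Table
size = mft.info.meta.size

with open("mft.bin", "wb") as dst:
    offset = 0
    while offset < size:
        data = mft.read_random(offset, min(1024 * 1024, size - offset))
        if not data:
            break
        dst.write(data)
        offset += len(data)

# The contextual information (where the data came from) must be preserved as well.
print("source=/dev/sdb fs_offset=0 path=/$MFT size=%d" % size)
</pre>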
 
 
=== Decryption while imaging ===
 
Encrypted data is a worst-case scenario for compression. Because the encryption process should be deterministic, one way to reduce the size of an image of an encrypted disk is to store the data decrypted and compressed, and to re-encrypt it on the fly when required. Re-encryption should rarely be necessary, since the decrypted data is what undergoes analysis.
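
Encrypted data is effectively indistinguishable from random data, so lossless compression gains almost nothing, as the following small Python illustration shows (os.urandom stands in for ciphertext):

<pre>
import os
import zlib

block = 1024 * 1024
zeros = b"\x00" * block            # unused space on a new, non-encrypted disk
ciphertext = os.urandom(block)     # random data stands in for encrypted content

print("zero-filled:", len(zlib.compress(zeros)), "bytes")        # shrinks to almost nothing
print("'encrypted':", len(zlib.compress(ciphertext)), "bytes")   # stays ~1 MiB, may even grow
</pre>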
 
 
== Also see ==
 
* [[:Category:Forensics_File_Formats|Forensics File Formats]]
 
* [[Write Blockers]]
 
 
== External Links ==
 
* [http://www.tableau.com/pdf/en/Tableau_Forensic_Disk_Perf.pdf Benchmarking Hard Disk Duplication Performance in Forensic Applications], by [[Robert Botchek]]
 
 
=== Hash based imaging ===
 
* [http://www.dfrws.org/2010/proceedings/2010-314.pdf Hash based disk imaging using AFF4], by [[Michael Cohen]], [[Bradley Schatz]]
 
 
[[Category:Disk Imaging]]
 
