Disk Imaging

From ForensicsWiki
{{expand}}
Disk imaging is the process of making a bit-by-bit copy of a disk. Imaging (in more general terms) can apply to anything that can be considered a bit-stream, e.g. physical or logical volumes, network streams, etc.
The most straightforward disk imaging method is reading a disk from start to end and writing the data to a [[:Category:Forensics_File_Formats|Forensics image format]]. This can be a time-consuming process, especially for disks with a large capacity.
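For illustration, a minimal sketch of this straightforward approach in Python (the device path, block size and output file name are arbitrary assumptions, and a real imaging tool would also handle read errors and verify the result):

<pre>
import hashlib

BLOCK_SIZE = 1024 * 1024  # read in 1 MiB chunks

def image_disk(device_path, image_path):
    """Read a device from start to end and write a raw image.

    A SHA-256 hash of the data read is computed along the way; hashes
    are commonly used to verify the image afterwards.
    """
    digest = hashlib.sha256()
    with open(device_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            block = src.read(BLOCK_SIZE)
            if not block:
                break
            digest.update(block)
            dst.write(block)
    return digest.hexdigest()

# Example (requires sufficient privileges to read the device):
# print(image_disk("/dev/sdb", "evidence.dd"))
</pre>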
  
== Compressed storage ==
A common technique to reduce the size of an image file is to compress the data. On modern multi-core computers the compression can be done in parallel, reducing the size of the output without prolonging the imaging process. Since the write speed of the target disk can be a bottleneck in the imaging process, parallel compression can also reduce the total imaging time. [[Guymager]] was one of the first imaging tools to implement the concept of multi-process compression.
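A minimal sketch of the idea, assuming a simple length-prefixed output layout (real tools such as [[Guymager]] write proper forensic image formats): blocks are compressed by a pool of worker processes while a single writer stores them in order.

<pre>
import multiprocessing
import zlib

BLOCK_SIZE = 1024 * 1024  # compress the disk in 1 MiB blocks

def compress_block(block):
    # Runs in a worker process; each block is compressed independently.
    return zlib.compress(block, 6)

def read_blocks(device_path):
    with open(device_path, "rb") as src:
        while True:
            block = src.read(BLOCK_SIZE)
            if not block:
                return
            yield block

def image_parallel_compressed(device_path, image_path, workers=4):
    """Compress blocks on multiple cores while writing them out in order."""
    with multiprocessing.Pool(workers) as pool, open(image_path, "wb") as dst:
        # imap() preserves the block order, so the target file is written
        # sequentially while the CPU-bound compression runs in parallel.
        for compressed in pool.imap(compress_block, read_blocks(device_path)):
            # Simple length-prefixed records; real image formats store
            # additional metadata such as offsets and checksums.
            dst.write(len(compressed).to_bytes(8, "little"))
            dst.write(compressed)
</pre>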
  
Other techniques, such as storing the data sparse or '''empty-block compression''', can reduce both the total imaging time and the resulting image size when imaging new, non-encrypted (0-byte filled) disks.
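A minimal sketch of empty-block handling, assuming the output file system supports sparse files: blocks that are entirely 0-byte filled are skipped with a seek, which leaves a hole in the output file instead of writing the zeros out.

<pre>
BLOCK_SIZE = 1024 * 1024

def image_sparse(device_path, image_path):
    """Write a raw image, leaving holes for 0-byte filled blocks.

    On file systems that support sparse files the skipped ranges occupy
    no space, and nothing has to be written to the target disk for them.
    """
    zero_block = b"\x00" * BLOCK_SIZE
    with open(device_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            block = src.read(BLOCK_SIZE)
            if not block:
                break
            if block == zero_block[:len(block)]:
                dst.seek(len(block), 1)  # skip forward, leaving a hole
            else:
                dst.write(block)
        dst.truncate()  # make the file end at the full image size
</pre>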
  
== Error tolerance and recovery ==
  
== Smart imaging ==
Smart imaging is a combination of techniques that make the imaging process more intelligent:
* Selective imaging
* Decryption while imaging
=== Selective imaging ===
Selective imaging is a technique that copies only certain information from a disk, such as the $MFT on an [[NTFS]] volume, together with the necessary contextual information.
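A minimal sketch of the idea (the field names, JSON layout and example offset are illustrative assumptions, not an existing evidence container format): only the requested byte ranges are copied, and their origin on the source disk is recorded as contextual metadata.

<pre>
import hashlib
import json

def selective_image(device_path, out_prefix, regions):
    """Copy selected byte ranges and record where they came from.

    ``regions`` is a list of (description, offset, length) tuples, for
    example the location of the $MFT taken from the NTFS boot sector.
    """
    metadata = []
    with open(device_path, "rb") as src, open(out_prefix + ".bin", "wb") as dst:
        for description, offset, length in regions:
            src.seek(offset)
            data = src.read(length)
            metadata.append({
                "description": description,
                "source_offset": offset,
                "length": len(data),
                "image_offset": dst.tell(),
                "sha256": hashlib.sha256(data).hexdigest(),
            })
            dst.write(data)
    with open(out_prefix + ".json", "w") as meta_file:
        json.dump(metadata, meta_file, indent=2)

# Hypothetical example: copy 64 MiB starting at an assumed $MFT offset.
# selective_image("/dev/sdb", "mft_only", [("$MFT", 3221225472, 64 * 1024 * 1024)])
</pre>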
  
=== Decryption while imaging ===
Encrypted data is a worst-case scenario for compression. Because the encryption process should be deterministic, one way to reduce the size of an image of an encrypted volume is to decrypt the data while imaging, store it non-encrypted and compressed, and re-encrypt it on the fly if required. Re-encryption should rarely be needed, since the non-encrypted data is what undergoes analysis.
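A conceptual sketch, assuming a hypothetical decrypt_block() routine that implements the volume's encryption scheme using recovered key material: each block is decrypted while imaging and the plaintext is stored compressed.

<pre>
import zlib

BLOCK_SIZE = 1024 * 1024

def decrypt_block(block, block_number, key):
    """Placeholder for the volume's (deterministic) decryption routine.

    A real implementation depends on the encryption scheme in use
    (e.g. sector-based full disk encryption) and on the recovered key.
    """
    raise NotImplementedError

def image_decrypted_compressed(device_path, image_path, key):
    """Decrypt each block while imaging and store it compressed.

    Because decryption is deterministic, the encrypted view can be
    re-created on the fly from the stored plaintext if ever required.
    """
    with open(device_path, "rb") as src, open(image_path, "wb") as dst:
        block_number = 0
        while True:
            block = src.read(BLOCK_SIZE)
            if not block:
                break
            plaintext = decrypt_block(block, block_number, key)
            compressed = zlib.compress(plaintext, 6)
            dst.write(len(compressed).to_bytes(8, "little"))
            dst.write(compressed)
            block_number += 1
</pre>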
  
== Logical image ==
== Also see ==
[[:Category:Forensics_File_Formats|Forensics File Formats]]
  
== External Links ==
* [http://www.tableau.com/pdf/en/Tableau_Forensic_Disk_Perf.pdf Benchmarking Hard Disk Duplication Performance in Forensic Applications], by [[Robert Botchek]]
=== Hash based imaging ===
* [http://www.dfrws.org/2010/proceedings/2010-314.pdf Hash based disk imaging using AFF4], by [[Michael Cohen]], [[Bradley Schatz]]
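The core idea, as a minimal sketch (not the AFF4 implementation described in the paper above): blocks whose hash is already present in a reference corpus are stored as a hash reference instead of as data.

<pre>
import hashlib

BLOCK_SIZE = 4096  # the block size here is an arbitrary assumption

def hash_based_image(device_path, image_path, known_hashes):
    """Store a hash reference for known blocks and raw data otherwise.

    ``known_hashes`` is a set of SHA-256 digests of blocks that are
    already available elsewhere, e.g. in a reference corpus or an
    earlier image of a similar system.
    """
    with open(device_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            block = src.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).digest()
            if digest in known_hashes:
                dst.write(b"H" + digest)  # reference to a known block
            else:
                dst.write(b"D" + len(block).to_bytes(4, "little") + block)
</pre>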