Past Selected Articles
 
''Archived past selected research articles''
<small>Jan 2013</small>
<bibtex>
@article{young:distinct,
title="Distinct Sector hashing for Target Detection",
author="Joel Young and Kristina Foster and Simson Garfinkel and Kevin Fairbanks",
year=2012,
month=Dec,
journal="IEEE Computer"
}
</bibtex>
Using an alternative approach to traditional file hashing, digital forensic investigators can hash individually sampled subject drives on sector boundaries and then check these hashes against a prebuilt database, making it possible to process raw media without reference to the underlying file system.

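The core idea lends itself to a compact illustration. Below is a minimal Python sketch of sector-level hash lookup: it hashes a raw image in 512-byte sectors and checks each hash against a prebuilt set of hashes of known target files. The file names and the way the known-hash set is built are placeholders for illustration, not anything from the paper or its tools.

<pre>
import hashlib

SECTOR_SIZE = 512  # hash on sector boundaries, as in the paper

def sector_hashes(path, sector_size=SECTOR_SIZE):
    """Yield (offset, MD5 hex digest) for every sector of a raw image."""
    with open(path, "rb") as f:
        offset = 0
        while True:
            sector = f.read(sector_size)
            if not sector:
                break
            yield offset, hashlib.md5(sector).hexdigest()
            offset += sector_size

def build_known_set(target_files):
    """Hash every sector of the known target files into a lookup set."""
    known = set()
    for path in target_files:
        for _, digest in sector_hashes(path):
            known.add(digest)
    return known

if __name__ == "__main__":
    # Hypothetical file names for illustration only.
    known = build_known_set(["target_document.pdf"])
    for offset, digest in sector_hashes("suspect_drive.raw"):
        if digest in known:
            print("possible target content at offset", offset)
</pre>
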
<small>Aug 2012</small>
<bibtex>
@misc{apple,
title="Infiltrate the Vault: Security Analysis and Decryption of Lion Full Disk Encryption",
abstract="With the launch of Mac OS X 10.7 (Lion), Apple has introduced a volume encryption mechanism known as FileVault 2. Apple only disclosed marketing aspects of the closed-source software, e.g. its use of the AES-XTS tweakable encryption, but a publicly available security evaluation and detailed description was unavailable until now. We have performed an extensive analysis of FileVault 2 and we have been able to find all the algorithms and parameters needed to successfully read an encrypted volume. This allows us to perform forensic investigations on encrypted volumes using our own tools. In this paper we present the architecture of FileVault 2, giving details of the key derivation, encryption process and metadata structures needed to perform the volume decryption. Besides the analysis of the system, we have also built a library that can mount a volume encrypted with FileVault 2. As a contribution to the research and forensic communities we have made this library open source. Additionally, we present an informal security evaluation of the system and comment on some of the design and implementation features. Among others we analyze the random number generator used to create the recovery password. We have also analyzed the entropy of each 512-byte block in the encrypted volume and discovered that part of the user data was left unencrypted",
author="Omar Choudary and Felix Grobert and Joachim Metz",
year=2012,
month=Aug,
url="http://eprint.iacr.org/2012/374.pdf"
}
</bibtex>
With the launch of Mac OS X 10.7 (Lion), Apple has introduced a volume encryption mechanism known as FileVault 2. Apple only disclosed marketing aspects of the closed-source software, e.g. its use of the AES-XTS tweakable encryption, but a publicly available security evaluation and detailed description was unavailable until now. We have performed an extensive analysis of FileVault 2 and we have been able to find all the algorithms and parameters needed to successfully read an encrypted volume. This allows us to perform forensic investigations on encrypted volumes using our own tools. In this paper we present the architecture of FileVault 2, giving details of the key derivation, encryption process and metadata structures needed to perform the volume decryption. Besides the analysis of the system, we have also built a library that can mount a volume encrypted with FileVault 2. As a contribution to the research and forensic communities we have made this library open source. Additionally, we present an informal security evaluation of the system and comment on some of the design and implementation features. Among others we analyze the random number generator used to create the recovery password. We have also analyzed the entropy of each 512-byte block in the encrypted volume and discovered that part of the user data was left unencrypted.

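The entropy check mentioned at the end of the abstract is easy to reproduce in a few lines. The sketch below is a simplification rather than the authors' tool: it computes the Shannon entropy of each 512-byte block of a volume image and flags blocks whose entropy is low enough to suggest unencrypted data. The file name and the threshold value are illustrative assumptions.

<pre>
import math
from collections import Counter

BLOCK_SIZE = 512
LOW_ENTROPY_THRESHOLD = 6.0  # bits per byte; an assumed cut-off, tune as needed

def shannon_entropy(block):
    """Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(block)
    total = len(block)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def low_entropy_blocks(path):
    """Yield (offset, entropy) for blocks that look unencrypted."""
    with open(path, "rb") as f:
        offset = 0
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            h = shannon_entropy(block)
            if h < LOW_ENTROPY_THRESHOLD:
                yield offset, h
            offset += len(block)

if __name__ == "__main__":
    # "volume.raw" is a placeholder image name.
    for offset, h in low_entropy_blocks("volume.raw"):
        print(f"block at offset {offset} has entropy {h:.2f} bits/byte")
</pre>
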
<small>Mar 2012</small>
<bibtex>
@inproceedings{Walls:2011a,
Audio_Url = {http://prisms.cs.umass.edu/brian/pubs/Walls.hotsec.2011.mp3},
Author = {Robert J. Walls and Brian Neil Levine and Marc Liberatore and Clay Shields},
Booktitle = {Proc.\ USENIX Workshop on Hot Topics in Security (HotSec)},
Keywords = {forensics; security},
Month = {August},
Slides_Url = {http://prisms.cs.umass.edu/brian/pubs/rjwalls.hotsec.2011.slides.pdf},
Sponsors = {CNS-1018615, CNS-0905349, DUE-0830876, 2008-CE-CXK005},
Title = {{Effective Digital Forensics Research is Investigator-Centric}},
Url = {http://prisms.cs.umass.edu/brian/pubs/Walls.hotsec.2011.pdf},
Video_Url = {http://prisms.cs.umass.edu/brian/pubs/Walls.hotsec.2011.mp4},
Year = {2011},
Bdsk-Url-1 = {http://prisms.cs.umass.edu/brian/pubs/Walls.hotsec.2011.pdf}}
</bibtex>
Many technical mechanisms across computer security for attribution, identification, and classification are neither sufficient nor necessary for forensically valid digital investigations; yet they are often claimed as useful or necessary. Similarly, when forensic research is evaluated using the viewpoints held by computer security venues, the challenges, constraints, and usefulness of the work is often misjudged. In this paper, we point out many key aspects of digital forensics with the goal of ensuring that research seeking to advance the discipline will have the highest possible adoption rate by practitioners. We enumerate general legal and practical constraints placed on forensic investigators that set the field apart. We point out the assumptions, often limited or incorrect, made about forensics in past work, and discuss how these assumptions limit the impact of contributions.

* [https://www.usenix.org/conference/hotsec11/effective-digital-forensics-research-investigator-centric Usenix Presentation]
* [http://prisms.cs.umass.edu/brian/pubs/rjwalls.hotsec.2011.slides.pdf Slides]
* [http://prisms.cs.umass.edu/brian/pubs/Walls.hotsec.2011.pdf Paper]

<small>March 2012</small>
<bibtex>
@inproceedings{Balasubramaniyan:2010:PUS:1866307.1866320,
author = {Balasubramaniyan, Vijay A. and Poonawalla, Aamir and Ahamad, Mustaque and Hunter, Michael T. and Traynor, Patrick},
title = {PinDr0p: using single-ended audio features to determine call provenance},
booktitle = {Proceedings of the 17th ACM conference on Computer and communications security},
series = {CCS '10},
year = {2010},
isbn = {978-1-4503-0245-6},
location = {Chicago, Illinois, USA},
pages = {109--120},
numpages = {12},
url = {http://doi.acm.org/10.1145/1866307.1866320},
doi = {http://doi.acm.org/10.1145/1866307.1866320},
acmid = {1866320},
publisher = {ACM},
address = {New York, NY, USA},
keywords = {VoIP, call fingerprinting, provenance, telephony},
}
</bibtex>
The recent diversification of telephony infrastructure allows users to communicate through landlines, mobile phones and VoIP phones. However, call metadata such as Caller-ID is either not transferred or transferred without verification across these networks, allowing attackers to maliciously alter it. In this paper, we develop PinDr0p, a mechanism to assist users in determining call provenance — the source and the path taken by a call. Our techniques detect and measure single-ended audio features to identify all of the applied voice codecs, calculate packet loss and noise profiles, while remaining agnostic to characteristics of the speaker’s voice (as this may legitimately change when interacting with a large organization). In the absence of verifiable call metadata, these features in combination with machine learning allow us to determine the traversal of a call through as many as three different providers (e.g., cellular, then VoIP, then PSTN and all combinations and subsets thereof) with 91.6% accuracy. Moreover, we show that once we identify and characterize the networks traversed, we can create detailed fingerprints for a call source. Using these fingerprints we show that we are able to distinguish between calls made using specific PSTN, cellular, Vonage, Skype and other hard and soft phones from locations across the world with over 90% accuracy. In so doing, we provide a first step in accurately determining the provenance of a call.

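As a rough illustration of the final classification step (and only that step), the sketch below trains an off-the-shelf classifier on pre-computed per-call feature vectors and predicts a provider-path label for a new call. The feature names, values and labels are made up for illustration; PinDr0p's actual single-ended features (codec artifacts, packet loss, noise profiles) and its classifier are described in the paper itself.

<pre>
# A generic supervised-learning stand-in for the call-provenance classification
# step; it does not reproduce PinDr0p's feature extraction.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-call feature vectors, e.g. [noise_floor, loss_rate, spectral_stat]
X_train = [
    [0.12, 0.00, 0.41],   # PSTN-only call
    [0.35, 0.02, 0.77],   # cellular -> VoIP
    [0.31, 0.04, 0.80],   # cellular -> VoIP
    [0.10, 0.00, 0.39],   # PSTN-only call
]
y_train = ["pstn", "cell+voip", "cell+voip", "pstn"]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Classify an unseen call from its (hypothetical) measured features.
print(clf.predict([[0.33, 0.03, 0.79]]))  # expected: ['cell+voip']
</pre>
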
<small>Jan 2012</small>
<bibtex>
@article{10.1109/CIS.2011.180,
author = {Vrizlynn L.L. Thing and Tong-Wei Chua and Ming-Lee Cheong},
title = {Design of a Digital Forensics Evidence Reconstruction System for Complex and Obscure Fragmented File Carving},
journal = {Computational Intelligence and Security, International Conference on},
volume = {0},
isbn = {978-0-7695-4584-4},
year = {2011},
pages = {793-797},
doi = {http://doi.ieeecomputersociety.org/10.1109/CIS.2011.180},
publisher = {IEEE Computer Society},
address = {Los Alamitos, CA, USA},
}
</bibtex>

<small>Dec 2011</small>
<bibtex>
@INPROCEEDINGS{5931110,
author={Baier, H. and Breitinger, F.},
booktitle={IT Security Incident Management and IT Forensics (IMF), 2011 Sixth International Conference on},
title={Security Aspects of Piecewise Hashing in Computer Forensics},
year={2011},
month={may},
pages={21--36},
keywords={MD5 hash function;SHA-1 hash function;computer forensics;cryptographic hash function;piecewise hashing security aspect;pseudorandom number generator;security analysis;computer forensics;cryptography;random number generation;},
doi={10.1109/IMF.2011.16},
abstract="Although hash functions are a well-known method in computer science to map arbitrary large data to bit strings of a fixed length, their use in computer forensics is currently very limited. As of today, in a pre-step process hash values of files are generated and stored in a database, typically a cryptographic hash function like MD5 or SHA-1 is used. Later the investigator computes hash values of files, which he finds on a storage medium, and performs look ups in his database. This approach has several drawbacks, which have been sketched in the community, and some alternative approaches have been proposed. The most popular one is due to Jesse Kornblum, who transferred ideas from spam detection to computer forensics in order to identify similar files. However, his proposal lacks a thorough security analysis. It is therefore one aim of the paper at hand to present some possible attack vectors of an active adversary to bypass Kornblum's approach. Furthermore, we present a pseudo random number generator being both more efficient and more random compared to Kornblum's pseudo random number generator."
}
</bibtex>
Although hash functions are a well-known method in computer science to map arbitrary large data to bit strings of a fixed length, their use in computer forensics is currently very limited. As of today, in a pre-step process hash values of files are generated and stored in a database, typically a cryptographic hash function like MD5 or SHA-1 is used. Later the investigator computes hash values of files, which he finds on a storage medium, and performs look ups in his database. This approach has several drawbacks, which have been sketched in the community, and some alternative approaches have been proposed. The most popular one is due to Jesse Kornblum, who transferred ideas from spam detection to computer forensics in order to identify similar files. However, his proposal lacks a thorough security analysis. It is therefore one aim of the paper at hand to present some possible attack vectors of an active adversary to bypass Kornblum's approach. Furthermore, we present a pseudo random number generator being both more efficient and more random compared to Kornblum's pseudo random number generator.

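For readers unfamiliar with Kornblum's approach, the sketch below shows the general shape of context-triggered piecewise hashing: a rolling hash over a small sliding window decides where one piece ends and the next begins, and each piece is then hashed conventionally. This is a deliberately simplified stand-in, not ssdeep and not the scheme analyzed in the paper; the window size, trigger value and per-piece hash are arbitrary choices here.

<pre>
import hashlib

WINDOW = 7          # bytes in the rolling-hash window (assumed value)
BLOCK_SIZE = 4096   # controls average piece size (assumed value)
B, M = 257, 2**31 - 1          # Rabin-Karp base and modulus
B_POW = pow(B, WINDOW - 1, M)  # B^(WINDOW-1) for removing the oldest byte

def piecewise_hashes(data):
    """Split data at content-defined trigger points and hash each piece.

    A Rabin-Karp rolling hash over the last WINDOW bytes decides the piece
    boundaries, so boundaries depend only on local content; each piece is then
    hashed with MD5 and truncated. Real schemes (e.g. ssdeep) differ in detail.
    """
    pieces, rolling, start = [], 0, 0
    for i, byte in enumerate(data):
        if i >= WINDOW:                      # slide the window forward
            rolling = (rolling - data[i - WINDOW] * B_POW) % M
        rolling = (rolling * B + byte) % M
        if i + 1 >= WINDOW and rolling % BLOCK_SIZE == BLOCK_SIZE - 1:
            pieces.append(hashlib.md5(data[start:i + 1]).hexdigest()[:12])
            start = i + 1
    if start < len(data):                    # trailing piece
        pieces.append(hashlib.md5(data[start:]).hexdigest()[:12])
    return pieces

if __name__ == "__main__":
    # Placeholder file names; compare how many piece hashes two files share.
    a = open("file_a.bin", "rb").read()
    b = open("file_b.bin", "rb").read()
    ha, hb = piecewise_hashes(a), piecewise_hashes(b)
    print(f"{len(set(ha) & set(hb))} shared piece hashes ({len(ha)} vs {len(hb)})")
</pre>
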
<small>August 2011</small>
<bibtex>
@article{beverly:ipcarving,
author = "Robert Beverly and Simson Garfinkel and Gregory Cardwell",
journal = "Digital Investigation",
publisher="Elsevier",
booktitle = {Proc. of the Eleventh Annual DFRWS Conference},
title = "Forensic Carving of Network Packets and Associated Data Structures",
volume=8,
year = 2011,
abstract="Using validated carving techniques, we show that popular operating systems (\eg Windows, Linux, and OSX) frequently have residual IP packets, Ethernet frames, and associated data structures present in system memory from long-terminated network traffic. Such information is useful for many forensic purposes including establishment of prior connection activity and services used; identification of other systems present on the system's LAN or WLAN; geolocation of the host computer system; and cross-drive analysis. We show that network structures can also be recovered from memory that is persisted onto a mass storage medium during the course of system swapping or hibernation. We present our network carving techniques, algorithms and tools, and validate these against both purpose-built memory images and a readily available forensic corpora. These techniques are valuable to both forensics tasks, particularly in analyzing mobile devices, and to cyber-security objectives such as malware analysis."
}
</bibtex>
<i>Using validated carving techniques, we show that popular operating systems (e.g. Windows, Linux, and OSX) frequently have residual IP packets, Ethernet frames, and associated data structures present in system memory from long-terminated network traffic. Such information is useful for many forensic purposes including establishment of prior connection activity and services used; identification of other systems present on the system's LAN or WLAN; geolocation of the host computer system; and cross-drive analysis. We show that network structures can also be recovered from memory that is persisted onto a mass storage medium during the course of system swapping or hibernation. We present our network carving techniques, algorithms and tools, and validate these against both purpose-built memory images and a readily available forensic corpora. These techniques are valuable to both forensics tasks, particularly in analyzing mobile devices, and to cyber-security objectives such as malware analysis.</i>

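A very small taste of what packet carving involves: the sketch below scans a memory image for byte sequences that look like IPv4 headers, checking the version/IHL byte, a plausible total length, and the header checksum. It is a naive illustration of structure validation, not the authors' carver, and the image file name is a placeholder.

<pre>
import struct

def checksum_ok(header):
    """Validate an IPv4 header checksum (one's-complement sum over 16-bit words)."""
    if len(header) % 2:
        return False
    total = 0
    for (word,) in struct.iter_unpack("!H", header):
        total += word
        total = (total & 0xFFFF) + (total >> 16)   # fold carries
    return total == 0xFFFF

def find_ipv4_headers(image):
    """Yield offsets in a raw memory image that plausibly start an IPv4 header."""
    for offset in range(len(image) - 20):
        first = image[offset]
        version, ihl = first >> 4, first & 0x0F
        if version != 4 or ihl < 5:
            continue
        header_len = ihl * 4
        if offset + header_len > len(image):
            continue
        total_length = struct.unpack_from("!H", image, offset + 2)[0]
        if total_length < header_len:
            continue
        if checksum_ok(image[offset:offset + header_len]):
            yield offset

if __name__ == "__main__":
    data = open("memory.img", "rb").read()   # placeholder image name
    for off in find_ipv4_headers(data):
        src = ".".join(str(b) for b in data[off + 12:off + 16])
        dst = ".".join(str(b) for b in data[off + 16:off + 20])
        print(f"candidate IPv4 header at offset {off}: {src} -> {dst}")
</pre>
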
<small>July 2011</small>
<bibtex>
@article{fiorillo-flash,
title="Theory and practice of flash memory mobile forensics",
year=2009,
author="Salvatore Fiorillo",
url="http://ro.ecu.edu.au/adf/67/",
publisher="School of Computer and Information Science, Edith Cowan University, Perth, Western Australia",
abstract="This paper is an introduction to flash memory forensics with a special focus on completeness of evidence acquired from mobile phones. Moving through academic papers and industrial documents will be introduced the particular nature of non-volatile memories present in nowadays mobile phones; how they really work and which challenges they pose to forensic investigators. Then will be presented an advanced test in which some brand new flash memories have been used to hide data in man-made bad blocks: the aim is to verify if forensic software tools are able to acquire data from such blocks, and to evaluate the possibility to hide data at analysts’ eyes."
}
</bibtex>

<small>June 2011</small>
<bibtex>
@PhDThesis{Kessler2010,
title="Judges’ Awareness, Understanding, and Application of Digital Evidence",
author="Gary Craig Kessler",
year=2010,
institution="Graduate School of Computer and Information Sciences, Nova Southeastern University",
url="http://www.forensicswiki.org/wiki/File:Kessler_judges%26de.pdf"
}
</bibtex>

<small>Spring 2011</small>
[[Solid State Drive (SSD) Forensics]]<br>
We now have a new page on SSD forensics. The page has some basic information and a growing bibliography. One of the first entries is:
<bibtex>
@inproceedings{wei2011,
  author = {Michael Wei and Laura M. Grupp and Frederick M. Spada and Steven Swanson},
  title = {Reliably Erasing Data from Flash-Based Solid State Drives},
  booktitle={FAST 2011},
  year = 2011,
  keywords = {erasing flash security ssd},
  url={http://cseweb.ucsd.edu/users/m3wei/assets/pdf/FMS-2010-Secure-Erase.pdf},
  biburl = {http://www.bibsonomy.org/bibtex/27c408ad559fc19f829717f485707a909/schmidt2}
}
</bibtex>

<small>JULY-2010</small>
;Sleuth Kit and Open Source Digital Forensics Conference

The slides from the first Sleuth Kit and Open Source Digital Forensics Conference are now available online:

* http://www.basistech.com/conference/2010/digital-forensics-agenda.html

Highlights include:
* [http://www.basistech.com/conference/2010/osdf-slides/Carrier-SleuthKitOverview.pdf The Sleuth Kit Overview and Automated Scanning Features] (Brian Carrier)
* [http://www.basistech.com/conference/2010/osdf-slides/Butler-Schiffer-Mandiant-Open-Source-Digital-Forensics.pdf Faster Response with Sleuth Kit and Other Open Source Technologies] (Jamie Butler and Jason Shiffer)
* [http://www.basistech.com/conference/2010/osdf-slides/Forte-PTK-Forensics.pdf PTK Forensics after Two years: Past, Present and Future] (Dario Forte)
* [http://www.basistech.com/conference/2010/osdf-slides/Joyce-Mac-Forensics-SleuthKit.pdf Mac Forensic Tools Using Sleuth Kit] (Rob Joyce)
* [http://www.basistech.com/conference/2010/osdf-slides/Altheide-Commando-Forensics.pdf Commando Forensics: What Dongle?] (Cory Altheide)
* [http://www.basistech.com/conference/2010/osdf-slides/Garfinkel-AFFLIB.pdf AFF and AFF4: Where we are, where we are going, and why it matters to you.] (Simson Garfinkel)
* [http://www.basistech.com/conference/2010/osdf-slides/Carvey-Timelines.pdf Timeline Creation using Open-Source Tools] (Harlan Carvey)

<small>MARCH-2010</small>
;[http://portal.acm.org/citation.cfm?id=1592451.1592455 Internet geolocation: Evasion and counterevasion]
;ACM Computing Surveys (CSUR), Volume 42, Issue 1 (December 2009)
<blockquote>
Internet geolocation technology aims to determine the physical (geographic) location of Internet users and devices. It is currently proposed or in use for a wide variety of purposes, including targeted marketing, restricting digital content sales to authorized jurisdictions, and security applications such as reducing credit card fraud. This raises questions about the veracity of claims of accurate and reliable geolocation. We provide a survey of Internet geolocation technologies with an emphasis on adversarial contexts; that is, we consider how this technology performs against a knowledgeable adversary whose goal is to evade geolocation. We do so by examining first the limitations of existing techniques, and then, from this base, determining how best to evade existing geolocation techniques. We also consider two further geolocation techniques which may be of use even against adversarial targets: (1) the extraction of client IP addresses using functionality introduced in the 1.5 Java API, and (2) the collection of round-trip times using HTTP refreshes. These techniques illustrate that the seemingly straightforward technique of evading geolocation by relaying traffic through a proxy server (or network of proxy servers) is not as straightforward as many end-users might expect. We give a demonstration of this for users of the popular Tor anonymizing network.</blockquote>

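One of the two techniques above boils down to timing: a server that can provoke repeated requests from a client can estimate the client's network round-trip time and compare it with what the claimed location would imply. The sketch below measures a rough client-side analogue, the TCP connect time to a host, over several attempts; the host names and sample count are arbitrary, and this only illustrates RTT measurement, not the paper's HTTP-refresh method.

<pre>
import socket
import time

def tcp_connect_rtt(host, port=80, samples=5, timeout=3.0):
    """Return the minimum TCP connect time (seconds) over several attempts.

    Taking the minimum over several samples is a common way to approximate
    the underlying network RTT while discounting transient queuing delay.
    """
    best = None
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                elapsed = time.perf_counter() - start
        except OSError:
            continue
        best = elapsed if best is None else min(best, elapsed)
    return best

if __name__ == "__main__":
    for host in ["example.com", "example.org"]:   # placeholder hosts
        rtt = tcp_connect_rtt(host)
        if rtt is not None:
            print(f"{host}: ~{rtt * 1000:.1f} ms connect time")
</pre>
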
<small>FEB-2010</small>
[http://www.ojp.usdoj.gov/nij/journals/259/csi-effect.htm The 'CSI Effect': Does It Really Exist?], by The Honorable Donald E. Shelton

Crime and courtroom proceedings have long been fodder for film and television scriptwriters. In recent years, however, the media's use of the courtroom as a vehicle for drama has not only proliferated, it has changed focus. In apparent fascination with our criminal justice process, many of today's courtroom dramas are based on actual cases. Court TV offers live gavel-to-gavel coverage of trials over the Internet for $5.95 a month. Now, that's "reality television"!

Reality and fiction have begun to blur with crime magazine television shows such as 48 Hours Mystery, American Justice, and even, on occasion, Dateline NBC. These programs portray actual cases, but only after extensively editing the content and incorporating narration for dramatic effect. Presenting one 35-year-old cold case, for example, 48 Hours Mystery filmed for months to capture all pretrial hearings as well as the 2-week trial; the program, however, was ultimately edited to a 1-hour episode that suggested the crime remained a "mystery" . . . notwithstanding the jury's guilty verdict....

<small>JAN-2010</small>
[http://hal.archives-ouvertes.fr/docs/00/35/09/62/PDF/ColDanDauDef09.pdf Using Graphics Processors for Parallelizing Hash-based Data Carving], by Sylvain Collange, Marc Daumas, Yoginder S. Dandass, and David Defour, Proceedings of the 42nd Hawaii International Conference on System Sciences, 2009.

Abstract

The ability to detect fragments of deleted image files and to reconstruct these image files from all available fragments on disk is a key activity in the field of digital forensics. Although reconstruction of image files from the file fragments on disk can be accomplished by simply comparing the content of sectors on disk with the content of known files, this brute-force approach can be time consuming.

This paper presents results from research into the use of Graphics Processing Units (GPUs) in detecting specific image file byte patterns in disk clusters. A unique identifying pattern for each disk sector is compared against patterns in known images. A pattern match indicates the potential presence of an image and flags the disk sector for further in-depth examination to confirm the match. The GPU-based implementation outperforms the software implementation by a significant margin.

<small>December-2009</small>
;'''[http://blogs.sans.org/computer-forensics/2009/02/04/what-happens-when-you-overwrite-data/ What happens when you overwrite data?]'''
Data recovery researcher Craig S. Wright explores what happens when you try to recover overwritten data using high-quality scientific equipment. His conclusion: "The values do not tell you what existed on the drive prior to the wipe; they just allow you to make a guess, bit by bit. Each time you guess, you compound the error. As recovering a single bit value has little if any forensic value, you soon find that the cumulative errors render any recovered data worthless."

<small>November-2009</small>
;'''[http://www.computer-forensics-lab.org/pdf/Linux_for_computer_forensic_investigators.pdf Linux for computer forensic investigators: «pitfalls» of mounting file systems] [http://computer-forensics-lab.org/lib/?cid=174 (Russian version)], Suhanov Maxim, 2009'''

The paper opens a discussion about building forensically sound Live CD distributions based on Linux. The problems described include:
* Common misconceptions about the "-o ro" mount option (is it forensically sound?);
* Bugs in many forensic Live CDs that alter the data on evidentiary media.

Denis Frati ([[CAINE Live CD|CAINE]] developer) wrote an [http://www.denisfrati.it/pdf/Suhanov_Maxim_bug.pdf excellent review (Italian)] of the bug found in the Casper scripts.

<small>September-2009</small>
;'''[http://www.blackhat.com/presentations/bh-dc-08/FX/Whitepaper/bh-dc-08-fx-WP.pdf Cisco IOS Forensics]'''
"Cisco System’s routers running Cisco IOS are still the prevalent routing platform on the Internet and corporate networks. Their huge population, architectural deficiencies and hugely diverse version distribution make them a valuable target that gains importance as common operating system platforms are closed down and secured.
This paper takes the position that the currently used, well accepted practices for monitoring, debugging and post mortem crash analysis are insufficient to deal with the threat of compromised IOS devices. It sets forth a different method that reduces the requirement for constant logging, favoring on-demand in-depth analysis in case of suspicion or actual device crashes. The paper concludes by presenting the current state in the development of software supporting the proposed method and requesting feedback from the community on the software’s future directions."

<small>July-2009</small>
;'''[http://viaforensics.com/wpinstall/wp-content/uploads/2009/08/Android-Forensics-Andrew-Hoog-viaForensics.pdf Android Forensics]'''
:Presentation on [http://viaforensics.com/android Android Forensics] by Andrew Hoog, Mobile Forensics World 2009. The presentation gives an overview of Android, explains how to root phones, and shows how to extract data from a phone once you have superuser access. One of the complications is that Android phones (like the T-Mobile G1) use YAFFS2, a flash-specific file system.

  
 
<small>2009-June</small>

Recovery of Damaged Compressed Files for Digital Forensic Purposes, Bora Park, A. Savoldi, P. Gubian, Jungheum Park, Seok Hee Lee, and Sangjin Lee

Korea University, Seoul. International Conference on Multimedia and Ubiquitous Engineering (MUE 2008)

Abstract:

Nowadays compressed files are very widespread and can be considered, without any doubt, with regard to the Digital Forensic realm, an important and precious source of probatory data. This is especially true when in a digital investigation the examiner has to deal with corrupted compressed files, which have been gathered in the collection phase of the investigative process. Therefore, in the computer forensic field, data recovery technologies are very important for acquiring useful pieces of data which can become, in a court of law, digital evidence. This kind of technology is used not only by law enforcement, but also by the multitude of users in their daily activities, which justifies the relevant presence of tools in the software market which are devoted to rescuing data from damaged compressed files. However, state-of-the-art data recovery tools have many limitations with regard to the capability of recovering the original data, especially in the case of damaged compressed files. So far, such recovery tools have been based on a check of the signature/header of the file, which provides the offset to the raw compressed data block. As a result, they cannot recover the compressed files if the first part of the raw compressed data block, which pertains to the header, is damaged or the signature/header block is corrupted. Therefore, in order to deal with this issue, we have developed a new tool capable of rescuing damaged compressed files, according to the DEFLATE compression scheme, even though the header block is missing or corrupted. This represents a new interesting opportunity for the digital forensic discipline.

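The idea of recovering raw DEFLATE data when the container's header is gone can be illustrated with a brute-force scan: try to inflate the stream from successive byte offsets and keep any offset that yields a substantial amount of output. This is only a naive illustration, not the authors' tool; the file name and minimum-output threshold are assumptions, and zlib's raw-deflate mode (wbits=-15) does the actual decompression.

<pre>
import zlib

MIN_OUTPUT = 1024  # assumed threshold: ignore offsets that inflate to little data

def find_deflate_streams(data, max_offsets=None):
    """Yield (offset, decompressed_bytes) where raw DEFLATE decoding succeeds.

    Tries every byte offset as a potential start of a raw DEFLATE stream
    (zlib wbits=-15 means no zlib/gzip header), which is what is left when a
    container's signature/header block is missing or corrupted.
    """
    end = len(data) if max_offsets is None else min(len(data), max_offsets)
    for offset in range(end):
        d = zlib.decompressobj(-15)
        try:
            out = d.decompress(data[offset:])
            out += d.flush()
        except zlib.error:
            continue
        if len(out) >= MIN_OUTPUT:
            yield offset, out

if __name__ == "__main__":
    blob = open("damaged_archive.bin", "rb").read()   # placeholder file name
    for offset, out in find_deflate_streams(blob):
        print(f"plausible DEFLATE stream at offset {offset}, {len(out)} bytes recovered")
        break  # the first hit is usually enough for a quick look
</pre>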


<small>2009-May</small>

Overcoming Impediments to Cell Phone Forensics, Wayne Jansen, Aurelien Delaitre, and Ludovic Moenner, Proceedings of the 41st Hawaii International Conference on System Sciences, 2008
Cell phones are an emerging but rapidly growing area of computer forensics. While cell phones are becoming more like desktop computers functionally, their organization and operation are quite different in certain areas. For example, most cell phones do not contain a hard drive and rely instead on flash memory for persistent storage. Cell phones are also designed more as special purpose appliances that perform a set of predefined tasks using proprietary embedded software, rather than general-purpose extensible systems that run common operating system software. Such differences make the application of classical computer forensic techniques difficult. Also complicating the situation is the state of the art of present day cell phone forensic tools themselves and the way in which tools are applied. This paper identifies factors that impede cell phone forensics and describes techniques to address two resulting problems in particular: the limited coverage of available phone models by forensic tools, and the inadequate means for validating the correct functioning of forensic tools.

<small>2009-Apr</small>

A Framework for Automated Digital Forensic Reporting, Lt. Paul Farrell, Master's Thesis, Naval Postgraduate School, Monterey, CA, March 2009
Forensic analysis is the science of finding, examining and analyzing evidence in support of law enforcement, regulatory compliance or information gathering. Today, almost all digital forensic analysis is done by humans, requiring dedicated training and consuming man-hours at a considerable rate. As storage sizes increase and digital forensics gain importance in investigations, the backlog of media requiring human analysis has increased as well. This thesis tests today's top-of-the-line commercial and open source forensic tools with the analysis of a purpose-built Windows XP computer system containing two users that engaged in email, chat and web browsing. It presents the results of a pilot user study of the PyFlag forensic tool. Finally, it presents a technique to use software to do a preliminary analysis on media and provide a human-readable report to the examiner. The goal of the technique is to perform rapid triaging of media and allow the human examiner to better prioritize man-hours towards media with high return on investment.


<small>2009-Feb-08</small>

The impact of full disk encryption on digital forensics, ACM SIGOPS Operating Systems Review, Volume 42, Issue 3 (April 2008), Pages 93-98
The integration of strong encryption into operating systems is creating challenges for forensic examiners, potentially preventing us from recovering any digital evidence from a computer. Because strong encryption cannot be circumvented without a key or passphrase, forensic examiners may not be able to access data after a computer is shut down, and must decide whether to perform a live forensic acquisition. In addition, with encryption becoming integrated into the operating system, in some cases, virtualization is the most effective approach to performing a forensic examination of a system with FDE. This paper presents the evolution of full disk encryption (FDE) and its impact on digital forensics. Furthermore, by demonstrating how full disk encryption has been dealt with in past investigations, this paper provides forensics examiners with practical techniques for recovering evidence that would otherwise be inaccessible.


<small>2009-JAN-01</small>

Forensic Investigation of the Nintendo Wii: A First Glance, Dr. Benjamin Turnbull, Small Scale Digital Device Forensics Journal, Vol. 2, No. 1, June 2008, ISSN 1941-6164
The closed nature of the Wii makes it a challenging game console for forensic analysis. This article takes a first look at the platform, discussing the various places where forensically interesting information may be hidden. There is also a reference to an interesting article about how the ability of Wiimotes to move avatars from one system to another documented a woman's affair while her husband was serving in Iraq.

<small>2008-Nov-18</small>

Data Acquisition from Cell Phone using Logical Approach, Keonwoo Kim, Dowon Hong, Kyoil Chung, and Jae-Cheol Ryou, Proceedings of World Academy of Science, Engineering and Technology, Volume 26, December 2007, ISSN 1307-6884
This article discusses three approaches for acquiring data from cell phones: physically removing the flash RAM chips and reading them directly; reading the data out using the JTAG interface; and running software on the cell phone to extract the files at a logical level. The authors have built a logical extraction system and are working on a system based on JTAG.

<small>2008-Oct-30</small>

Semi-Supervised Named Entity Recognition: Learning to Recognize 100 Entity Types with Little Supervision, David Nadeau, PhD Thesis, University of Ottawa, 2007.
Named Entity Recognition is the process of analyzing text documents and automatically identifying the Who, What, Where and Whens. David Nadeau's thesis presents a novel approach which bootstraps a named entity recognizer using semi-structured documents on the web. Few forensic tools use NER today, but that may well change in the future. What makes this thesis so interesting to read is that it also has a history of the last 20 years or so of the field. Highly recommended.

<small>2008-Oct-24</small>

Advanced JPEG carving, Michael I. Cohen, Proceedings of the 1st international conference on Forensic applications and techniques in telecommunications, information, and multimedia and workshop
Michael I. Cohen presents a fully automated carver which can carve fragmented JPEGs using typical fragmentation patterns. (Cohen argues that the DFRWS carving challenges do not represent typical fragmentation patterns.)

<small>2008-Oct-18</small>

Threats to Privacy in the Forensic Analysis of Database Systems
Patrick Stahlberg, Gerome Miklau, and Brian Neil Levine
Proceedings of the 2007 ACM SIGMOD international conference on Management of data
Beijing, China.
This paper looks at residual data left behind in databases after DELETE, UPDATE, and VACUUM operations. The authors show that residual data is a real issue in databases, and that it's pretty easy to modify a database so that no residual data is left behind. MySQL with MyISAM tables has clean delete, but InnoDB does not. Very much worth reading.
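
The residual-data problem is easy to observe for yourself. The short sketch below uses SQLite (a different engine from those studied in the paper, chosen only because it ships with Python) to show that a deleted row's content typically remains in the raw database file until the database is vacuumed; the file and marker names are arbitrary.

<pre>
import os
import sqlite3

DB = "residual_demo.db"          # throwaway database file
MARKER = b"SECRET-RECORD-12345"  # easy-to-spot payload

def contains_marker():
    with open(DB, "rb") as f:
        return MARKER in f.read()

if os.path.exists(DB):
    os.remove(DB)

conn = sqlite3.connect(DB, isolation_level=None)  # autocommit mode
conn.execute("CREATE TABLE notes (body TEXT)")
conn.execute("INSERT INTO notes VALUES (?)", (MARKER.decode(),))

conn.execute("DELETE FROM notes")
print("after DELETE, marker still in file:", contains_marker())  # usually True

conn.execute("VACUUM")   # rebuilds the file, dropping free pages
conn.close()
print("after VACUUM, marker still in file:", contains_marker())  # usually False
</pre>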

<small>2008-Aug-13</small>

Lest We Remember: Cold Boot Attacks on Encryption Keys
J. Alex Halderman, Princeton University; Seth D. Schoen, Electronic Frontier Foundation; Nadia Heninger and William Clarkson, Princeton University; William Paul, Wind River Systems; Joseph A. Calandrino and Ariel J. Feldman, Princeton University; Jacob Appelbaum; Edward W. Felten, Princeton University
USENIX Security '08 Refereed Paper
Awarded Best Student Paper
Memory analysis is of increasing interest in forensic research, both because some new malware resides only in memory, and because memory analysis is frequently the only way for analysts to get the keys that are used to protect cryptographic file systems. In this paper the authors show that cryptographic keys in memory are vulnerable to exploitation after the computer is turned off. The authors show that the contents of dynamic RAM are retained for seconds, and sometimes minutes, after power is turned off. By chilling the memory, the data can be retained as long as necessary. And while most laptops wipe their memory when they reboot, the authors show that the chilled memory can be moved from one laptop that wipes to another laptop that does not. Finally, the authors show that it is possible to find the cryptographic keys in memory and correct random bit errors by using the AES key schedule as an error-correcting code. The authors demonstrate an attack USB stick which reboots a computer protected with BitLocker, finds the cryptographic keys, and then allows access to the cleartext information on the disk.

<small>2008-July-27</small>

The Symposium on Usable Privacy and Security (2008) concluded this past week in Pittsburgh, PA. One paper of interest to the network forensics crowd is The Challenges of Using an Intrusion Detection System: Is It Worth the Effort?, by Rodrigo Werlinger, Kirstie Hawkey, Kasia Muldner, Pooya Jaferian and Konstantin Beznosov (slides).
In this article, the authors conducted interviews with 9 IT security practitioners who have worked with IDSs and performed ethnographic observations within an organization that was deploying a new IDS. They found that security practitioners were heavily divided on the value of the IDS, and learned that an IDS really only generates value if the network is well understood before the IDS is deployed.

<small>2008-July-20</small>

The International Journal of Digital Evidence is one of two publications by the Electronic Crime Institute (ECI) at Utica College. Current and previous issues are available online.
The current Fall 2007 issue has an interesting article, Mobile Phone Forensics Tool Testing: A Database Driven Approach, by Baggili, Mislan, and Rogers at Purdue University. Given that phones are increasingly a primary source of forensic information in many cases, we need to be sure that the tools used for forensic analysis present data that is accurate and repeatable. Unfortunately they frequently are not, because there are so many different kinds of phones on the market and the forensic tools lag far behind it.

Ibrahim M. Baggili, Richard Mislan, Marcus Rogers, Mobile Phone Forensics Tool Testing: A Database Driven Approach, International Journal of Digital Evidence 6, 2007. http://www.utica.edu/academic/institutes/ecii/publications/articles/1C33DF76-D8D3-EFF5-47AE3681FD948D68.pdf


<small>2008-July-12</small>

Anandabrata Pal, Taha Sencar, Nasir Memon, Detecting File Fragmentation Point Using Sequential Hypothesis Testing, DFRWS 2008. http://www.digital-assembly.com/technology/research/pubs/dfrws2008.pdf

This DFRWS 2008 article presents an improved approach for carving fragmented JPEGs using sequential hypothesis testing. According to the authors, "The technique begins with a header block identifying the start of a file and then attempts to validate via SHT each subsequent block following the header block. The fragmentation point is identified when SHT identifies a block as not belonging to the file. By utilizing this technique, we are able to correctly and efficiently recover JPEG images from the DFRWS 2006 [1] and 2007 [2] test sets even in the presence of tens of thousands of blocks and files fragmented into 3 or more parts. The bifragment gap carving technique enhanced with SHT allows us to improve the performance result of DFRWS 2006 challenge test-sets, although the technique cannot be used for DFRWS 2007. We then show how Parallel Unique Path enhanced with SHT is able to recover all fragmented JPEGs from DFRWS 2006 and all recoverable JPEGs from 2007 challenge test-sets. As far as we are aware, no other automated technique can recover multi-fragmented JPEGs from the DFRWS 2007 test set."
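
The sequential hypothesis test at the heart of this approach can be sketched generically. Below is a minimal sequential probability ratio test (SPRT) skeleton: for each candidate block it accumulates the log-likelihood ratio of "belongs to the file" versus "does not belong" and stops as soon as either threshold is crossed. The per-block probability model here (a single match score with assumed distributions) is a placeholder for the paper's JPEG-specific statistics.

<pre>
import math

def sprt_fragmentation_point(scores, p_match=0.9, p_nonmatch=0.3,
                             alpha=0.01, beta=0.01):
    """Scan per-block match scores in [0, 1] and return the index at which the
    test decides the file has stopped matching, or None if no decision is made.

    scores      : sequence of per-block "looks like a continuation" scores
    p_match     : assumed probability of a high score if the block belongs (H0)
    p_nonmatch  : assumed probability of a high score if it does not (H1)
    alpha, beta : tolerated false-positive / false-negative rates
    """
    upper = math.log((1 - beta) / alpha)   # accept H1: fragmentation point
    lower = math.log(beta / (1 - alpha))   # accept H0: block still belongs
    llr = 0.0
    for i, s in enumerate(scores):
        # Treat each score as a Bernoulli "high score" observation.
        observed_high = s >= 0.5
        if observed_high:
            llr += math.log(p_nonmatch / p_match)
        else:
            llr += math.log((1 - p_nonmatch) / (1 - p_match))
        if llr >= upper:
            return i            # evidence says this region no longer matches
        if llr <= lower:
            llr = 0.0           # confident the file continues; restart the test
    return None

if __name__ == "__main__":
    # Hypothetical scores: the file clearly continues for 6 blocks, then stops.
    demo = [0.9, 0.8, 0.95, 0.7, 0.85, 0.9, 0.1, 0.2, 0.05]
    print("fragmentation detected near block", sprt_fragmentation_point(demo))
</pre>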



<small>2008-July-5</small>

Yoginder Singh Dandass, Nathan Joseph Necaise, Sherry Reede Thomas, An Empirical Analysis of Disk Sector Hashes for Data Carving, Journal of Digital Forensic Practice 2:95-106, 2008. http://www.informaworld.com/10.1080/15567280802050436

Dandass et al. analyzed 528 million sectors from 433,630 unique files. They computed the CRC32, CRC64, MD5 and SHA-1 of each sector. Not surprisingly, they find that the MD5s and SHA-1s of the sectors are different if the sectors are different. They find 94 CRC64 collisions and 30 million CRC32 collisions. The conclusion is that, if you are searching for a single sector or building a database of single-sector hashes, you are better off building a database of CRC64s, because they are easier to store and dramatically faster to calculate than the traditional hash functions, and they are nearly as accurate.
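
A small experiment in the same spirit can be run with Python's standard library, which has CRC32 and MD5 but no CRC64: the sketch below hashes every 512-byte sector of a set of files with both functions and counts how many CRC32 values are shared by sectors whose MD5s differ. The file names are placeholders.

<pre>
import hashlib
import zlib
from collections import defaultdict

SECTOR_SIZE = 512

def sectors(path):
    """Yield the 512-byte sectors of a file."""
    with open(path, "rb") as f:
        while True:
            block = f.read(SECTOR_SIZE)
            if not block:
                break
            yield block

def crc_collisions(paths):
    """Count CRC32 values shared by sectors with different MD5 digests."""
    md5s_by_crc = defaultdict(set)
    for path in paths:
        for sector in sectors(path):
            crc = zlib.crc32(sector) & 0xFFFFFFFF
            md5s_by_crc[crc].add(hashlib.md5(sector).digest())
    return sum(1 for md5s in md5s_by_crc.values() if len(md5s) > 1)

if __name__ == "__main__":
    # Placeholder corpus; point this at a real file listing in practice.
    corpus = ["sample1.bin", "sample2.bin"]
    print("CRC32 values with colliding distinct sectors:", crc_collisions(corpus))
</pre>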