Past Selected Articles

From Forensics Wiki
Revision as of 17:52, 23 July 2009 by Simsong (Talk | contribs)


Archived past selected research articles


Recovery of Damaged Compressed Files for Digital Forensic Purposes, Bora Park, A. Savoldi, P. Gubian, Jungheum Park, Seok Hee Lee, and Sangjin Lee

Korea University, Seoul; International Conference on Multimedia and Ubiquitous Engineering (MUE 2008), 2008.


Nowadays compressed files are very widespread and can be considered, without any doubt, an important and precious source of probatory data in the digital forensic realm. This is especially true when, in a digital investigation, the examiner has to deal with corrupted compressed files gathered in the collection phase of the investigative process. Therefore, in the computer forensic field, data recovery technologies are very important for acquiring useful pieces of data which can become, in a court of law, digital evidence. This kind of technology is used not only by law enforcement, but also by the multitude of users in their daily activities, which justifies the strong presence of tools in the software market devoted to rescuing data from damaged compressed files. However, state-of-the-art data recovery tools have many limitations with regard to recovering the original data, especially in the case of damaged compressed files. So far, such recovery tools have been based on a method which checks the signature/header of the file and thus provides the offset to the raw compressed data block. As a result, they cannot recover a compressed file if the first part of the raw compressed data block, which pertains to the header, is damaged, or if the signature/header block is corrupted. Therefore, in order to deal with this issue, we have developed a new tool capable of rescuing damaged compressed files, according to the DEFLATE compression scheme, even though the header block is missing or corrupted. This represents a new and interesting opportunity for the digital forensic discipline.
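The paper's core observation, that raw DEFLATE data can be located even when the container's signature/header is gone, can be illustrated with a short sketch. This is not the authors' tool; it is a minimal brute-force scan (the function name and the output threshold are invented for illustration) that tries every offset of a damaged file as the start of a raw DEFLATE stream and keeps offsets that decompress to a non-trivial amount of data:

```python
import zlib

def find_deflate_streams(data: bytes, min_output: int = 64):
    """Scan a damaged file for offsets where a raw DEFLATE stream
    (RFC 1951, no zlib/gzip header) starts decompressing successfully."""
    hits = []
    for offset in range(len(data)):
        d = zlib.decompressobj(wbits=-15)  # negative wbits = raw DEFLATE
        try:
            out = d.decompress(data[offset:])
        except zlib.error:
            continue  # not a plausible stream start; try the next offset
        if len(out) >= min_output:  # discard trivial/accidental matches
            hits.append((offset, out))
    return hits
```

Most junk offsets fail within the first few bytes (an invalid block type or stored-block length check), which is what makes this brute-force approach tolerable in practice.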


Overcoming Impediments to Cell Phone Forensics, Wayne Jansen, Aurelien Delaitre, and Ludovic Moenner, Proceedings of the 41st Hawaii International Conference on System Sciences - 2008
Cell phones are an emerging but rapidly growing area of computer forensics. While cell phones are becoming more like desktop computers functionally, their organization and operation are quite different in certain areas. For example, most cell phones do not contain a hard drive and rely instead on flash memory for persistent storage. Cell phones are also designed more as special-purpose appliances that perform a set of predefined tasks using proprietary embedded software, rather than general-purpose extensible systems that run common operating system software. Such differences make the application of classical computer forensic techniques difficult. Also complicating the situation is the state of the art of present-day cell phone forensic tools themselves and the way in which those tools are applied. This paper identifies factors that impede cell phone forensics and describes techniques to address two resulting problems in particular: the limited coverage of available phone models by forensic tools, and the inadequate means for validating the correct functioning of forensic tools.


A Framework for Automated Digital Forensic Reporting, Lt. Paul Farrell, Master's Thesis, Naval Postgraduate School, Monterey, CA, March 2009
Forensic analysis is the science of finding, examining and analyzing evidence in support of law enforcement, regulatory compliance or information gathering. Today, almost all digital forensic analysis is done by humans, requiring dedicated training and consuming man-hours at a considerable rate. As storage sizes increase and digital forensics gain importance in investigations, the backlog of media requiring human analysis has increased as well. This thesis tests today's top-of-the-line commercial and open source forensic tools with the analysis of a purpose-built Windows XP computer system containing two users that engaged in email, chat and web browsing. It presents the results of a pilot user study of the PyFlag forensic tool. Finally, it presents a technique to use software to do a preliminary analysis on media and provide a human readable report to the examiner. The goal of the technique is to perform rapid triaging of media and allow the human examiner to better prioritize man hours towards media with high return on investment.


The impact of full disk encryption on digital forensics, ACM SIGOPS Operating Systems Review archive, Volume 42 , Issue 3 (April 2008) , Pages 93-98
The integration of strong encryption into operating systems is creating challenges for forensic examiners, potentially preventing us from recovering any digital evidence from a computer. Because strong encryption cannot be circumvented without a key or passphrase, forensic examiners may not be able to access data after a computer is shut down, and must decide whether to perform a live forensic acquisition. In addition, with encryption becoming integrated into the operating system, in some cases, virtualization is the most effective approach to performing a forensic examination of a system with FDE. This paper presents the evolution of full disk encryption (FDE) and its impact on digital forensics. Furthermore, by demonstrating how full disk encryption has been dealt with in past investigations, this paper provides forensics examiners with practical techniques for recovering evidence that would otherwise be inaccessible.


Forensic Investigation of the Nintendo Wii: A First Glance, Dr. Benjamin Turnbull, SMALL SCALE DIGITAL DEVICE FORENSICS JOURNAL, VOL. 2, NO. 1, JUNE 2008 ISSN# 1941-6164
The closed nature of the Wii makes it a challenging game console for forensic analysis. This article takes a first look at the platform, discussing the various places where forensically interesting information may be hidden. It also references an interesting report describing how the ability of Wii Remotes to carry avatars from one console to another documented a woman's affair while her husband was serving in Iraq.


Data Acquisition from Cell Phone using Logical Approach, Keonwoo Kim, Dowon Hong, Kyoil Chung, and Jae-Cheol Ryou, PROCEEDINGS OF WORLD ACADEMY OF SCIENCE, ENGINEERING AND TECHNOLOGY VOLUME 26 DECEMBER 2007 ISSN 1307-6884
This article discusses three approaches for acquiring data from cell phones: physically removing the flash RAM chips and reading them directly; reading the data out through the JTAG interface; and running software on the phone to extract the files at a logical level. The authors have built a logical extraction system and are working on a system based on JTAG.


Semi-Supervised Named Entity Recognition: Learning to Recognize 100 Entity Types with Little Supervision, David Nadeau, PhD Thesis, University of Ottawa, 2007.
Named Entity Recognition is the process of analyzing text documents and automatically identifying the who, what, where, and when. David Nadeau's thesis presents a novel approach that bootstraps a named entity recognizer using semi-structured documents on the web. Few forensic tools use NER today, but that may well change in the future. What makes this thesis especially interesting to read is that it also includes a history of roughly the last 20 years of the field. Highly recommended.


Advanced JPEG carving, Michael I. Cohen, Proceedings of the 1st international conference on Forensic applications and techniques in telecommunications, information, and multimedia and workshop
Michael I. Cohen presents a fully automated carver which can carve fragmented JPEGs using typical fragmentation patterns. (Cohen argues that the DFRWS carving challenges do not represent typical fragmentation patterns.)
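Cohen's contribution is handling fragmented files; the baseline he improves on is the classic contiguous header-to-footer carver, which can be sketched in a few lines. This is a simplified illustration, not Cohen's carver (among other things, compressed JPEG data can contain spurious byte pairs that look like an end-of-image marker, which this sketch ignores):

```python
SOI = b"\xff\xd8\xff"  # JPEG start-of-image marker plus the next marker's 0xFF
EOI = b"\xff\xd9"      # JPEG end-of-image marker

def carve_contiguous_jpegs(image: bytes):
    """Naive contiguous carver: pair each SOI with the next EOI.
    Fragmented JPEGs, the case Cohen's carver targets, defeat this."""
    carved = []
    start = image.find(SOI)
    while start != -1:
        end = image.find(EOI, start + len(SOI))
        if end == -1:
            break  # header with no footer: likely a fragment or false positive
        carved.append(image[start:end + len(EOI)])
        start = image.find(SOI, end + len(EOI))
    return carved
```

On a disk where files are stored contiguously this recovers intact images; any fragmentation between header and footer silently corrupts the carved output, which is exactly the failure mode the paper addresses.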


Threats to Privacy in the Forensic Analysis of Database Systems, Patrick Stahlberg, Gerome Miklau, and Brian Neil Levine, Proceedings of the 2007 ACM SIGMOD International Conference on Management of Data, Beijing, China.
This paper looks at residual data left behind in databases after DELETE, UPDATE, and VACUUM operations. The authors show that residual data is a real issue in databases, and that it is fairly easy to modify a database so that no residual data is left behind. MySQL with MyISAM tables has clean delete, but InnoDB does not. Very much worth reading.


Lest We Remember: Cold Boot Attacks on Encryption Keys
J. Alex Halderman, Princeton University; Seth D. Schoen, Electronic Frontier Foundation; Nadia Heninger and William Clarkson, Princeton University; William Paul, Wind River Systems; Joseph A. Calandrino and Ariel J. Feldman, Princeton University; Jacob Appelbaum; Edward W. Felten, Princeton University
USENIX Security '08 Refereed Paper
Awarded Best Student Paper
Memory analysis is of increasing interest in forensic research, both because some new malware resides only in memory, and because memory analysis is frequently the only way for analysts to get the keys that are used to protect cryptographic file systems. In this paper the authors show that cryptographic keys in memory are vulnerable to exploitation after the computer is turned off. The authors show that the contents of dynamic RAM are retained for seconds, and sometimes minutes, after power is removed. By chilling the memory, the data can be retained as long as necessary. And while most laptops wipe their memory when they reboot, the authors show that chilled memory can be moved from a laptop that wipes to one that does not. Finally, the authors show that it is possible to find the cryptographic keys in memory and correct random bit errors by using the AES key schedule as an error-correcting code. The authors demonstrate an attack USB stick which reboots a computer protected with BitLocker, finds the cryptographic keys, and then allows access to the cleartext information on the disk.
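The key-finding step rests on the fact that the 176 bytes of an AES-128 key schedule are highly redundant: every round key is determined by the previous one, so a memory image can be scanned for windows that satisfy the recurrence. The sketch below (function names invented; it does exact matching only, without the paper's bit-error correction) derives the S-box from first principles, expands candidate keys, and scans a buffer:

```python
def gf_mul(a: int, b: int) -> int:
    """Multiply in GF(2^8) modulo the AES polynomial x^8+x^4+x^3+x+1."""
    p = 0
    while b:
        if b & 1:
            p ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return p

# Build the AES S-box: multiplicative inverse followed by the affine map.
_INV = [0] * 256
for x in range(1, 256):
    for y in range(1, 256):
        if gf_mul(x, y) == 1:
            _INV[x] = y
            break

def _affine(b: int) -> int:
    r = 0
    for i in range(8):
        bit = ((b >> i) ^ (b >> ((i + 4) % 8)) ^ (b >> ((i + 5) % 8))
               ^ (b >> ((i + 6) % 8)) ^ (b >> ((i + 7) % 8)) ^ (0x63 >> i)) & 1
        r |= bit << i
    return r

SBOX = [_affine(_INV[x]) for x in range(256)]
RCON = [0x01, 0x02, 0x04, 0x08, 0x10, 0x20, 0x40, 0x80, 0x1B, 0x36]

def expand_key(key: bytes) -> bytes:
    """AES-128 key schedule: 16 key bytes -> 176 bytes of round keys."""
    w = [list(key[i:i + 4]) for i in range(0, 16, 4)]
    for i in range(4, 44):
        t = list(w[i - 1])
        if i % 4 == 0:
            t = [SBOX[b] for b in t[1:] + t[:1]]  # RotWord then SubWord
            t[0] ^= RCON[i // 4 - 1]
        w.append([a ^ b for a, b in zip(w[i - 4], t)])
    return bytes(b for word in w for b in word)

def find_aes128_keys(memory: bytes):
    """Report every offset whose next 176 bytes form a valid key schedule."""
    hits = []
    for off in range(len(memory) - 175):
        if memory[off:off + 176] == expand_key(memory[off:off + 16]):
            hits.append((off, memory[off:off + 16]))
    return hits
```

The paper's real contribution goes one step further: because the schedule is so redundant, the test can be made to tolerate the random bit flips that decaying DRAM introduces, recovering keys even from partially degraded images.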


The Symposium on Usable Privacy and Security (2008) concluded this past week in Pittsburgh, PA. One paper that appeared which is interesting to the network forensics crowd is The Challenges of Using an Intrusion Detection System: Is It Worth the Effort?, by Rodrigo Werlinger, Kirstie Hawkey, Kasia Muldner, Pooya Jaferian and Konstantin Beznosov. slides
In this article, the authors conducted interviews with 9 IT security practitioners who have worked with IDSs and performed ethnographic observations within an organization that was deploying a new IDS. They found that security practitioners were heavily divided on the value of the IDS, and learned that an IDS really only generates value if the network is well understood before the IDS is deployed.


The International Journal of Digital Evidence is one of two publications by the Electronic Crime Institute (ECI) at Utica College. Current and previous issues are available online.
The current Fall 2007 issue has an interesting article, Mobile Phone Forensics Tool Testing: A Database Driven Approach, by Baggili, Mislan, and Rogers at Purdue University. Given that phones are increasingly a primary source of forensic information in many cases, we need to be sure that the tools used for forensic analysis present data that is accurate and repeatable. Unfortunately, they frequently are not, because there are so many different kinds of phones on the market and the forensic tools lag far behind it.

Ibrahim M. Baggili, Richard Mislan, and Marcus Rogers, Mobile Phone Forensics Tool Testing: A Database Driven Approach, International Journal of Digital Evidence 6, 2007.



Detecting File Fragmentation Point Using Sequential Hypothesis Testing, Anandabrata Pal, Taha Sencar, and Nasir Memon, DFRWS 2008.

This DFRWS 2008 article presents an improved approach for carving fragmented JPEGs using sequential hypothesis testing. According to the authors, "The technique begins with a header block identifying the start of a file and then attempts to validate via SHT each subsequent block following the header block. The fragmentation point is identified when SHT identifies a block as not belonging to the file. By utilizing this technique, we are able to correctly and efficiently recover JPEG images from the DFRWS 2006 [1] and 2007 [2] test sets even in the presence of tens of thousands of blocks and files fragmented into 3 or more parts. The bifragment gap carving technique enhanced with SHT allows us to improve the performance result of DFRWS 2006 challenge test-sets, although the technique cannot be used for DFRWS 2007. We then show how Parallel Unique Path enhanced with SHT is able to recover all fragmented JPEGs from DFRWS 2006 and all recoverable JPEGs from 2007 challenge test-sets. As far as we are aware, no other automated technique can recover multi-fragmented JPEGs from the DFRWS 2007 test set."


An Empirical Analysis of Disk Sector Hashes for Data Carving, Yoginder Singh Dandass, Nathan Joseph Necaise, and Sherry Reede Thomas, Journal of Digital Forensic Practice 2:95-106, 2008.

Dandass et al. analyzed 528 million sectors from 433,630 unique files. They computed the CRC32, CRC64, MD5, and SHA-1 of each sector. Not surprisingly, they found that the MD5 and SHA-1 hashes of the sectors differed whenever the sectors differed. They found 94 CRC64 collisions and 30 million CRC32 collisions. The conclusion is that, if you are searching for a single sector or building a database of single-sector hashes, you are better off building a database of CRC64s: they are easier to store and dramatically faster to calculate than the traditional hash functions, and they are nearly as accurate.
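The two-stage lookup this argues for is easy to sketch. Python's standard library has no CRC64, so CRC32 stands in for the cheap first-stage hash below (the paper's point about CRC64's far lower collision rate still applies); a cryptographic hash then confirms any candidates that survive the CRC filter. Function names are illustrative:

```python
import hashlib
import zlib

SECTOR = 512

def sector_hashes(image: bytes):
    """Index each 512-byte sector two ways: a cheap CRC32 as the lookup
    key, plus an MD5 digest stored to confirm candidate matches."""
    table = {}
    for off in range(0, len(image) - SECTOR + 1, SECTOR):
        sector = image[off:off + SECTOR]
        entry = (off, hashlib.md5(sector).digest())
        table.setdefault(zlib.crc32(sector), []).append(entry)
    return table

def find_sector(table, sector: bytes):
    """Two-stage lookup: CRC32 narrows the candidates, MD5 confirms."""
    digest = hashlib.md5(sector).digest()
    return [off for off, d in table.get(zlib.crc32(sector), []) if d == digest]
```

The design mirrors the paper's conclusion: pay the cost of a strong hash only on the handful of sectors the fast checksum cannot rule out.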