File Carving

'''Carving''' is the practice of searching an input for files or other kinds of objects based on content, rather than on metadata. File carving is a powerful tool for recovering files and fragments of files when directory entries are corrupt or missing, as may be the case with old files that have been deleted or when performing an analysis on damaged media. Memory carving is a useful tool for analyzing physical and virtual memory dumps when the memory structures are unknown or have been overwritten.

=File Carving=
Most file carvers operate by looking for file headers and/or footers, and then "carving out" the blocks between these two boundaries. [[Semantic Carving]] performs carving based on an analysis of the contents of the proposed files.
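To make the header/footer approach concrete, here is a minimal carving sketch in Python. It illustrates the technique only and is not any particular tool's algorithm: the image name is a placeholder, the whole image is read into memory for brevity, and a real carver would validate the carved bytes rather than trust the markers alone.

<pre>
JPEG_HEADER = b"\xff\xd8\xff"   # SOI marker plus the first byte of the next segment
JPEG_FOOTER = b"\xff\xd9"       # EOI marker
MAX_SIZE = 10 * 1024 * 1024     # give up after 10 MiB without a footer

def carve_jpegs(data):
    """Yield (offset, blob) pairs for candidate JPEGs found in data."""
    pos = 0
    while True:
        start = data.find(JPEG_HEADER, pos)
        if start == -1:
            return
        end = data.find(JPEG_FOOTER, start, start + MAX_SIZE)
        if end == -1:
            pos = start + 1          # no footer in range; skip this header
            continue
        yield start, data[start:end + 2]
        pos = end + 2

with open("image.dd", "rb") as f:    # placeholder image name
    data = f.read()
for offset, blob in carve_jpegs(data):
    with open("carved_%08x.jpg" % offset, "wb") as out:
        out.write(blob)
</pre>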
File carving should be done on a [[disk image]], rather than on the original disk.

File carving tools are listed on the [[Tools:Data_Recovery]] wiki page.

Many carving programs have an option to only look for headers at or near sector boundaries. However, searching the entire input can find files that have been embedded into other files, such as [[JPEG]]s embedded into [[Microsoft]] [[DOC|Word documents]]. This may be considered an advantage or a disadvantage, depending on the circumstances.

Today most file carving programs will only recover files that are contiguous on the media.

== File Carving Taxonomy ==
[[Simson Garfinkel]] and [[Joachim Metz]] have proposed the following file carving taxonomy:
;Carving
:General term for extracting data (files) out of undifferentiated blocks (raw data), like "carving" a sculpture out of soapstone.

;Block Based Carving
:Any carving method (algorithm) that analyzes the input on a block-by-block basis to determine if a block is part of a possible output file. This method assumes that each block can only be part of a single file (or embedded file).

;Characteristic Based Carving
:Any carving method (algorithm) that analyzes the input on the basis of characteristics (for example, entropy) to determine if the input is part of a possible output file. (A minimal entropy sketch follows this taxonomy.)

;Header/Footer Carving
:A method for carving files out of raw data using a distinct header (start of file marker) and footer (end of file marker).

;Header/Maximum (file) size Carving
:A method for carving files out of raw data using a distinct header (start of file marker) and a maximum (file) size. This approach works because many file formats (e.g. JPEG, MP3) do not care if additional junk is appended to the end of a valid file.

;Header/Embedded Length Carving
:A method for carving files out of raw data using a distinct header and a file length (size) which is embedded in the file format.

;File structure based Carving
:A method for carving files out of raw data using a certain level of knowledge of the internal structure of file types. Garfinkel called this approach "Semantic Carving" in his DFRWS2006 carving challenge submission, while Metz and Mora called the approach "Deep Carving."

;Semantic Carving
:A method for carving files based on a linguistic analysis of the file's content. For example, a semantic carver might conclude that six blocks of French in the middle of a long HTML file written in English are a fragment left from a previously allocated file, and not part of the English-language HTML file.

;Carving with Validation
:A method for carving files out of raw data where the carved files are validated using a file type specific validator.

;Fragment Recovery Carving
:A carving method in which two or more fragments are reassembled to form the original file or object. Garfinkel previously called this approach "Split Carving."
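As one illustration of characteristic based carving, the sketch below (plain Python 3.8+, not taken from any existing tool) computes the Shannon entropy of each 512-byte block of an image and flags blocks above an arbitrary threshold as likely compressed or encrypted content. A real carver would combine this with other characteristics rather than rely on a single threshold.

<pre>
import math

def shannon_entropy(block):
    """Shannon entropy of a block, in bits per byte (0.0 to 8.0)."""
    if not block:
        return 0.0
    counts = [0] * 256
    for byte in block:
        counts[byte] += 1
    entropy = 0.0
    for count in counts:
        if count:
            p = count / len(block)
            entropy -= p * math.log2(p)
    return entropy

with open("image.dd", "rb") as f:    # placeholder image name
    block_no = 0
    while block := f.read(512):
        e = shannon_entropy(block)
        if e > 7.5:    # arbitrary threshold: likely compressed/encrypted data
            print("block %d: entropy %.2f" % (block_no, e))
        block_no += 1
</pre>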
  
== File Carving challenges and test images ==

[http://www.dfrws.org/2006/challenge/ File Carving Challenge] - [[Digital Forensic Research Workshop|DFRWS]] 2006

[http://www.dfrws.org/2007/challenge/ File Carving Challenge] - [[Digital Forensic Research Workshop|DFRWS]] 2007

[http://dftt.sourceforge.net/test6/index.html FAT Undelete Test #1] - Digital Forensics Tool Testing Image (dftt #6)

[http://dftt.sourceforge.net/test7/index.html NTFS Undelete (and leap year) Test #1] - Digital Forensics Tool Testing Image (dftt #7)

[http://dftt.sourceforge.net/test11/index.html Basic Data Carving Test - fat32], Nick Mikus - Digital Forensics Tool Testing Image (dftt #11)

[http://dftt.sourceforge.net/test12/index.html Basic Data Carving Test - ext2], Nick Mikus - Digital Forensics Tool Testing Image (dftt #12)

== See also ==
* [[Tools:Data_Recovery#Carving | File Carving Tools]]
* [[File Carving Bibliography]]

=Memory Carving=
Carver 2.0 Planning Page

(Revision as of 09:20, 31 October 2008)

This page is for planning Carver 2.0.

Please do not delete text (ideas) here. Use something like this:

<s>bad idea</s>
:: good idea

This will look like:

bad idea (struck through)
    good idea

License

BSD-3.

OS

Linux/FreeBSD/MacOS

    Shouldn't this just match what the underlying afflib & sleuthkit cover? RB
        Yes, but you need to test and validate on each. Question: Do we want to support Windows? Simsong 21:09, 30 October 2008 (UTC)
        Joachim: I think we would be wise to design with Windows support from the start; this will improve platform independence.
            Agreed; I would even settle at first for being able to run against Cygwin. Note that I don't even own or use a copy of Windows, but the vast majority of forensic investigators do. RB 14:01, 31 October 2008 (UTC)

Requirements

  • Joachim: A name for the tooling; I propose "coldcut".
    How about 'butcher'?  ;) RB 14:20, 31 October 2008 (UTC)

Joachim: Could we do a MoSCoW evaluation of these?

  • AFF and EWF file images supported from scratch. (Joachim: I would like to have raw/split raw and device access as well.)
    If we base our image I/O on afflib, we get all three with one interface. RB
  • Joachim: volume/partition aware layer (what about carving unpartitioned space?)
  • File system aware layer.
    • By default, files are not carved. (clarify: only identified? RB; I guess that it operates like Selective file dumper .FUF 07:00, 29 October 2008 (UTC))
  • Plug-in architecture for identification/validation (see the validator sketch after this list).
    • Joachim: support for multiple types of validators
      • dedicated validator
      • validator based on a file library (i.e. we could specify/implement a file structure for these)
      • configuration based validator (can handle config files, like Revit07's, to enter different file formats used by the carver)
  • Ship with validators for:

Joachim: I think we should distinguish between file format validators and content validators.

    • JPEG
    • PNG
    • GIF
    • MSOLE
    • ZIP
    • TAR (gz/bz2)

Joachim: For a production carver we need at least the following formats:

    • Graphical images
      • JPEG (the 3 different types with JFIF/EXIF support)
      • PNG
      • GIF
      • BMP
      • TIFF
    • Office documents
      • OLE2 (Word/Excel content support)
      • PDF
      • Open Office/Office 2007 (ZIP+XML)
    • Archive files
      • ZIP
      • 7z
      • gzip
      • bzip2
      • tar
      • RAR
    • E-mail files
      • PFF (PST/OST)
      • MBOX (text-based format, base64 content support)
    • Audio/Video files
      • MPEG
      • MP2/MP3
      • AVI
      • ASF/WMV
      • QuickTime
      • MKV
    • Printer spool files
      • EMF (if I remember correctly)
    • Internet history files
      • index.dat
      • Firefox (SQLite 3)
    • Other files
      • thumbs.db
      • pagefile?
  • Simple fragment recovery carving using gap carving (see the gap carving sketch at the end of this page).
    • Joachim: have a hook for more advanced fragment recovery?
  • Recovery of individual ZIP sections and JPEG icons that are not sector aligned.
    • Joachim: I would propose generic fragment detection and recovery.
  • Autonomous operation (some mode of operation should be completely non-interactive, requiring no human intervention to complete RB)
    • Joachim: as much as possible, but allow it to be overridden by the user.
  • Tested on 500GB-sized images. Should be able to carve a 500GB image in roughly 50% longer than it takes to read the image (for example, at a sustained read rate of 100 MB/s a 500GB image takes roughly 83 minutes to read, so carving should finish within about 125 minutes).
    • Perhaps allocate a percentage budget per-validator (i.e. each validator adds N% to the carving time) RB
    • Joachim have multiple carving phases for precision/speed trade off?
  • Parallelizable
    • Joachim tunable for different architectures
  • Configuration:
    • Capability to parse some existing carvers' configuration files, either on-the-fly or as a one-way converter.
    • Disengage the internal configuration structure from the configuration files; create parsers that present the expected structure.
    • Joachim: The validator should deal with the file structure; the carving algorithm should not know anything about the file structure (as in the revit07 design).
    • Either extend the Scalpel/Foremost syntaxes for the extended features or use a tertiary syntax (Joachim: I would prefer a derivative of the revit07 configuration syntax, which has already encountered some of the problems of defining file structure in a configuration file).
  • Can output an audit.txt file.
  • Joachim: Can output a database with offset analysis values, i.e. for visualization tooling.
  • Joachim: Can output a debug log for debugging the algorithm/validation.
  • Easy integration into ascription software.
    • Joachim: I'm not a native speaker; what do you mean by "ascription software"?
    I think this was another non-native speaker requesting easy scriptability. RB 14:20, 31 October 2008 (UTC)
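One possible shape for the plug-in validator requirement above, sketched in Python; the interface and all names here are invented for discussion, and a real design would also have to answer the state/revert questions raised further down this page.

<pre>
from abc import ABC, abstractmethod

class Validator(ABC):
    """One validator per file format, registered with the carver."""

    #: header bytes that make this validator a candidate for a blob
    magic = b""

    @abstractmethod
    def validate(self, blob):
        """Return True if blob plausibly forms a complete file of this type."""

class JpegValidator(Validator):
    magic = b"\xff\xd8\xff"

    def validate(self, blob):
        # minimal structural check: SOI marker at the start, EOI at the end
        return blob.startswith(self.magic) and blob.endswith(b"\xff\xd9")

VALIDATORS = [JpegValidator()]

def candidates_for(blob):
    """Select candidate validators by magic; the carver then calls validate()."""
    return [v for v in VALIDATORS if blob.startswith(v.magic)]
</pre>

Dedicated validators, file-library-backed validators, and configuration based validators could all sit behind the same interface.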

Ideas

  • Use as much of TSK as possible. Don't carry your own FS implementation the way photorec does.
    • Joachim: using TSK as much as possible would not allow adding your own file system support (i.e. mobile phones, memory structures, cap files).

I would propose wrapping TSK, using it as much as possible, but allowing our own FS implementations to be integrated.

  • Extracting/carving data from Thumbs.db? I've used foremost for it with some success. Vinetto has some critical bugs :( .FUF 19:18, 28 October 2008 (UTC)
  • Carving data structures. For example, extract all TCP headers from an image by defining the TCP header structure and constraints on some fields (e.g. source port > 1024, dest port = 80). This will extract all data matching the pattern and write the remaining fields to a file. Another example is carving INFO2 structures and URL activity records from index.dat. .FUF 20:51, 28 October 2008 (UTC)
    • This has the opportunity to be extended to the concept of "point at blob FOO and interpret it as BAR".

.FUF added: The main idea is to allow users to define structures, for example (in Pascal-like form):

Field1: Byte = 123;
SomeTextLength: DWORD;
SomeText: string[SomeTextLength];
Field4: Char = 'r';
...

This will produce something like this:

Field1 = 123
SomeTextLength = 5
SomeText = 'abcd1'
Field4 = 'r'

(In text or raw forms.)

Opinions?

Opinion: Simple pattern identification like that may not suffice; I think Simson's original intent was not only to identify but to allow for validation routines (plugins, as the original wording was). As such, the format syntax would need to implement a large chunk of some programming language in order to be sufficiently flexible. RB
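To illustrate RB's point, here is roughly what interpreting the example definition above amounts to, written out by hand in Python; DWORD is assumed to be a little-endian 32-bit unsigned integer. A configuration based validator would have to generate logic like this from the definition text.

<pre>
import struct

def parse_structure(buf, offset=0):
    """Parse one instance of the example structure; return a dict or None."""
    try:
        field1 = buf[offset]
        if field1 != 123:                                # Field1: Byte = 123
            return None
        (length,) = struct.unpack_from("<I", buf, offset + 1)  # SomeTextLength: DWORD
        text = buf[offset + 5 : offset + 5 + length]     # SomeText: string[SomeTextLength]
        if len(text) != length:
            return None
        if buf[offset + 5 + length : offset + 6 + length] != b"r":  # Field4: Char = 'r'
            return None
    except (IndexError, struct.error):
        return None
    return {"Field1": field1, "SomeTextLength": length,
            "SomeText": text.decode("ascii", "replace"), "Field4": "r"}
</pre>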

File System Awareness

Background: Why be File System Aware?

Advantages of being FS aware:

  • You can pick up sector allocation sizes. (Joachim: do you mean file system block sizes?)
  • Some file systems may store things off sector boundaries (ReiserFS with tail packing).
  • Increasingly, file systems have compression (NTFS compression).
  • Carve just the sectors that are not in allocated files (see the sketch below).
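A sketch of the last point, using The Sleuth Kit's blkls tool, which writes the unallocated blocks of a file system to standard output; the file names are placeholders.

<pre>
import subprocess

def extract_unallocated(image, out_path="unalloc.bin"):
    """Run blkls to dump the file system's unallocated blocks to out_path."""
    with open(out_path, "wb") as out:
        subprocess.run(["blkls", image], stdout=out, check=True)
    return out_path

unalloc = extract_unallocated("image.dd")
# ...then carve unalloc instead of the whole image.
</pre>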

Tasks that would be required

Discussion

    As noted above, TSK should be utilized as much as possible, particularly the filesystem-aware portion. If we want to identify filesystems outside of its supported set, it would be more worth our time to work on implementing them there than in the carver itself. RB
        I guess this tool operates like Selective file dumper and can recover files in both ways (or not?). Recovering files by carving can succeed in situations where sleuthkit does nothing (e.g. a file on NTFS was deleted using ntfs-3g, or the filesystem was destroyed or is simply unknown). And we should build the list of filesystems supported by the carver, not by TSK. .FUF 07:08, 29 October 2008 (UTC)
    This tool is still in the early planning stages (requirements discovery), hence few operational details (like precise modes of operation) have been fleshed out - those will and should come later. The justification for strictly using TSK for the filesystem-sensitive approach is simple: TSK has good filesystem APIs, and it would be foolish to create yet another standalone, incompatible implementation of filesystem(foo) when time would be better spent improving those in TSK, aiding other methods of analysis as well. This is the same reason individuals that have implemented several other carvers are participating: de-duplication of effort. RB

Joachim: I would like the carver (recovery tool) to also do recovery using file allocation data, or remainders of file allocation data.

Joachim: I would go as far as to ask you all to look beyond the carver as a tool, and to look at the carver as part of the forensic investigation process. In my eyes, certain information needed/acquired by the carver could also be very useful investigative information, i.e. which parts of a hard disk contain empty sectors.

Joachim: I'm missing a part on the page about the carving challenges (scenarios):

  • normal file (file structure, loose text-based structure (more a content structure?))
  • fragmented file (the file exists in its entirety)
  • a file fragment (the file does not exist in its entirety)
  • intertwined file
  • encapsulated file (MPEG/network capture)
  • embedded file (JPEG thumbnail)

Joachim: I'm missing a part on the page about the carving algorithm:

  • should we allow for multiple runs?
  • should we allow for multiple algorithms?
  • does the algorithm pass data blocks to the validators?
  • does a validator need to maintain state?
  • does a validator need to revert state?
  • do we use the assumption that a data block can be used by a single file (with the exception of embedded/encapsulated files)?

Joachim: I'm missing a part on the page about supportive tooling:

  • validator (definitions) tester (detest in revit07)
  • tool to make configuration based definitions
  • post-carving validation
  • the carver needs to provide support for FUSE mounting of carved files (carvfs)

Validator Construction

Options:

  • Write validators in C/C++
    • Joachim: you mean dedicated validators
  • Have a scripting language for writing them (Python? Perl? our own?) (see the plug-in loading sketch below)
    • Joachim: use easy-to-embed programming languages, i.e. Python or Lua
  • Use existing programs (libjpeg?) as plug-in validators?
    • Joachim: define a file structure API for this
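A sketch of what the scripting-language option could look like if validators were themselves Python modules loaded from a plug-in directory; the directory name and the convention of a module-level Validator class are invented for illustration.

<pre>
import importlib.util
import pathlib

def load_validators(plugin_dir="validators"):
    """Import every .py file in plugin_dir; each must define a Validator class."""
    loaded = []
    for path in pathlib.Path(plugin_dir).glob("*.py"):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        loaded.append(module.Validator())
    return loaded
</pre>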

Existing Code that we have

Joachim: Carvers

  • DFRWS2006/2007 carving challenge results
  • DFRWS2008 paper on carving
  • photorec
  • revit06 and revit07
  • s3/scarve

Possible file structure validator libraries

  • diverse existing file support libraries
  • libole2 (in-house experimental code for OLE2 support)
  • libpff (alpha release for PFF (PST/OST) file support)

Input support

  • AFF
  • EWF
  • TSK device & raw & split raw

Volume/Partition support

  • disktype
  • testdisk
  • TSK

File system support

  • TSK
  • photorec FS code
  • implementations of FS in Linux/BSD

Content support

Implementation Timeline

  1. gather the available resources/ideas/wishes/needs etc. (I guess we're in this phase)
  2. start discussing a high level design (in terms of algorithm, facilities, information needed)
    1. input formats facility
    2. partition/volume facility
    3. file system facility
    4. file format facility
    5. content facility
    6. how to deal with fragment detection (do the validators allow for fragment detection?)
    7. how to deal with recombination of fragments
    8. do we want multiple carving phases in light of speed/precision tradeoffs
  3. start detailing parts of the design
    1. Discuss options for a grammar driven validator?
    2. Hard-coded plug-ins?
    3. Which existing code can we use?
  4. start building/assembling parts of the tooling for a prototype
    1. Implement simple file carving with validation.
    2. Implement gap carving (sketched below)
  5. Initial Release
  6. Implement the threaded carving that .FUF is describing above.
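Finally, a naive sketch of the gap carving mentioned in the requirements and in the timeline above: if the straight header-to-footer run does not validate, retry with one sector-aligned run of bytes (the "gap") removed. The validate callback is a per-format check such as the validator sketched under Requirements; real implementations would search the space of possible gaps far more cleverly than this brute-force loop.

<pre>
SECTOR = 512

def gap_carve(data, start, end, validate, max_gap_sectors=64):
    """Return carved bytes for data[start:end], or None if nothing validates."""
    candidate = data[start:end]
    if validate(candidate):
        return candidate
    # brute force: try every sector-aligned gap position and gap length
    for gap_start in range(start + SECTOR, end - SECTOR, SECTOR):
        for gap_len in range(SECTOR, max_gap_sectors * SECTOR + 1, SECTOR):
            if gap_start + gap_len >= end:
                break
            candidate = data[start:gap_start] + data[gap_start + gap_len:end]
            if validate(candidate):
                return candidate
    return None
</pre>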