Difference between pages "Data Compass" and "Log2timeline"

From Forensics Wiki
== Overview ==
Data Compass is a hardware and software data recovery tool produced by [[SalvationDATA]].
  
In SalvationDATA's [[3+1 Data Recovery]] process, the stage that follows drive restoration is data recovery, which is performed with Data Compass.
  
After the failed drive has been brought back to life, the next steps are to recover the damaged file system with file system recovery software and, ideally, to image the disk first so that all further work is done on an accurate, stable copy.
  
Unfortunately, traditional disk imaging tools and methods are designed for healthy HDDs, not for patient HDDs that are unstable or inaccessible because of media defects or an unstable head assembly, which are common challenges at this stage in practice. Worse, the time such tools take, and their ordinary user-level repeated-read access to the media, risk further damage to the platters and heads, which can make the data irretrievable.
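For comparison with such ordinary imaging methods, the sketch below shows in Perl the general software-side idea of a more careful pass over a failing drive: read block by block, retry an unreadable block only a limited number of times, then pad it with zeros and log its offset rather than hammering the damaged area (the approach popularized by tools such as GNU ddrescue). This is only an illustration of the principle, not how Data Compass works internally; the device path, block size and retry count are placeholders.

<pre>
#!/usr/bin/perl
# Generic sketch of a defensive block-by-block imaging pass over an
# unstable drive: limit re-reads of bad areas, pad unreadable blocks
# with zeros, and record their offsets in a log.
use strict;
use warnings;

my $src  = '/dev/sdb';     # placeholder source device
my $dst  = 'image.dd';     # destination image file
my $bs   = 4096;           # block size in bytes
my $maxr = 2;              # extra read attempts per block after the first

open my $in,  '<:raw', $src or die "open $src: $!\n";
open my $out, '>:raw', $dst or die "open $dst: $!\n";
open my $log, '>', 'skipped_blocks.log' or die "open log: $!\n";

my $offset = 0;
while (1) {
    my ($buf, $n) = ('', undef);
    for (0 .. $maxr) {
        sysseek $in, $offset, 0;
        $n = sysread $in, $buf, $bs;   # undef on I/O error, 0 at end of device
        last if defined $n;
    }
    if (!defined $n) {                 # still unreadable after the retries
        print {$log} "unreadable block at offset $offset\n";
        $buf = "\0" x $bs;
        $n   = $bs;
    }
    last if $n == 0;                   # end of device reached
    syswrite $out, $buf, $n;
    $offset += $n;
}
close $_ for $in, $out, $log;
</pre>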
  
Data Compass offers an alternative: the disk probing hardware in the suite works around disk-level problems such as multiple bad sectors, damaged surfaces, a malfunctioning head assembly, or corrupted servo information, while file recovery itself can be performed with the bundled software or with any familiar third-party tool (R-Studio, WinHex, and so on). Through Data Compass, a problem drive is presented as an intact drive, ready for file recovery attempts.
  
== What can Data Compass do? ==
  
Data recovery from physically damaged HDDs is what Data Compass is designed for:
  
* Data recovery from HDDs with many bad sectors, which appear because of platter surface scratches or a malfunctioning or unstable magnetic head assembly (MHA).
* Data recovery from HDDs that produce "clicking" sounds, which may be caused by corrupted sector servo labels or an MHA malfunction. If some heads or surfaces are damaged, it is possible (before a replacement MHA is installed) to copy data using the remaining good surfaces or heads.
* Logical analysis tools for FAT and NTFS file systems in the software suite allow data recovery in cases where the drive is functional and only the logical data structures are corrupted.
* When used with malfunctioning drives, Data Compass often allows selective extraction of the data the customer actually needs without reading the whole drive ("recover data by file" without creating a complete disk image), saving considerable time. In cases where the malfunction causes ongoing self-damage (such as platter scratches or an unstable MHA), this may be the only way to accomplish the task. With the ShadowDisk technology, users do not need to worry about further drive degradation.
 
== Related links ==
 
[http://www.salvationdata.com Official website]

==log2timeline==

log2timeline is designed as a framework for artifact timeline creation and analysis. Its main purpose is to provide forensic investigators with a single tool that can parse various log files and artifacts found on suspect systems (and on supporting systems, such as network equipment) and produce a body file that can be turned into a timeline with tools such as mactime from TSK.
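As a concrete illustration of that intermediate format, the following minimal Perl sketch (not part of log2timeline itself) prints one line in the TSK 3.x body-file layout, MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime, for an arbitrary file, using the file's own stat() timestamps; the mode is written as plain octal rather than the fls-style string, and the MD5 and crtime fields are simply zeroed.

<pre>
#!/usr/bin/perl
# Emit one TSK 3.x body-file line for the file named on the command line.
# Field layout: MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
use strict;
use warnings;

my $file = shift @ARGV or die "usage: $0 <file>\n";
my @st   = stat $file  or die "stat $file: $!\n";

# stat() indices: 1 = inode, 2 = mode, 4 = uid, 5 = gid, 7 = size,
# 8 = atime, 9 = mtime, 10 = ctime. MD5 and crtime are zeroed here;
# a real input module fills in whatever the parsed artifact provides.
print join('|',
    0, $file, $st[1], sprintf('%o', $st[2]),
    $st[4], $st[5], $st[7],
    $st[8], $st[9], $st[10], 0
), "\n";
</pre>

A body file built this way (or, more usefully, by log2timeline) is typically rendered with something like <code>mactime -b body.txt -d > timeline.csv</code>.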

The tool is written in Perl for Linux but has also been tested on Mac OS X (10.5.7 and 10.5.8). Parts of it should work natively on Windows as well (with ActiveState Perl installed).

==Description==

log2timeline takes a log file (or a directory) and parses it to produce a body file that can be imported into other tools for timeline analysis. The tool takes a modular approach to both its input and its output. The default behavior of the current version is to export the timeline in a body format readable by [http://wiki.sleuthkit.org/index.php?title=Body_file mactime] from TSK (The Sleuth Kit), although this can easily be changed. log2timeline is built as a series of scripts: the log2timeline script itself is the front-end, which uses other scripts (called input modules) to actually parse the log files. The tool is built to be easily extended by anyone who wants to create a new module.
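The sketch below is a self-contained, hypothetical illustration of that modular split, not the tool's real module API: a toy input module (Toy::Input::Lines, a made-up name) turns a file into timestamped events, a front-end routine picks the module by name, and a single output statement decides how every event is written.

<pre>
#!/usr/bin/perl
# Hypothetical illustration of a modular front-end / input-module split.
use strict;
use warnings;

# A toy "input module": one event per line of the file, stamped with the
# file's own modification time.
package Toy::Input::Lines;
sub new   { return bless {}, shift }
sub parse {
    my ($self, $path) = @_;
    my $mtime = (stat $path)[9] // 0;
    open my $fh, '<', $path or die "open $path: $!\n";
    my @events;
    while (my $line = <$fh>) {
        chomp $line;
        push @events, { time => $mtime, desc => $line };
    }
    return \@events;
}

package main;

my %input_modules = ( lines => 'Toy::Input::Lines' );   # name -> package

sub run_front_end {
    my ($module, $path) = @_;
    my $pkg = $input_modules{$module} or die "unknown input module '$module'\n";
    # Output side: every event becomes "epoch|description"; swapping this
    # one statement is what a different output module would amount to.
    printf "%d|%s\n", $_->{time}, $_->{desc} for @{ $pkg->new->parse($path) };
}

run_front_end('lines', $ARGV[0] // '/etc/hostname');   # placeholder input file
</pre>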

The tool (current version: 0.51 nightly build 20102608) contains three front-ends:

* '''log2timeline''' - The main front-end: a tool that parses a single log file or directory passed to it, using a selected input module.
* '''timescanner''' - A recursive front-end that takes a directory, recursively goes through every file and subdirectory, and tries to parse each one with every (or a selected set of) input module, providing an automatic way of creating a super timeline (a sketch of this recursive approach follows the list).
* '''glog2timeline''' - A simple GUI front-end with capabilities similar to log2timeline, the main front-end.
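The following is a rough, self-contained sketch of the recursive idea behind timescanner rather than its actual implementation: walk a directory tree with File::Find, offer every regular file to each registered parser, keep whatever the parsers can extract, and sort the result into one timeline. The lone parser here (mtime_only, a made-up name) only records modification times; real input modules parse events out of file contents.

<pre>
#!/usr/bin/perl
# Sketch of a recursive "try every parser on every file" scan.
use strict;
use warnings;
use File::Find;

my %parsers = (
    mtime_only => sub {
        my ($path) = @_;
        my $mtime = (stat $path)[9];
        return [] unless defined $mtime;
        return [ { time => $mtime, desc => "last modified: $path" } ];
    },
);

my @timeline;
my $root = $ARGV[0] // '.';

find({ no_chdir => 1, wanted => sub {
    return unless -f $_;                 # with no_chdir, $_ is the full path
    for my $parse (values %parsers) {
        # A parser that cannot handle this file simply contributes nothing.
        my $events = eval { $parse->($_) };
        push @timeline, @$events if !$@ && $events;
    }
} }, $root);

printf "%d|%s\n", $_->{time}, $_->{desc}
    for sort { $a->{time} <=> $b->{time} } @timeline;
</pre>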

==Currently Supported Input Modules==

The currently supported input modules (as of version 0.51, nightly build 20102608) are:

* '''apache2_access''' - Parse the content of an Apache2 access log file
* '''apache2_error''' - Parse the content of an Apache2 error log file
* '''chrome''' - Parse the content of a Chrome history file
* '''evt''' - Parse the content of a Windows 2k/XP/2k3 Event Log
* '''evtx''' - Parse the content of a Windows Event Log file (EVTX)
* '''exif''' - Extract metadata from files using ExifTool
* '''ff_bookmark''' - Parse the content of a Firefox bookmark file
* '''firefox2''' - Parse the content of a Firefox 2 browser history
* '''firefox3''' - Parse the content of a Firefox 3 history file
* '''iehistory''' - Parse the content of an index.dat file containing IE history
* '''iis''' - Parse the content of an IIS W3C log file
* '''isatxt''' - Parse the content of an ISA text export log file
* '''mactime''' - Parse the content of a body file in the mactime format
* '''mcafee''' - Parse the content of a McAfee log file
* '''opera''' - Parse the content of Opera's global history file
* '''oxml''' - Parse the content of an OpenXML document (Office 2007 documents)
* '''pcap''' - Parse the content of a PCAP file
* '''pdf''' - Parse some of the available PDF document metadata
* '''prefetch''' - Parse the content of the Prefetch directory
* '''recycler''' - Parse the content of the recycle bin directory
* '''restore''' - Parse the content of the restore point directory
* '''setupapi''' - Parse the content of the SetupAPI log file in Windows XP
* '''sol''' - Parse the content of a .sol (LSO) file, also known as a Flash cookie
* '''squid''' - Parse the content of a Squid access log (http_emulate off)
* '''syslog''' - Parse the content of a Linux syslog log file
* '''tln''' - Parse the content of a body file in the TLN format (see the format sketch after this list)
* '''userassist''' - Parse the UserAssist key of an NTUSER.DAT registry file
* '''volatility''' - Parse the content of Volatility output files (psscan2, sockscan2, ...)
* '''win_link''' - Parse the content of a Windows shortcut (link) file
* '''wmiprov''' - Parse the content of the wmiprov log file
* '''xpfirewall''' - Parse the content of an XP Firewall log
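The mactime and tln entries above refer to two simple pipe-delimited text formats. Purely as an illustration of both (and not of the tool's actual tln module), the sketch below assumes the common five-field TLN layout, Time|Source|Host|User|Description with the time as a Unix epoch, and rewrites each TLN line read on standard input as one TSK body line, reusing the TLN timestamp for all four body time fields.

<pre>
#!/usr/bin/perl
# Convert TLN lines on stdin (Time|Source|Host|User|Description) into
# TSK body lines on stdout (MD5|name|inode|mode|UID|GID|size|atime|
# mtime|ctime|crtime). Purely illustrative; not the log2timeline tln module.
use strict;
use warnings;

while (my $line = <STDIN>) {
    chomp $line;
    my ($time, $source, $host, $user, $desc) = split /\|/, $line, 5;
    next unless defined $desc && $time =~ /^\d+$/;   # skip malformed lines
    my $name = "[$source] ($host/$user) $desc";
    # MD5, inode, mode, UID, GID and size are zeroed; the TLN timestamp
    # is reused for atime, mtime, ctime and crtime.
    print join('|', 0, $name, 0, 0, 0, 0, 0, $time, $time, $time, $time), "\n";
}
</pre>

The output can be saved to a file and rendered with mactime like any other body file.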

==Links==

; [http://log2timeline.net log2timeline web site]
; [http://www.sans.org/reading_room/whitepapers/logging/mastering-super-timeline-log2timeline_33438 SANS GCFA Gold paper about the tool]
; [http://blogs.sans.org/computer-forensics/2010/03/19/digital-forensic-sifting-super-timeline-analysis-and-creation/ A quick run-through of how to create a super timeline]
; [http://blog.kiddaland.net/2009/08/log2timeline-artifact-timeline-analysis-part-i/ A blog post introducing the tool]
; [https://blogs.sans.org/computer-forensics/2009/08/13/artifact-timeline-creation-and-analysis-tool-release-log2timeline/ Part 1 of the SANS forensic blog post about the tool]
; [https://blogs.sans.org/computer-forensics/2009/08/14/artifact-timeline-creation-and-analysis-part-2/ Part 2 of the SANS forensic blog post about the tool]