Research Topics

Latest revision as of 21:33, 25 September 2014

Interested in doing research in computer forensics? Looking for a master's topic, or just some ideas for a research paper? Here is our list. Please feel free to add your own ideas.

Many of these would make a nice master's project.

Programming/Engineering Projects

tcpflow
  • Modify tcpflow's iptree.h implementation so that it only stores discriminating bit prefixes in the tree, similar to D. J. Bernstein's Crit-bit trees (http://cr.yp.to/critbit.html).
  • Determine why tcpflow's iptree.h implementation's prune works differently when caching is enabled than when it is disabled.
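tcpflow itself is C++, but the discriminating-bit idea can be sketched in a few lines of Python: internal nodes store only the index of the first bit on which their two subtrees differ, so a lookup inspects one bit per node instead of comparing whole prefixes. All class and function names below are illustrative and do not correspond to anything in iptree.h.

```python
# Minimal crit-bit trie over 32-bit keys (e.g. IPv4 addresses).
# Internal nodes hold only the discriminating bit index; leaves hold keys.
# Illustrative sketch only -- names do not match tcpflow's iptree.h.

class Leaf:
    def __init__(self, key):
        self.key = key

class Node:
    def __init__(self, bit, left, right):
        self.bit = bit      # index of the discriminating bit (0 = MSB)
        self.left = left    # subtree where that bit is 0
        self.right = right  # subtree where that bit is 1

def _bit(key, i):
    return (key >> (31 - i)) & 1

def insert(root, key):
    if root is None:
        return Leaf(key)
    # Walk to the leaf this key would currently match.
    node = root
    while isinstance(node, Node):
        node = node.right if _bit(key, node.bit) else node.left
    if node.key == key:
        return root
    # Index (from the MSB) of the first bit where key and leaf differ.
    crit = 32 - (key ^ node.key).bit_length()
    def _ins(t):
        # Descend past internal nodes that discriminate earlier bits,
        # then splice a new node at the critical depth.
        if isinstance(t, Node) and t.bit < crit:
            if _bit(key, t.bit):
                return Node(t.bit, t.left, _ins(t.right))
            return Node(t.bit, _ins(t.left), t.right)
        new_leaf = Leaf(key)
        if _bit(key, crit):
            return Node(crit, t, new_leaf)
        return Node(crit, new_leaf, t)
    return _ins(root)

def contains(root, key):
    node = root
    while isinstance(node, Node):
        node = node.right if _bit(key, node.bit) else node.left
    return node is not None and node.key == key

# Demo: three addresses inserted into an empty trie.
root = None
for ip in (0x0A000001, 0x0A000002, 0xC0A80001):
    root = insert(root, ip)
```

Because each internal node records a single bit index, a lookup touches at most one node per differing bit, which is the property the bullet above asks to carry into iptree.h.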
SleuthKit
  • Write a FUSE-based mounter for SleuthKit, so that disk images can be forensically mounted using TSK.
  • Modify SleuthKit's API so that the physical location on disk of compressed files can be learned.

Digital Forensics Education

  • Survey existing DFE programs and DF practitioners regarding which tools they use. Report if the tools being taught are the same as the tools that are being used.

Data Sniffing

  • Create a method to detect NTFS-compressed cluster blocks on a disk (raw data stream). One approach is to write a generic signature that detects the beginning of NTFS-compressed file segments on a disk. Such a method would be useful in carving and in scanning for textual strings.
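One candidate signature, stated here as an assumption to validate against real volumes rather than a documented guarantee: NTFS compresses data with LZNT1, whose compressed chunks begin with a 16-bit little-endian header in which bit 15 flags compression and bits 12–14 hold a fixed signature value of 3 (see the MS-XCA specification). A naive scanner can flag offsets that look like such headers; roughly 1 in 16 random words also matches, so hits must be verified, e.g. by attempting decompression.

```python
# Heuristic scan for LZNT1 compressed-chunk headers in a raw byte stream.
# Assumed header layout (per MS-XCA): 16-bit little-endian word where
# bit 15 = "chunk is compressed", bits 12-14 = fixed signature value 3,
# bits 0-11 = compressed chunk size field.
import struct

def plausible_lznt1_header(word):
    return (word & 0x8000) != 0 and (word >> 12) & 0x7 == 3

def scan_for_lznt1(data):
    """Return every offset whose 16-bit LE value looks like a compressed
    LZNT1 chunk header. About 1 in 16 random words also matches, so
    candidates must be verified (e.g. by attempting decompression)."""
    hits = []
    for off in range(len(data) - 1):
        word = struct.unpack_from("<H", data, off)[0]
        if plausible_lznt1_header(word):
            hits.append(off)
    return hits

# Demo buffer with one plausible header (0xB234) at offset 8.
sample = b"\x00" * 8 + b"\x34\xb2" + b"\x00" * 8
```

A real tool would tighten the heuristic by following the size field from one candidate header to the next and requiring a consistent chain across a compression unit.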

Anti-Forensics Detection

  • A pluggable rule-based system that can detect the residual data or other remnants of running a variety of anti-forensics software

Carvers

Develop a new carver with a plug-in architecture and support for fragment reassembly carving. Take a look at:

  • Carver 2.0 Planning Page
  • Multimedia File Carver (https://github.com/rpoisel/mmc) by Rainer Poisel (rainer.poisel@gmail.com), which allows for the reassembly of fragmented multimedia files.

Correlation Engine

  • Logfile correlation
  • Document identity identification
  • Correlation between stored data and intercept data
  • Online Social Network Analysis
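As a toy illustration of the logfile-correlation item above, the sketch below merges timestamped entries from several sources and groups entries that fall within a fixed window of one another. The field layout and the 5-second window are arbitrary choices for the example, not a proposed design.

```python
from datetime import datetime, timedelta

# Toy logfile correlation: merge timestamped events from several sources
# and group events that occur close together in time.

def correlate(streams, window=timedelta(seconds=5)):
    """streams: dict of source name -> list of (datetime, message).
    Returns chronologically sorted groups of (timestamp, source, message)
    tuples, split wherever the gap between events exceeds the window."""
    events = sorted(
        (ts, src, msg)
        for src, entries in streams.items()
        for ts, msg in entries
    )
    groups, current = [], []
    for ev in events:
        if current and ev[0] - current[-1][0] > window:
            groups.append(current)
            current = []
        current.append(ev)
    if current:
        groups.append(current)
    return groups

# Demo: two sources active within seconds of each other, one much later.
t0 = datetime(2014, 9, 25, 21, 33, 0)
streams = {
    "auth": [(t0, "login alice")],
    "web":  [(t0 + timedelta(seconds=2), "GET /admin")],
    "app":  [(t0 + timedelta(minutes=5), "report generated")],
}
groups = correlate(streams)
```

A real correlation engine would add clock-skew compensation and per-source parsers; this only shows the grouping core.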

Data Snarfing/Web Scraping

  • Find and download in a forensically secure manner all of the information in a social network (e.g. Facebook, LinkedIn, etc.) associated with a targeted individual.
  • Determine who is searching for a targeted individual. This might be done with a honeypot, or documents with a tracking device in them, or some kind of covert Facebook App.
  • Automated grouping/annotation of low-level events (e.g. access time, log-file entry) into higher-level events (e.g. program start, login).
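The grouping/annotation idea in the last bullet can be illustrated with a rule table that maps ordered sequences of low-level event types to a higher-level label. Every rule, event type, and label below is invented for the example; a real system would derive rules from known OS behavior.

```python
# Rule-based promotion of low-level forensic events to higher-level ones.
# A rule fires when its event types appear in order (not necessarily
# adjacently) in the low-level event stream. All names are hypothetical.

RULES = {
    ("prefetch-created", "registry-run-key-read"): "program start",
    ("logon-event", "profile-access-time"): "user login",
}

def annotate(low_level):
    """low_level: ordered list of event-type strings.
    Returns the high-level labels whose patterns occur as a
    subsequence of the input."""
    labels = []
    for pattern, label in RULES.items():
        it = iter(low_level)
        if all(step in it for step in pattern):  # subsequence test
            labels.append(label)
    return labels
```

The `step in it` idiom consumes the shared iterator, so each rule matches only if its steps occur in the given order.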


Enhancements for Guidance Software's EnCase

  • Develop an EnScript that allows you to script EnCase from Python. (You can do this because EnScripts can run arbitrary DLLs. The EnScript calls the DLL. Each "return" from the DLL is a specific EnCase command to execute. The EnScript then re-enters the DLL.)
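The control flow this bullet describes (the EnScript calls the DLL, each return carries one command, the EnScript executes it and re-enters) can be mocked in Python with a generator playing the DLL side. The command vocabulary and the host stub below are entirely hypothetical and are not EnCase's API.

```python
# Sketch of the EnScript <-> DLL command loop described above, with a
# Python generator playing the DLL side. Each `yield` hands one command
# string to the EnScript host; the value sent back in is that command's
# result. All command names and the host stub are invented.

def analysis_script():
    case = yield "OpenCase"              # host runs this, then re-enters
    files = yield f"ListFiles {case}"
    for f in files:
        yield f"Bookmark {f}"

def run(script, host):
    """Drive a generator script against a host callable standing in for
    EnCase: the host executes exactly one command per re-entry."""
    gen = script()
    result = None
    try:
        while True:
            result = host(gen.send(result))
    except StopIteration:
        pass

# Demo host that records the commands it is asked to execute.
log = []
def fake_host(command):
    log.append(command)
    if command == "OpenCase":
        return "case1"
    if command.startswith("ListFiles"):
        return ["a.txt", "b.txt"]
    return None

run(analysis_script, fake_host)
```

In the real project the `host` role would be the EnScript dispatching on the string returned from the DLL and calling back into it with the result.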

Volume/File System analysis

  • Analysis of inter-snapshot changes in Windows Shadow Volumes.
  • Modify SleuthKit's NTFS implementation to support NTFS encrypted files (EFS).
  • Extend SleuthKit's implementation of NTFS to cover Transactional NTFS (TxF) (see NTFS).
  • Physical layer access to flash storage (requires reverse-engineering proprietary APIs for flash USB and SSD storage).
  • Add support to SleuthKit for ReFS.

Error Rates

  • Develop improved techniques for identifying encrypted data. (It's especially important to distinguish encrypted data from compressed data).
  • Quantify the error rate of different forensic tools and processes. Are these rates theoretical or implementation dependent? What is the interaction of the error rates and the Daubert standard?

Research Areas

These are research areas that could easily grow into a PhD thesis.

  • General-purpose detection of:
    • Steganography
    • Sanitization attempts
    • Evidence falsification (perhaps through inconsistencies in file system allocations, application data allocation, and log file analysis).
  • Visualization of data/information in a digital forensic context.
  • SWOT analysis of current visualization techniques in forensic tools; possible improvements; feasibility of 3D representation.

See Also

  • Digital Forensics: Research Challenges and Open Problems, Dr. Yong Guan, Iowa State University, Dec. 4, 2007 (http://itsecurity.uiowa.edu/securityday/documents/guan.pdf)
  • Forensic Focus: Project Ideas for Digital Forensics Students (http://www.forensicfocus.com/project-ideas)