Research Topics

From Forensics Wiki
Latest revision as of 07:35, 10 September 2013

Interested in doing research in computer forensics? Looking for a master's topic, or just some ideas for a research paper? Here is our list. Please feel free to add your own ideas.

Many of these would make a nice master's project.

Programming/Engineering Projects

Small-Sized Projects

SleuthKit
  • Rewrite SleuthKit sorter in C++ to make it faster and more flexible.
tcpflow
  • Modify tcpflow's iptree.h implementation so that it only stores discriminating bit prefixes in the tree, similar to D. J. Bernstein's Crit-bit trees.
  • Determine why tcpflow's iptree.h implementation's prune works differently when caching is enabled than when it is disabled.
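
As background for the crit-bit suggestion above, here is a minimal sketch (hypothetical Python, not tcpflow's actual C++ iptree.h) of a trie over 32-bit IPv4 addresses in which each internal node stores only the index of the first bit at which its two subtrees differ:

```python
class Leaf:
    def __init__(self, key):
        self.key = key

class Node:
    def __init__(self, bit, left, right):
        self.bit = bit        # index of the discriminating bit (0 = MSB)
        self.left = left      # subtree where that bit is 0
        self.right = right    # subtree where that bit is 1

def _bit(key, i):
    """i-th bit of a 32-bit key, counted from the most significant bit."""
    return (key >> (31 - i)) & 1

def contains(root, key):
    node = root
    while isinstance(node, Node):
        node = node.right if _bit(key, node.bit) else node.left
    return node is not None and node.key == key

def insert(root, key):
    if root is None:
        return Leaf(key)
    # 1. Walk to the nearest leaf to find the first bit where key differs.
    node = root
    while isinstance(node, Node):
        node = node.right if _bit(key, node.bit) else node.left
    if node.key == key:
        return root
    crit = 32 - (key ^ node.key).bit_length()  # index of first differing bit
    leaf = Leaf(key)

    def split(old):
        # New internal node discriminating on bit `crit`.
        return Node(crit, old, leaf) if _bit(key, crit) else Node(crit, leaf, old)

    # 2. Re-descend and splice the new node in bit-index order.
    if isinstance(root, Leaf) or root.bit > crit:
        return split(root)
    node = root
    while True:
        child = node.right if _bit(key, node.bit) else node.left
        if isinstance(child, Leaf) or child.bit > crit:
            if _bit(key, node.bit):
                node.right = split(child)
            else:
                node.left = split(child)
            return root
        node = child
```

Because only discriminating bit positions are stored, a lookup inspects at most 32 bits no matter how many addresses the tree holds; that is the property a crit-bit variant of iptree.h would exploit.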

Medium-Sized Non-Programming Projects

Digital Forensics Education

  • Survey existing DFE programs and DF practitioners regarding which tools they use. Report if the tools being taught are the same as the tools that are being used.

Improving the quality of forensic examination reports

  • The defense asks you: "When did you update your antivirus program during the forensic examination?" What will you reply: the date; the date and hour; or the date, hour, and minute? How many virus signatures can be added, and then excluded as false positives, within 24 hours? Does the mirroring of signature-update servers make answers at hour or minute granularity useless?

Medium-Sized Development Projects

Forensic File Viewer

  • Create a program that visualizes the contents of a file, sort of like hexedit, but with other features:
    • Automatically pull out the strings
    • Show histogram
    • Detect crypto and/or steganography.
  • Extend SleuthKit's fiwalk to report NTFS alternate data streams.
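
The viewer features listed above (string extraction, histogram, crypto/steganography flagging) all reduce to simple byte-level statistics. A minimal sketch (illustrative only; the minimum string length and any entropy thresholds are arbitrary choices):

```python
import math
import re

def histogram(data):
    """Count of each byte value 0-255: the raw input for a histogram view."""
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    return counts

def entropy(data):
    """Shannon entropy in bits per byte. Values near 8 suggest compressed,
    encrypted, or otherwise statistically dense data."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in histogram(data) if c)

def strings(data, minlen=4):
    """Extract runs of printable ASCII, like the Unix strings(1) tool."""
    pattern = rb'[\x20-\x7e]{%d,}' % minlen
    return [m.group().decode('ascii') for m in re.finditer(pattern, data)]
```

A viewer would run these over a sliding window so the user can see where in the file the entropy spikes or the strings cluster.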

Data Sniffing

  • Create a method to detect NTFS-compressed cluster blocks on a disk (raw data stream). One approach is a generic signature that detects the beginning of NTFS-compressed file segments on disk. Such a method would be useful in carving and in scanning for textual strings.
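
One concrete form such a signature could take: per Microsoft's MS-XCA specification, each LZNT1 chunk begins with a little-endian 16-bit header whose bits 12-14 hold the signature value 3, whose bit 15 flags a compressed chunk, and whose low 12 bits encode the chunk size minus three. A single matching header is a weak signal, so this sketch (illustrative, not a validated signature) requires several chunks to chain together:

```python
import struct

def lznt1_chunks(buf, offset):
    """Follow a chain of plausible LZNT1 chunk headers starting at offset;
    return how many consecutive valid-looking chunks were found."""
    count = 0
    pos = offset
    while pos + 2 <= len(buf):
        (hdr,) = struct.unpack_from('<H', buf, pos)
        if hdr == 0 or (hdr >> 12) & 0x7 != 3:  # 3-bit signature must be 3
            break
        size = (hdr & 0x0FFF) + 3               # chunk size including header
        count += 1
        pos += size
    return count

def scan(buf, min_chain=2):
    """Report offsets where at least min_chain plausible chunks chain."""
    return [o for o in range(0, len(buf) - 1)
            if lznt1_chunks(buf, o) >= min_chain]
```

Requiring a chain of two or more chunks sharply cuts the false-positive rate, since a random 16-bit value matches the signature bits about one time in eight.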

SleuthKit Modifications

  • Write a FUSE-based mounter for SleuthKit, so that disk images can be forensically mounted using TSK.
  • Modify SleuthKit's API so that the physical location on disk of compressed files can be learned.

Anti-Forensics Detection

  • A pluggable rule-based system that can detect the residual data or other remnants of running a variety of anti-forensics software
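
Such a system might separate the rule set from the scanning engine, so new tools can be covered without code changes. A minimal sketch (the tool names are real anti-forensics utilities, but the artifact paths are illustrative assumptions, not verified indicators):

```python
ANTI_FORENSICS_RULES = [
    # Artifact names below are illustrative assumptions, not verified IOCs.
    {"tool": "Eraser (secure deletion)",
     "artifacts": ["eraser.exe", "eraser.log"]},
    {"tool": "Timestomp (timestamp manipulation)",
     "artifacts": ["timestomp.exe"]},
]

def detect(file_list, rules=ANTI_FORENSICS_RULES):
    """Return (tool, matched artifacts) for every rule that fires against
    a list of file paths recovered from the evidence."""
    lowered = [f.lower() for f in file_list]
    hits = []
    for rule in rules:
        matched = [a for a in rule["artifacts"]
                   if any(path.endswith(a) for path in lowered)]
        if matched:
            hits.append((rule["tool"], matched))
    return hits
```

A production version would load rules from external files and match on hashes, registry keys, and residual data patterns rather than filenames alone.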

Carvers

Develop a new carver with a plug-in architecture and support for fragment reassembly carving. Take a look at:

  • Carver 2.0 Planning Page
  • (Rainer Poisel, rainer.poisel@gmail.com) Multimedia File Carver (https://github.com/rpoisel/mmc), which allows for the reassembly of fragmented multimedia files
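
The plug-in idea can be sketched as a core that runs format-specific plug-ins over the raw image (illustrative code; the plug-in interface and the naive header/footer strategy are assumptions, not a design from the planning page):

```python
def carve(image, plugins):
    """Run each carving plug-in over a raw image; each plug-in returns
    (file_type, start_offset, end_offset) tuples."""
    results = []
    for plugin in plugins:
        results.extend(plugin(image))
    return sorted(results, key=lambda r: r[1])

def jpeg_plugin(image):
    """Naive header/footer carving: SOI marker (FF D8 FF) to EOI (FF D9)."""
    found, pos = [], 0
    while True:
        start = image.find(b'\xff\xd8\xff', pos)
        if start < 0:
            break
        end = image.find(b'\xff\xd9', start + 3)
        if end < 0:
            break
        found.append(('jpeg', start, end + 2))
        pos = end + 2
    return found
```

Header/footer matching is only the baseline; a fragment-reassembly plug-in would instead emit candidate fragments and score orderings of them (e.g. by whether the decoder accepts the concatenation), which is where the research interest lies.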

Correlation Engine

  • Logfile correlation
  • Document identity identification
  • Correlation between stored data and intercept data
  • Online Social Network Analysis
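
Logfile correlation at its simplest pairs events from two sources that fall within a time window. A sketch (illustrative; a real engine needs clock-skew handling and richer matching than timestamps alone):

```python
def correlate(log_a, log_b, window=5.0):
    """Pair events from two logs whose timestamps differ by at most
    `window` seconds. Each log is a list of (timestamp, message) tuples,
    assumed sorted by timestamp."""
    pairs, j = [], 0
    for ts_a, msg_a in log_a:
        # Advance past log_b events too old to ever match again.
        while j < len(log_b) and log_b[j][0] < ts_a - window:
            j += 1
        k = j
        while k < len(log_b) and log_b[k][0] <= ts_a + window:
            pairs.append(((ts_a, msg_a), log_b[k]))
            k += 1
    return pairs
```

Because both logs are consumed in timestamp order, the scan is linear in the total number of events plus the number of emitted pairs.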

Data Snarfing/Web Scraping

  • Find and download, in a forensically sound manner, all of the information in a social network (e.g. Facebook, LinkedIn) associated with a targeted individual.
  • Determine who is searching for a targeted individual. This might be done with a honeypot, or documents with a tracking device in them, or some kind of covert Facebook App.
  • Automated grouping/annotation of low-level events (e.g. access time, log-file entry) into higher-level events (e.g. program start, login)

Timeline analysis

  • Mapping differences and similarities across multiple versions of a system, e.g. (but not limited to) the versions created by Windows Shadow Volumes
  • Write a new timeline viewer that supports logfile fusion (with offsets) and provides the ability to view the logfile in the frequency domain.
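
The frequency-domain idea can be demonstrated with a plain DFT over per-second event counts (a sketch assuming integer timestamps in [0, duration); a real viewer would use an FFT library and window the data):

```python
import cmath

def activity_spectrum(timestamps, duration):
    """Bucket event timestamps into 1-second bins and return the DFT
    magnitudes for frequencies 1 .. duration//2 cycles per `duration`
    seconds. Periodic activity (beaconing, cron jobs) shows up as a
    dominant magnitude."""
    counts = [0] * duration
    for t in timestamps:              # assumes 0 <= t < duration
        counts[int(t)] += 1
    n = duration
    mags = []
    for k in range(1, n // 2 + 1):
        x = sum(counts[m] * cmath.exp(-2j * cmath.pi * k * m / n)
                for m in range(n))
        mags.append(abs(x))           # mags[k-1] = strength of frequency k
    return mags
```

An event that fires every two seconds over a sixteen-second capture produces a clean peak at eight cycles per capture, i.e. 0.5 Hz.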

Enhancements for Guidance Software's EnCase

  • Develop an EnScript that allows you to script EnCase from Python. (You can do this because EnScripts can run arbitrary DLLs. The EnScript calls the DLL. Each "return" from the DLL is a specific EnCase command to execute. The EnScript then re-enters the DLL.)

Analysis of packet captures

  • Identify various types of DDoS attacks from capture files (pcap): extract attack statistics and the list of attacking bots, and determine the type of attack (TCP SYN flood, UDP/ICMP flood, HTTP GET/POST flood, HTTP flood with browser emulation, etc.).
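
For the SYN-flood case specifically, one detection heuristic compares SYN-only packets against completed handshakes per source. A sketch over pre-parsed packet tuples (the thresholds are illustrative assumptions; real pcap parsing would use a library such as dpkt or scapy):

```python
from collections import defaultdict

def syn_flood_sources(packets, threshold=100):
    """packets: iterable of (src_ip, tcp_flags) where tcp_flags is a set
    such as {'SYN'} or {'SYN', 'ACK'}. Flag sources whose bare-SYN count
    is large and vastly exceeds their ACK traffic (half-open floods)."""
    syns = defaultdict(int)
    acks = defaultdict(int)
    for src, flags in packets:
        if 'SYN' in flags and 'ACK' not in flags:
            syns[src] += 1
        elif 'ACK' in flags:
            acks[src] += 1
    return sorted(src for src in syns
                  if syns[src] >= threshold and syns[src] > 10 * acks[src])
```

The same counting skeleton extends to the other attack classes (per-source UDP/ICMP packet rates, HTTP request rates per session), which is what makes pcap-based attack classification a tractable project.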

Reverse-Engineering Projects

Application analysis

  • Reverse the on-disk structure of the Extensible Storage Engine (ESE) Database File (EDB) format to learn:
    • Fill in the missing information about older ESE databases
    • Exchange EDB (MAPI database), STM
    • Active Directory (Active Directory working document available on request)
  • Reverse the on-disk structure of the Lotus Notes Storage Facility (NSF)
  • Reverse the on-disk structure of Microsoft SQL Server databases

Volume/File System analysis

  • Analysis of inter snapshot changes in Windows Shadow Volumes
  • Modify SleuthKit's NTFS implementation to support NTFS encrypted files (EFS)
  • Extend SleuthKit's implementation of NTFS to cover Transactional NTFS (TxF) (see NTFS)
  • Physical layer access to flash storage (requires reverse-engineering proprietary APIs for flash USB and SSD storage.)
  • Add support to SleuthKit for ReFS.


Error Rates

  • Develop improved techniques for identifying encrypted data. (It is especially important to distinguish encrypted data from compressed data.)
  • Quantify the error rate of different forensic tools and processes. Are these rates theoretical or implementation dependent? What is the interaction of the error rates and the Daubert standard?
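
A starting point for the encrypted-versus-compressed bullet above: compressed data usually self-identifies through container magic numbers, while encrypted data is high-entropy with no recognizable structure. A sketch (the magic list is deliberately tiny, and the 7.5 bits/byte threshold is an arbitrary assumption; headerless compressed streams defeat this test, which is exactly why better techniques are a research topic):

```python
import math
import random
import zlib

MAGICS = {  # a few well-known compressed-container signatures
    b'\x1f\x8b': 'gzip',
    b'\x78\x01': 'zlib', b'\x78\x9c': 'zlib', b'\x78\xda': 'zlib',
    b'PK\x03\x04': 'zip',
}

def entropy(data):
    """Shannon entropy in bits per byte."""
    n = len(data)
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def classify(data):
    for magic, name in MAGICS.items():
        if data.startswith(magic):
            return 'compressed (%s)' % name
    if entropy(data) > 7.5:         # threshold is an assumption
        return 'possibly encrypted'
    return 'other'
```

Quantifying how often this classifier is wrong on a labeled corpus, and how those error rates interact with admissibility standards, is precisely the error-rate research proposed above.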

Research Areas

These are research areas that could easily grow into a PhD thesis.

  • General-purpose detection of:
    • Steganography
    • Sanitization attempts
    • Evidence falsification (perhaps through inconsistencies in file system allocations, application data allocation, and log file analysis)
  • Visualization of data/information in a digital forensic context
  • SWOT analysis of current visualization techniques in forensic tools; improvements; feasibility of 3D representation

See Also

  • Digital Forensics: Research Challenges and Open Problems, Dr. Yong Guan, Iowa State University, Dec. 4, 2007 (http://itsecurity.uiowa.edu/securityday/documents/guan.pdf)
  • Forensic Focus: Project Ideas for Digital Forensics Students (http://www.forensicfocus.com/project-ideas)