SCAN AN ENTIRE WEBSITE
The Dark Crawler recursively traverses web links it discovers on the target site, enabling the user to detect any suspect content hosted on the site.
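The recursive traversal described above can be pictured as a breadth-first crawl that stays within the target site. The sketch below is illustrative only, assuming plain HTML pages fetched over HTTP; names such as `crawl_site` and `LinkParser` are invented for this example and are not the Dark Crawler's actual API.

```python
# Minimal breadth-first site crawler (illustrative sketch, not the real tool).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collects href targets from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl_site(start_url, max_pages=100):
    """Breadth-first traversal of links discovered on the target site."""
    domain = urlparse(start_url).netloc
    seen, queue, pages = {start_url}, deque([start_url]), []
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable page: skip it and keep crawling
        pages.append(url)
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # Stay on the target site; off-site links would instead feed
            # the e-neighborhood map described in the next section.
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages
```

Each fetched page's content could then be screened for suspect material before the crawl moves on.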
MAP SITE’S “E-NEIGHBORHOOD”
The Dark Crawler exposes websites in the target site's "e-neighborhood" that are likely to host similar content.
By analyzing which websites contain web links to the target site, and which web links are found on the target site itself, the Dark Crawler establishes a map of the site's e-neighborhood.
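In outline, the e-neighborhood is the set of sites directly connected to the target by inbound or outbound links. The toy sketch below shows the idea; the link lists are hypothetical inputs, whereas the real tool derives them by crawling.

```python
# Illustrative sketch of the e-neighborhood idea: a site is a candidate
# neighbour if the target links to it, or if it links to the target.
def e_neighborhood(target, outlinks, inlinks):
    """Return the set of domains directly connected to `target`.

    outlinks: domains the target site links to
    inlinks:  domains observed linking to the target site
    """
    neighbors = set(outlinks) | set(inlinks)
    neighbors.discard(target)
    return neighbors


# Hypothetical example data:
hood = e_neighborhood(
    "example-target.org",
    outlinks=["mirror-a.net", "forum-b.org"],
    inlinks=["forum-b.org", "index-c.com"],
)
# hood == {"mirror-a.net", "forum-b.org", "index-c.com"}
```

Sites appearing in both lists (linked to and linking back) are plausibly the strongest candidates for hosting similar content.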
STREAMLINE DATA COLLECTION…
Data collected by the Dark Crawler can be exported in multiple formats suited to the user's needs, whether for investigation or research.
GENERATE DATA SETS
The data can be exported in a structured format for further analysis.
Additionally, media content can be matched by hash against external databases.
ENRICH RESEARCH SAMPLE
Discovering websites with similar content is valuable for saturating a sample with more data.
Starting from a small number of manually identified sites, the sample grows as the Dark Crawler discovers more web links.
… AND INFORMATION SHARING
The ease of information sharing facilitates interagency cooperation and simplifies disclosing only the necessary data while keeping the rest confidential.
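The hash matching of media content mentioned above can be sketched as follows: compute a digest of each collected file and check it against a database of known hashes. This is a minimal illustration assuming SHA-256 digests; the `known` set stands in for whatever external database an agency provides.

```python
# Illustrative hash-matching sketch: flag collected media whose digest
# appears in an external database of known hashes.
import hashlib


def sha256_digest(data: bytes) -> str:
    """Hex SHA-256 digest of a media file's bytes."""
    return hashlib.sha256(data).hexdigest()


def match_media(files: dict, known_hashes: set) -> list:
    """Return names of files whose digest appears in the external database."""
    return [name for name, data in files.items()
            if sha256_digest(data) in known_hashes]


# Hypothetical example: one flagged file, one benign file.
known = {sha256_digest(b"flagged-image-bytes")}
hits = match_media(
    {"a.jpg": b"flagged-image-bytes", "b.jpg": b"benign-bytes"},
    known,
)
# hits == ["a.jpg"]
```

Sharing only digests rather than the media itself is one way disclosure can be limited to the necessary data while the rest stays confidential.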