The trend and behaviour analysis performed by the iCOP engine identifies candidate child sexual abuse (CSA) media. This media is then sent to the Media Analysis component, where it is verified for pornographic content and prioritised. Cutting-edge feature extraction and machine learning techniques are used to:
iCOP's content classification engine automatically categorises pictures and video scenes as "harmless", "offensive", or "CSA". Though such visual recognition is a challenging problem (and detection rates are far from human accuracy), it offers efficient prioritisation or filtering of evidence even in cases where search queries or filenames (like "000256.jpg") are uninformative. The iCOP approach offers two key benefits:
When CSA material is detected, it is important for investigators to know whether it is entirely new or already known. Therefore, iCOP supports efficient matching of candidate material against police databases collected in previous cases. In contrast to standard methods based on file hashes, the iCOP approach supports similarity-based matching, which allows the identification of near-duplicates (like resized or re-encoded images) as well as other content from the same scene or shoot. iCOP supports both images and videos, and exploits approximate nearest-neighbour search techniques to operate on large-scale datasets.
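To illustrate the similarity-matching idea, the sketch below uses a simple perceptual difference hash (dHash) so that near-duplicates, such as re-encoded or brightness-shifted copies, map to nearby hashes, unlike exact file hashes which change on any byte-level edit. This is a minimal illustration, not iCOP's actual feature pipeline: images are stubbed as 2-D grayscale lists, and the linear scan in `best_match` stands in for the approximate nearest-neighbour index a large-scale system would use.

```python
def resize_nearest(img, w, h):
    """Downscale a grayscale image (list of rows) by nearest-neighbour sampling."""
    H, W = len(img), len(img[0])
    return [[img[y * H // h][x * W // w] for x in range(w)] for y in range(h)]

def dhash(img, size=8):
    """Difference hash: one bit per comparison of horizontally adjacent
    pixels on a small (size+1) x size grid; robust to global changes
    in brightness or scale that preserve local ordering."""
    small = resize_nearest(img, size + 1, size)
    bits = 0
    for row in small:
        for x in range(size):
            bits = (bits << 1) | (1 if row[x] > row[x + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def best_match(query, database, max_dist=10):
    """Return the name of the closest known item, or None if nothing is
    within max_dist bits. A linear scan for clarity; real systems replace
    this with an approximate nearest-neighbour index (e.g. LSH)."""
    dist, name = min((hamming(dhash(query), h), n) for n, h in database)
    return name if dist <= max_dist else None
```

A uniformly brightened copy of an image keeps every pixel-order comparison intact, so its dHash is identical and it still matches the original database entry, which is precisely what a byte-exact file hash cannot do.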