Child sexual abuse material: Half of viewers then contact children, study finds

A study investigating the behaviours of people who search for and view child sexual abuse material (CSAM) on the dark web found that nearly half of the respondents sought direct contact with a child after viewing the material.
The research, conducted by the Finnish organisation Protect Children, gathered responses from over 1,000 German-language participants. The study also found that 51% first saw the material accidentally, and 70% were minors the first time they came across it.
“We saw a need to highlight the urgency of the issue in Germany. The amount of CSAM hosted in Germany has dramatically increased almost tenfold between 2020 and 2022, making it one of the EU’s largest CSAM hosts,” Tegan Insoll, Head of Research at Protect Children, told EURACTIV.
At the same time, according to the German federal government, the increase in CSAM-related investigative cases is "due to a higher investigation rate and not reflecting an equally higher rate of the actual cases".
In May 2022, the European Commission put forward legislation to fight the online and offline dissemination of CSAM. The EU proposal has stirred controversy, as it would allow judicial authorities to mandate the scanning of private messages to detect suspicious content.
Germany has been a vocal opponent of measures that might compromise end-to-end encryption, a technology that would be at odds with such mandatory scanning.
In comments dated 15 March, reported on by EURACTIV, Berlin categorically requested that “no technologies will be used which disrupt, weaken, circumvent or modify encryption”.
Protect Children’s Insoll acknowledged that Germany’s ‘hesitancy’ was the reason for focusing the research on the country.

Child sexual abuse: Data retention, quick removals top concerns for EU states
In formal comments on a draft law seeking to fight child sexual abuse material (CSAM), EU countries have highlighted end-to-end encryption, quick removals of such material, and preservation of evidence, according to internal comments seen by EURACTIV.
The analysis also found that nearly half of the respondents said they wanted to stop viewing CSAM, while 65% had tried to stop.
For Konstantin Macher of the NGO Digitalcourage, policymakers should focus on prevention and rehabilitation, in particular for those respondents who said they want to stop searching for, using or sharing CSAM or illegal violent material.
Macher considers investing in prevention measures vital “so that such persons can be prevented from (re)offending”, arguing that the EU proposal falls short in this regard and redirects resources in the wrong direction.
In Germany, the Prevention Project Dunkelfeld offers a clinical support service providing free therapy to people with sexual desires towards children. The NGO Protect Children has also published research emphasising the importance of prevention, although that research did not focus on Germany.
However, for Emily Slifer, Director of Policy at Thorn, one of several child protection organisations that supported the latest report, “the findings emphasise that the detection, reporting and removal of CSAM do not only stop revictimisation but indeed help prevent further abuse from ever happening”.
Moreover, the research supports the mandatory approach of the EU anti-CSAM proposal, noting that “internet service providers must be obliged, by law, to detect, report, and remove child sexual abuse material from their platforms”.
However, MEP Patrick Breyer opposes this argument, stressing that there is no evidence that the indiscriminate scanning already implemented in non-encrypted environments has made communications platforms safer.
In addition, the EU proposal provides for detection orders not only to fight known CSAM but also new material and grooming, the practice whereby paedophiles try to lure children into meetings.
For Breyer, it is “unscientific to claim that child grooming is something you could ‘scan’ for”, adding that the CSAM proposal would disproportionately affect people’s private communications, whilst paedophile rings would easily circumvent the measures via the dark web.

EU watchdog: Online child abuse draft law creates ‘illusion of legality’
In a closed-door meeting with EU lawmakers, the European Data Protection Supervisor criticised the proposal to fight Child Sexual Abuse Material as trying to mask breaches of fundamental rights.
The European Data Protection Supervisor (EDPS), the authority responsible for advising EU …