The EU on Tuesday unveiled plans for stringent rules to tackle child sexual abuse, giving victims more time to bring their abusers to justice.
It is estimated that one in five children in the European Union suffers some form of sexual abuse or exploitation, European Commission Vice President Dubravka Suica told a news conference.
The commission is seeking to expand the range of criminal offenses related to child sexual abuse, with new technologies blamed for a proliferation of new forms of abuse.
Reported cases have been rising across the 27-country EU, with concerns that easy-to-use AI tools will spur an even bigger spread of harmful content.
In 2022, 1.5 million cases of child sexual abuse were reported across the bloc, up from one million in 2020, Suica told reporters in Strasbourg.
The new rules would update crime definitions to cover abusive material in deepfakes or AI-generated content, as well as the livestreaming of abuse.
New offenses would also include possessing or exchanging so-called pedophile “handbooks”, in which abusers give each other guidance, said Ylva Johansson, the EU’s home affairs commissioner.
The proposals will be debated by the European Parliament and the EU’s member states before any formal adoption.
“New technologies and the digital era we live in have, unfortunately, increased the threat and the abuse both offline and online,” Suica said.
The commission’s proposals would update rules from 2011.
The plans include changing statutes of limitations because, officials said, victims were all too often able to come forward only years after the abuse, by which time they could no longer mount a case.
Under the new rules, the limitation period would not begin until the victim turns 18, and it would run for 20 or 30 years depending on the gravity of the crime.
The aim was to “make sure that the perpetrator can still be prosecuted”, Johansson said.
Regulators are increasingly pressing the world’s biggest digital companies to do more to protect children online.
Big tech firms, including Google and Facebook-owner Meta, teamed up last year to tackle the issue under a new program called Lantern.
Under the program, the companies share signals about activity that violates their policies on child exploitation, allowing platforms to detect, take down and report harmful content more quickly.
The commission proposed a law in 2022 to stop the online spread of child sexual abuse imagery, but the proposal is currently blocked because some member states worry it would allow mass surveillance of private communications.