Steve Grocki is the Chief of the Criminal Division’s Child Exploitation & Obscenity Section, U.S. Department of Justice
The internet has many marketplaces where people can share child sexual abuse material (CSAM), as well as groom and solicit minor victims for sexual exploitation. In many instances, this illegal activity is happening in plain sight. Ongoing developments in technology, the move towards encryption, and widespread use of the Darkweb are hindering the identification and prosecution of crimes, and we are seeing this across the spectrum. The technology is specifically designed to hide people’s presence online, and traditional forms of investigation are being thwarted.
It is very difficult for the global community to keep pace with the vast, complex, and constantly evolving nature of the Internet, which is fundamentally global and borderless. There has clearly been an increase in the volume of CSAM online, particularly the production of this type of content, which increased threefold between 2008 and 2019.
Apart from the biggest four or five companies, only a very small fraction of Internet companies voluntarily monitor their networks for CSAM and report illegal content involving children. Take a look at the app stores or think of the number of websites out there, and you’ll quickly see how many Internet companies there are. In the U.S., Internet companies have complete civil immunity: they are insulated from civil liability even if they know that CSAM is being traded on their sites. This insulation from liability gives them little incentive to spend limited resources and finances on developing robust online child safety practices.
Change needs to happen to improve online safety for children. There is a lot of interest in the U.S., and we are working with foreign partners and Internet companies to institute a voluntary system outlining a baseline duty of care for child safety. In America, there is some appetite for legislative change (e.g. the EARN IT Act of 2020) regarding civil liability for companies that are grossly negligent or reckless with child protection online. In other countries, regulatory schemes are also being discussed and considered.
There is clear evidence – since the pandemic began, but also before it – of minors being enticed and groomed for self-production and other forms of child exploitation. It is a growing phenomenon, and sextortion is a part of it. We see offenders targeting platforms that minors use and grooming children to self-produce content.
The biggest problem is that adults and children share platforms for online gaming, live streaming, and social media, where it is quite difficult to know who is a child and who is an adult. It’s a worrying trend.
We need to change the law to provide more privacy protection for CSAM victims, who can develop something akin to celebrity status within offender communities. If their true identities become known, they can be stalked and harassed many years after their abuse and childhood have ended. Offenders can post and trade information to identify the person, and we need to provide heightened protection to safeguard individuals’ privacy before this can happen. Survivors already face the horror that recordings of their abuse continue to be circulated online and that global law enforcement doesn’t have the ability to remove them. Preventing harassment and stalking that can cause further harm is critical.
We get citizen reports about abuse content online. A lot of the time, someone has come across something problematic and the platform isn’t doing anything about it or is unaware of how it is being used to exploit children. These reports can be of great value because they signal where there are big problems, and we can flag those issues to Internet companies: when platforms are being exploited by offenders, when they aren’t meeting reporting requirements, or when children under the age restriction are accessing inappropriate content.
TOR (“the onion router”, a network that bounces traffic through random nodes, wrapping it in encryption each time, making it difficult to track) and the Darkweb are what concern me most because they enable offenders to exchange best practices with one another, to find out how others are succeeding, and to teach one another. Offenders and potential perpetrators can find a community that normalizes a sexual interest in children. For someone new, it is like going to college: they can learn the tricks of the trade without being identified, and this is a very scary development.
When you are looking at production, offenders generally don’t care what country children are from, especially when the victims are younger. They will target children wherever it is easy to do so, regardless of nationality or language. Sometimes language is a bit of a barrier, but sites have areas dedicated to specific common languages, and it is easy to use online translation services to understand communication in another language.
TOR gives great insight into how these global networks work. Hidden services (websites) on TOR operate as a global community, and offenders from all around the world are represented, although many of the larger players are in Europe and America. This cross-border activity increases the complexity enormously and makes it very difficult to investigate offenders when they are utilizing platforms outside of the U.S., even if the offenders themselves are based in America.
As offenders are located globally, we are more reliant than ever on foreign countries to respond. It makes investigations much more challenging because we have to employ international mechanisms, which can cause huge delays in getting access to evidence or offenders. It’s even harder to investigate and punish offenders when they are based in places like Africa, Asia, and Latin America, where law enforcement capacity and subject-matter expertise may be limited.
In many parts of the world there are fundamental resource deficits, which mean investigators and survivors don’t have access to the same legal remedies, victim support services, and online forensics that are available in the U.S. Through the U.S. State Department, we are sharing lessons learned in developing and implementing laws in America. For example, via the WePROTECT Global Alliance Model National Response, we have been training people working in African countries, where mobile phone infrastructure is improving, more kids are getting access to devices, and there is a corresponding increase in CSAM.
We are seeing many of the same things we come across in Western countries, but they are emerging at a much more accelerated pace in Africa. In Western countries, when people first obtained smartphones, tablets, and laptops, the same platforms weren’t available and cloud storage was much smaller. Now the online world is highly developed, and you are entering a realm that is far more dangerous. There are a vast number of people coming online who aren’t digital natives, don’t know the potential risks, and, as a result, may have difficulty keeping children safe.
This interview was shared as part of our 2021 report, Ending Online Sexual Exploitation and Abuse of Women and Girls: A Call for International Standards.