Child exploitation taking place on popular chat platform Discord is “horrifying”, the company’s chief executive has said.
Jason Citron said any cases of child sexual abuse material, grooming, or abduction linked to the service, which he founded in 2015, would be taken “very seriously”.
“As a parent, it’s horrifying,” he told Bloomberg’s Tech Summit event in San Francisco.
Mr Citron said Discord employed a dedicated child safety team, tasked with trying to prevent abuse “in a way that respects the privacy of all the people who are not doing these things”.
He was responding to an investigation by Sky News’ partner network, NBC News, which revealed dozens of prosecutions for child abuse in the US involving communications on the platform.
It included at least 35 child abduction, grooming or exploitation prosecutions, and 165 involving abuse material. One case saw a man kidnap a 12-year-old girl after meeting her in a video game via Discord, while another saw a teenage girl taken across state lines and raped after months of being groomed.
NBC News also identified hundreds of active Discord servers, the communities that make up the platform, which were promoting child exploitation.
What is Discord?
Many gamers use Discord to communicate while playing online.
Its popularity exploded during the pandemic and it now hosts many servers unrelated to gaming, whether for chats among friends or for official sources to post news. The UK Treasury even opened its own server last year, where announcements from the government and chancellor are posted.
Discord servers can be public, but many require users to be invited or to have a request to join approved. They support voice and video calling, text messaging, livestreaming, and file sharing.
But unlike messaging platforms such as WhatsApp and Signal, Discord does not offer end-to-end encryption, and moderation is largely left to the owners of individual servers and to volunteers.
Discord said in a recent blog post that its safety team “works with cutting-edge technology to detect and respond to abuse, both proactively and from reports received from users, moderators, and trusted third-party reporters”.
It disabled more than 37,000 accounts for child safety violations in the last quarter of 2022, it added.
Chief executive Mr Citron told the Bloomberg event the challenges were industry-wide, adding: “We have so many things happening at scale on the platform and it’s so hard to sort of identify things.”
He suggested AI could be used to help solve issues around child exploitation.