Child Predators Using Discord For Sextortion, Abductions
New Delhi: Discord, a chat platform popular among teens, is being used by some adults in hidden communities and chat rooms to groom children before abducting them, to trade child sexual abuse material (CSAM) and to extort minors whom they trick into sending nude images, the media reported.
According to NBC News, around 35 cases have been identified over the past six years in which adults were prosecuted on charges of "kidnapping, grooming or sexual harassment" that allegedly involved Discord communications.
Among those, at least 15 have resulted in guilty pleas or verdicts, with "many more" awaiting trial. Those figures only include cases that were reported, investigated, and prosecuted, all of which present significant challenges for victims and their advocates.
"What we see is only the tip of the iceberg," Stephen Sauer, the director of the tipline at the Canadian Centre for Child Protection (C3P), was quoted as saying.
The report also described a teen who, after being groomed on Discord for months, was taken across state lines, raped and found locked in a backyard shed in March, according to police. In another case, prosecutors say a 22-year-old man kidnapped a 12-year-old girl after meeting her in a video game and grooming her on Discord.
The report identified an additional 165 cases, including four crime rings, in which adults were prosecuted for transmitting or receiving CSAM via Discord or for allegedly using the platform to extort children into sending sexually graphic images of themselves, also known as sextortion.
The report noted that Discord is not the only tech platform grappling with the persistent problem of online child exploitation, as numerous reports over the past year have shown. According to an analysis of reports made to the US National Center for Missing & Exploited Children (NCMEC), reports of CSAM on Discord increased by 474 per cent from 2021 to 2022.
According to John Shehan, senior vice president of the NCMEC, child exploitation and abuse material has grown rapidly on Discord. "There is a child exploitation issue on the platform. That's undeniable," Shehan was quoted as saying.
Launched in 2015, Discord quickly emerged as a hub for online gamers and teens, and it is now used by more than 150 million people globally.
Last month, Discord notified users about a data breach following the compromise of a third-party support agent's account.
According to BleepingComputer, the agent's support ticket queue was compromised in the breach, exposing user email addresses, messages exchanged with Discord support, and any attachments sent as part of the tickets. In April, cyber-security researchers discovered a new malware strain being distributed over Discord, which has more than 300 million registered users.