Watchdog says tech giants still failing on child abuse

Savannah Meacham

Tech giants aren’t doing enough to crack down on online child sexual abuse.

Tech giants have been slammed for failing to crack down on online child sexual abuse material after a safety watchdog raised the alarm about ongoing failures.

But one of those mega-companies has doubled down, insisting it is successfully removing child abuse content and saying the watchdog focused on metrics rather than performance.

An eSafety report has revealed Apple, Google, Meta, Microsoft, Discord, WhatsApp, Snapchat and Skype are still not doing enough to stop online child sexual abuse even after three years of calls for action.

The watchdog reports Apple and Google’s YouTube were not tracking the number of user reports about child sexual abuse, nor could they say how long it took to respond to the allegations.

Commissioner Julie Inman Grant says tech giants still fail to curb rampant child abuse material. (Joel Carrett/AAP PHOTOS)

The companies also did not provide their number of trust and safety staff to the watchdog.

Child justice advocates have slammed the tech giants for a lack of reporting that leaves the true rate of online sexual abuse in the dark.

“They’ve had all these years of warning to say this is unacceptable and continue to have the same safety gaps and shortcomings from past reports,” International Justice Mission Australia chief executive David Braga told AAP.

“We’re talking about crimes here against children.”

eSafety Commissioner Julie Inman Grant said the tech companies’ failure to detail how many reports they received indicated a winding back of content moderation and safety policies.

“What worries me is when companies say, ‘We can’t tell you how many reports we’ve received’ … that’s bollocks, they’ve got the technology,” she told ABC Radio.

YouTube is lobbying against being included in a social media ban for children under the age of 16. (Dean Lewins/AAP PHOTOS)

It comes as YouTube argues against being included in a social media ban for Australians under 16 years of age on the basis that it is not a social media platform, but rather is often used as an educational resource.

A Google spokesperson said the company had been leading the industry fight against child sexual abuse “since day one” to remove the content from its platforms.

“eSafety’s comments are rooted in reporting metrics, not online safety performance,” the spokesperson said.

“More than 99 per cent of all child sexual exploitation or abuse content on YouTube is proactively detected and removed by our robust automated systems before it is flagged or viewed.

“Our focus remains on outcomes and detecting and removing child sexual exploitation or abuse on YouTube.”

eSafety’s report is rooted in metrics not performance, a spokesperson for YouTube owner Google says. (Joel Carrett/AAP PHOTOS)

The watchdog’s latest findings come three years after it uncovered that the platforms were not proactively detecting stored abuse material or using measures to find live-streams of child harm.

The latest report also criticises some platforms for not deploying tools to detect live-streams of child sex abuse, and others for not using the comparison technique called hash matching to detect and remove previously identified illicit content.

Some platforms also fail to use language analysis to detect grooming or sexual extortion, it found.

Justice advocates want the federal government to legislate digital duty of care laws that would make platforms take reasonable steps to prevent foreseeable harms.

“Digital duty of care would put the onus back onto the technology companies to make sure that the products that they provide, the way they design their business model, don’t facilitate the online sexual exploitation of children,” Mr Braga said.

Another watchdog report is due in 2026 with updates from the tech giants.

AAP