Tech behemoths lashed over child abuse protections

Jennifer Dudley-Nicholson

Major tech companies are being urged to do more to make their services safer for children.

Some of the world’s biggest technology companies are failing to take enough action against child sexual abuse material on their platforms, including its use in video calls, streaming services and messaging apps.

Apple, Google, Microsoft and Meta were among the companies urged to do more in an eSafety Commission report released on Thursday that found proactive detection tools were not widely used to tackle criminal behaviour such as online grooming and sexual extortion.

The findings come after the commission issued reporting notices to eight technology firms requiring the companies to detail actions to detect and remove child sexual exploitation and abuse material on their services.

The companies are required to report on their activities every six months, with fines of up to $825,000 a day if they do not respond.

Tech companies should be doing all they can to protect children, Julie Inman Grant says. (Mick Tsikas/AAP PHOTOS)

The second eSafety Commission report into their submissions found the companies made some improvements to detect online abuse, but eSafety Commissioner Julie Inman Grant said detection tools were lacking in many critical areas.

“I think the Australian public has an expectation that tech companies should be doing all they can, including to innovate using new technologies, to protect children from sexual exploitation and abuse,” she said.

“This is particularly important for certain features such as live video-calling and live-streaming, where proactive tools have not yet been implemented or developed across all services.”

Many well-known video-calling services did not feature proactive detection tools to identify child sexual abuse material, the report found, including Apple’s FaceTime, Microsoft Teams, Google Meet, Snapchat and Facebook Messenger.

The report also criticised Apple, Microsoft and Google for not using tools to detect new abuse material on some of their services, and for the lack of language analysis technology deployed in chat services to detect sexual extortion attempts.

The commission had shared a list of common scripts and language used in extortion cases with the tech companies to help them develop tools against it, Ms Inman Grant said.

Safety improvements had been tracked in some areas, however, as Apple enabled a communication safety feature by default for account holders under 13 years, Microsoft improved its detection of known abusive content, and Snapchat accelerated its response to content reports.

“The improvements, while more incremental than monumental, show that these platforms can improve when it comes to protecting the most vulnerable in our society,” Ms Inman Grant said.

“These companies have the resources and technical capability to make their services safer for not just children but all users and that is what these transparency reports are designed to achieve and are achieving.”

The eight companies, which include Discord and WhatsApp as well as the now-defunct Skype, will be required to update the commission again in March and August 2026.

AAP