Uniform standards needed for social media: cyber expert

Andrew Brown and Kat Wong

The federal government is creating a committee to investigate content people are exposed to online.

Social media giants should be required to have uniform standards in responding to harmful content online, a cyber safety expert has urged, as the federal government moves to examine the impact platforms have on users.

A parliamentary committee will be set up to put social media under the microscope, focusing on how platforms use algorithms to determine what users see, the extent of harmful and violent content produced, and its impact on mental health.

The committee’s formation comes as the federal government has been at loggerheads with social media giants over the removal of violent content following the stabbing of a church leader in Sydney in April.

Platforms such as X refused to comply with take-down requests from the internet safety watchdog. (Marion Rae/AAP PHOTOS)

Platforms such as X refused to comply with take-down requests from the internet safety watchdog after the stabbing.

Communications Minister Michelle Rowland said the inquiry was essential, with social media threatening Australia’s democracy and public safety.

“Social media has a civic responsibility to its Australian users and to our society more broadly,” she told ABC Radio on Friday.

“The decisions that have been made by social media platforms in recent months can really demonstrate the wide ranging negative impacts not only on our economy but also on our democratic institutions.

“Parliament needs to understand how social media companies dial up and down the content that supports healthy democracies, as well as the anti-social content that undermines public safety.”

Michelle Rowland: social media has a civic responsibility to Australian users. (Lukas Coch/AAP PHOTOS)

Cyber safety expert and former Victorian police officer Susan McLean said the inquiry needed to focus on how social media responds to distressing content online.

She said there were major disparities in how platforms acted on reports of violent or sexually explicit material.

“I had helped a family report a sextortion attempt of their young son on Instagram, Snapchat and TikTok,” she told AAP.

Instagram owner Meta “immediately acted and blocked the account but TikTok and Snapchat sent a message to say this doesn’t breach community standards”, Ms McLean said.

“There needs to be uniformity for ease of reporting and uniformity in the willingness to respond to sextortion; there needs to be action.”

While Ms McLean said the committee would address the impacts of social media, she said better education about the harms platforms pose was also needed.

“None of these inquiries and laws are going to work if we don’t get to educate the user into what they’re accepting when they sign up,” she said.

“We have to be mindful that people are continuing to be exposed to harmful content and it does have an impact on mental health.”

An inquiry will look at Facebook parent company Meta’s move to abandon deals with media companies. (Lukas Coch/AAP PHOTOS)

Meta’s decision to abandon deals with media companies to support public interest journalism will also be in focus.

“What is at stake here is the sustainability of the information ecosystem,” Ms Rowland said.

“Australians rely, and our democracy relies, on news that is trustworthy, that is informative and that is available.

“Meta’s decision to withdraw from news actually threatens (this).”

The opposition has urged the government to include social media sites as it considers a trial of age verification technology.

“Social media use can be immensely damaging for Australian children,” coalition spokesman David Coleman said.

“We must take action now to limit the access of children to platforms like Instagram and TikTok.”