
Why TikTok suicide video horror will happen again

Social media platforms are struggling to keep traumatising videos like the kind that took over TikTok this week off their sites, and the law is apparently powerless to stop them.

Media law expert and practice director of Perth law firm Cove Legal, Roger Blow, said governments around the world are “probably 10 years too late” in giving social media regulation the priority it deserves.

“As a result they’re now so far behind the eight ball it’s hard for them to try and reverse that trend,” he said.

“Their job is so much harder because they’ve left it so late.”

This week parents were warned to keep their children off TikTok after a horrifying video of a man taking his own life, which originally appeared on Facebook Live in August, was repeatedly uploaded to the viral video app.

The videos of Mississippi man Ronnie McNutt reportedly started trending while the platform struggled to find and delete them.

The Office of the eSafety Commissioner doesn’t have the power to lay charges, but if the content meets a criminal threshold, the Australian Federal Police can.

It’s understood the Commissioner hasn’t referred the Ronnie McNutt video to the AFP, and it’s unlikely it would meet that threshold anyway, as the laws at the AFP’s disposal primarily target things like child exploitation material and terrorist propaganda.

If a television station were to air such content it could have a case to answer to the Australian Communications and Media Authority (ACMA), but “social media platforms such as TikTok currently sit outside of the ACMA’s regulatory remit,” an ACMA spokesperson said.

Viewers who saw the footage reported being traumatised by the video, which was also insidiously cut into popular viral videos or cute cat videos in order to trick people into watching it.

TikTok said it has been automatically deleting the videos when they are detected and banning accounts that continue to upload them.

Australia’s eSafety Commissioner Julie Inman Grant said in a joint statement with Christine Morgan, the National Suicide Prevention Adviser to the Prime Minister, that her office has “been working closely with TikTok and other social media services to help limit the spread of the video”.

“This is yet another example of social media platforms struggling to incorporate safety protections at the core of their product offerings,” she said.

The video of Mr McNutt is the second suicide streamed live on Facebook in the past two months.

A Facebook spokesperson said the company already made changes to its live platform a year ago.

Those changes came after Australian terrorist Brenton Tarrant livestreamed himself murdering 51 people while they prayed at mosques in Christchurch in March 2019.

Prime Minister Scott Morrison on Wednesday echoed the calls for social media platforms to do a better job protecting their users.

“Those who run these organisations have a responsibility to those who are watching it, and particularly when it comes to children,” Mr Morrison said.

“The rules in the real world, how you behave in the real world … have to be the same in the social media world.

“You need to be accountable. You need to be responsible. My government will be doing everything to make sure we hold you to account for that.”

While the Prime Minister has promised to hold TikTok and other social media platforms to account, Mr Blow thinks the government will have a difficult time doing so.

“The second anyone tries to regulate them they go on the offensive, and they’re incredibly powerful, they would naturally have some of the most expensive and powerful lawyers, they would have some of the most expensive and experienced PR people, so when you go up against Google and Facebook … whenever you’re up against someone that well resourced it’s a big fight.”

Mr Blow said he feared we are “still a very long way from any significant changes in how our children are exposed” to content they shouldn’t be.

He said while the laws stay the same, similar incidents will “for sure” happen again.

“The big companies will resist any attempt to regulate that space, they’ll keep referring to policies, and procedures, community guidelines, the ability to contact people — to be honest it’s all relatively ineffective when you’re actually dealing with a crisis like this.”

He said Facebook’s policy changes, such as a one-strike policy banning people from livestreaming if they violate the community guidelines, were like sticking a “small plaster over a gaping wound”, adding “they’re never going to have much impact”.

“All you can do is stop people livestreaming full stop.”

Cybersecurity expert and child safety advocate Susan McLean said on Tuesday that the people who continue to upload the abhorrent material are “99 per cent of the problem”, and the law seems to agree.

In the eyes of a law that treats social media sites as platforms rather than publishers, “it’s down to the individual user as to what they watch and it’s down to the individual author as to what they put online”, Mr Blow said.

“It’s a free-for-all to a degree, and from a criminal point of view, if a piece of content is just offensive, the criminal law doesn’t really step into that space.”

Even when a piece of content is reported to Facebook, it takes time to get deleted.

“The reports Facebook receives are largely dealt with by effectively call centre type operations, where Facebook subcontracted staff have to deal with a ridiculously high number per hour of reports they have to address,” Mr Blow said.

“The idea that there’s genuine scrutiny of even the stuff that’s reported — forget the stuff that isn’t — the level of scrutiny is not good. It’s far too quick, because there’s so much of it. Facebook is literally dealing with millions upon millions of reports, and it tries to do it as quickly and cheaply as possible.”

Facebook contracts around 15,000 content moderators — roughly one for every 166,000 of its more than 2.5 billion users.

Last year the company made a profit of $25.4 billion.
