Sydney teenager Tilda, 15, uses Instagram, TikTok and Snapchat. Credit: Louise Kennerley

Tilda was officially too young when she joined Instagram - the terms and conditions state users must be at least 13 - but this is common. As we now know, thanks to damning testimony from a whistleblower in the US, former Facebook employee Frances Haugen, Instagram is not safe for young people aged 13 and older anyway.

Haugen took a trove of documents - thousands of pages of internal research, briefing notes, presentations and memos, legal advice and messages posted on the Facebook Workplace forums - when she left the company in May.

She shared the documents with The Wall Street Journal (which has been publishing its series The Facebook Files since mid-September), the US Securities and Exchange Commission, and the US Congress, where Haugen testified this week.

The trove confirms what many of us always suspected - that Facebook and its family of products are damaging to society and individuals. It also appears that the company knew how to fix many of the problems, but instead tried to cover them up.

“The choices being made inside of Facebook are disastrous for our children, for our public safety, for our privacy and for our democracy,” Haugen said in her testimony to the US Senate Commerce Committee this week.

“Left alone, Facebook will continue to make choices that go against the common good. Our common good. When we realised big tobacco was hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action.”

As a result of Haugen’s testimony, Facebook has come under bipartisan pressure from Democrats and Republicans and has been forced to shelve plans for an Instagram Kids app for the preteen market - at least for now.

Former Facebook employee and whistleblower Frances Haugen provided damning testimony about the social media giant. Credit: AP

Even Facebook itself appears to be no longer standing in the way of regulation.

“It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act,” a Facebook spokesperson says.

In Australia the story has reignited a debate about how to regulate content on the internet, and not just in terms of cyber-safety. In an extraordinary intervention, both Prime Minister Scott Morrison and Deputy Prime Minister Barnaby Joyce this week flagged the possibility of making the platforms legally responsible for defamatory comments.

Morrison accused the tech giants of allowing their platforms to become a “coward’s palace” for anonymous trolls who “destroy people’s lives and say the most foul and offensive things to people, and do so with impunity”.

Haugen’s Facebook dossier is wide-ranging. It reveals details about a program called “cross check” or “XCheck” that whitelists high-profile accounts so that the company’s normal enforcement measures against harassment and incitement to violence don’t apply. It suggests the company was consistently willing to accept 10-20 per cent more misinformation if it meant 1 per cent more engagement. It describes how Facebook, in December, prematurely removed controls that had been put in place before the November 2020 US presidential election in order to reprioritise engagement, just a few weeks before the US Capitol riot. It exposes the weakness of the company’s response to criminal activity in developing countries, from drug cartels in Mexico to human traffickers in the Middle East.
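
None of Facebook’s actual code appears in the leaked files, but the whitelisting mechanism described above can be sketched in a few lines. The Python below is purely illustrative - the account names, the function and the outcomes are invented - and is meant only to show how a “cross check”-style list can bypass ordinary enforcement.

```python
# Hypothetical illustration only - not Facebook's code. A toy sketch of how a
# "cross check"-style whitelist could short-circuit normal content enforcement.
XCHECK_WHITELIST = {"high_profile_account"}  # invented example of a shielded account

def moderate(author: str, violates_policy: bool) -> str:
    """Return the moderation outcome for a post."""
    if author in XCHECK_WHITELIST:
        # Whitelisted accounts skip automatic enforcement and are routed
        # to a separate, slower review queue instead.
        return "defer to manual review"
    return "remove post" if violates_policy else "allow"

print(moderate("high_profile_account", violates_policy=True))  # defer to manual review
print(moderate("ordinary_user", violates_policy=True))         # remove post
```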

One of the most striking disclosures was what Facebook knew about the harmful effects of its photo-sharing app Instagram on many users, especially teenage girls, who account for a large chunk of the audience.

Australia’s Assistant Minister for Mental Health David Coleman, a former chairman of NineMSN, says the whistleblower files demonstrate the social media giants “can’t be trusted to act in the best interests of children”. He is scathing of Facebook and Instagram’s “abysmal” efforts to enforce their own age-limit restrictions.

“There are undoubtedly millions of children who are on social media platforms at an age where it is unsafe for them to be there,” he says.

“What is the role of society and government if not to protect kids, and we know that we can’t trust the social media platforms to do that.”

Australia has led the way on cyber-safety, establishing the world’s first eSafety commissioner in 2015 and passing the Online Safety Act 2021, which technology companies must comply with by mid-2022. Australia was among only a handful of countries to force technology platforms to pay news publishers for content, and debate is now turning to defamation.

There are clear signs of a growing appetite within some sections of the Australian government to crack down further on the social media giants. But with just four already-packed parliamentary sitting weeks left this year, momentum for reform could be stymied by the looming federal election and campaign season.

Facebook says the company removed more than 600,000 underage accounts on Instagram over the past three months, and has thousands of staff as well as AI technology dedicated to removing accounts belonging to underage users.

Many people can use Instagram without being harmed, or the problems fade with time, as they did in Tilda’s case. For others, the app’s relentless focus on social competition, and the algorithms that can lead users from healthy recipes to pro-anorexia content at warp speed, can contribute to the development of eating disorders or self-harm.
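
Recommendation systems like Instagram’s are proprietary, but the drift the paragraph above describes can be illustrated with a deliberately crude sketch. The Python below is hypothetical - the posts and the “predicted engagement” scores are invented - and shows only that ranking on engagement alone, with no safety penalty, surfaces the most extreme content first.

```python
# Hypothetical illustration only - a toy ranking loop, not Instagram's algorithm.
# Items and engagement scores are invented to show how optimising purely for
# predicted engagement can push increasingly extreme content to the top of a feed.
candidate_posts = [
    {"topic": "healthy recipes",      "predicted_engagement": 0.35},
    {"topic": "extreme dieting tips", "predicted_engagement": 0.60},
    {"topic": "pro-anorexia content", "predicted_engagement": 0.80},
]

def rank_feed(posts):
    # Sorting on predicted engagement alone, with no safety penalty,
    # ranks the most harmful but most engaging items first.
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in rank_feed(candidate_posts):
    print(post["topic"], post["predicted_engagement"])
```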

Haugen’s documents show that internal Facebook research found more than 40 per cent of teenage Instagram users who reported feeling “unattractive” said the feeling began on the app; one in five teens said Instagram made them feel worse about themselves; and many teens reported the app undermined their confidence in their friendships. Teens regularly said they wanted to spend less time on Instagram but lacked the self-control to do so.

Facebook researchers concluded some problems around social comparison were specific to Instagram, not social media more generally. Some Facebook executives resisted an internal push for change, saying the social competition was the “fun part” of Instagram for users, and in public the company cited external research that downplayed the correlation between social media usage and mental health harms.

In a public post, Facebook founder Mark Zuckerberg said it was false that Facebook prioritised profit over safety. He said the Instagram research had been mischaracterised because it also showed many teenage girls who struggled with loneliness, anxiety, sadness and eating issues said Instagram made these problems better, not worse.

This week Morrison and Joyce seized on the debate about online abuse proliferating on social media as a stick to threaten a further crackdown through defamation law reform.

In a deliberate choice of words, Morrison said that platforms that refused to unmask trolls were “not a platform any more, they’re a publisher”. Joyce, whose daughter has been the subject of scurrilous gossip by anonymous commenters, declared that platforms “must be held liable”, saying if “they enable the vice, they pay the price”.

Their comments follow a High Court decision last month that found media outlets were legally responsible as “publishers” for third parties’ comments on their Facebook pages, even if they were not aware of the comments. The bombshell ruling also has implications for other administrators of Facebook pages, including MPs and regular citizens.

Facebook founder and CEO Mark Zuckerberg said it was false that Facebook prioritised profit over safety. Credit: AP

Associate professor Jason Bosland, director of the Centre for Media and Communications Law at Melbourne Law School, says making the social media giants liable for defamatory remarks circulating on their platforms as soon as they are published would be an “extreme” outcome, and would probably make Facebook, Twitter and other companies unable to operate due to the legal risk.

“You would have very few experts that are consulted that would suggest that Facebook should be liable for absolutely everything that’s published on their platform without notice,” Bosland says.

The nation’s attorneys-general, led by Mark Speakman in NSW, are considering the options for defamation law reform.

Australia’s eSafety commissioner Julie Inman Grant says the whistleblower revelations, while not surprising, could galvanise action in the US, and that in turn would bolster Australia’s efforts. She, too, likens it to the efforts to regulate car safety and mandate seatbelts from the 1960s to the 1980s.

“This is the tech industry’s seatbelt moment,” she says. “For too long, they have not had any brakes put on them whatsoever and the primary reason is because they served as a driver of innovation and inspiration and growth and development and no government wants to put the brakes on that.”

Australia’s efforts include giving the eSafety commissioner statutory powers to order the removal of content, working with the industry to promote the concept of “safety by design”, and the Online Safety Act 2021. The Act takes a co-regulatory approach: eSafety has produced a white paper outlining the expected outcomes, and technology companies or industry bodies (for sectors such as social media platforms, the internet of things, or gaming providers) have until June 2022 to register codes showing how they plan to comply. Those codes need to be approved by eSafety and will be registered under the Act, giving them regulatory force.

Inman Grant says Australia previously regulated for a specific set of harms, such as cyber-bullying, image-based abuse and illegal online content such as child sexual abuse or terrorist content. The new Online Safety Act is about basic online safety expectations, or a social licence to operate.

However, the US is a market 12 times larger than Australia and the home jurisdiction for Facebook and most other technology platforms, so regulation in the US would be of far greater import.

Communications Minister Paul Fletcher and Inman Grant this week jointly wrote to the US Senate Committee for Commerce, sharing details of Australia’s regulatory approach and offering to have Inman Grant appear at the hearings.

Australia’s eSafety commissioner Julie Inman Grant says the whistleblower revelations, while not surprising, could galvanise action in the US and that in turn would bolster Australia’s efforts. Credit: Alex Ellinghausen

Inman Grant believes international standards are inevitable for technology, just as they have now been embraced by the car industry.

Conflicts between profits and safety

During her two years at Facebook, Frances Haugen says she saw the company “repeatedly encounter conflicts between its own profits and our safety [and] consistently resolve these conflicts in favour of its own profits”.

Haugen, who had previously worked at Google, Pinterest and Yelp, grew so concerned at what she saw at Facebook that she resigned and decided to compile evidence before she left.

She says the solution lies not just in regulation but in a demand for full transparency about Facebook’s data and algorithms. She says at other large tech companies, such as Google, independent researchers can download and analyse company search results from the internet, but Facebook “hides behind walls that keeps researchers and regulators from understanding the true dynamics of their system”.

However, Inman Grant says it would be very difficult to regulate algorithms because they are not static: regulators would also need information about how the algorithms adapt and change through machine learning, and agencies like eSafety would have to employ a team of data scientists and data engineers.

There is also a question over whether private companies should be compelled to share proprietary information - algorithms being like the “secret sauce” that helps their products compete in the marketplace.

In response to Haugen’s testimony, a Facebook spokesperson says: “A Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question. We don’t agree with her characterisation of the many issues she testified about.”

Inman Grant, who has worked at Microsoft, Twitter and as a lobbyist to the US Congress, describes this response as “a classic obfuscation technique”.

“I’ve seen those talking points written before. I don’t think they hold much water.”
