Late on January 6, when the riot was mostly over, Trump released a video telling his supporters to go home even as he told them 'we love you'. It was removed from many platforms, but stayed up on Parler, pictured here on a Team Trump campaign account. Trump himself does not have a personal account on Parler. Credit: Parler

What is Parler and what did it have to do with the US riot?

In the months leading up to the storming of the Capitol, researchers of online extremism, such as Griffith University's Audrey Courty, were spending more and more time on Parler and Gab. The big platforms, Facebook and Twitter, long criticised for being too passive in policing dangerous content, had become more aggressive about fact-checking and shutting down radical groups. That meant some users were migrating to these smaller sites, which look similar enough but bill themselves as "free speech havens" with very little moderation. By early January, Parler was Apple's most downloaded free app. But Courty says it wasn't just getting more popular, it was becoming more extreme.

"Suddenly there were posts calling for the death of politicians and journalists,you didn't see that a couple of months ago,"she says."They were taking things to the next level,it was very concerning."

Parler is like a clunkier hybrid of Twitter and Instagram, only with fewer photos of brunch and more conspiracy theories. Instead of tweets, you can post "parleys" of up to 1000 characters each and share other users' posts in "echoes" rather than retweets. With a much smaller user base of almost exclusively conservative voices, Courty says it feels more like an echo chamber than the "bipartisan town square" it claims to be – almost like an online Trump rally.

On signing up, she was immediately prompted to follow high-profile, right-wing personalities such as Fox News host Sean Hannity. "But that might be because they are the main people on Parler. There's a lot of virtual fist-bumping going on, but not a whole lot of conversation. It's not a serious replacement to Twitter or Facebook."

The site does have some rules – nothing illegal, no spam or porn and, very specifically, no images of fecal matter. Content reported as problematic may go before a "jury" of randomly selected volunteer users, who decide whether it breaks the community's standards. And buried in its user terms and conditions is the caveat that Parler still reserves the right to "remove any content or terminate your access … at any time and for any reason or no reason".

"Free speech doesn't mean no moderation,not even it seems to Parler,"Courty says.

Likewise, while Parler promises not to sell user data to advertisers, in the fine print of its privacy policy she notes that it still admits to harvesting it ("but they don't say why"). In the days since the riot, activists have been scraping this data, like "a bunch of people running into a burning building trying to grab as many things as we can", and the FBI are also combing through related images, videos and posts.

In June, Parler chief executive John Matze said he wanted more diverse debate on the platform, even offering a "progressive bounty" of $US20,000 ($25,750) to a prominent left-wing influencer who signed up, but by the time Amazon unplugged Parler on January 9, the reward still appeared unclaimed. Matze has sworn the site will return soon.

The Wall Street Journal recently revealed Parler is bankrolled by some deep-pocketed conservatives, most notably Rebekah Mercer, a former Trump adviser and daughter of the billionaire investor behind the now disgraced political consulting company Cambridge Analytica.

Courty says it's unclear whether the site's executives really want to further the cause of the right or if they are simply capitalising on a frustrated demographic to grow users fast. However, she says the strategy will likely harm Parler in the long term, keeping its user base (and so its influence) narrow and its advertiser and investor pool shallow. "It's tainted now."

“Parler is just the tip of the iceberg.”

Dr Olga Boichak

To survive, Parler needs to find a new host or invest in its own servers and infrastructure – no small feat, says Dr Olga Boichak, a digital cultures expert at the University of Sydney. Gab, itself a notorious hotbed for white supremacy and conspiracy theories, managed to find a new home with controversial hosting company Epik within a week of being unplugged in 2018, following revelations that a shooter who murdered 11 people at a US synagogue had first posted his plans to Gab. The notorious messageboard 8chan, which helped radicalise the Christchurch shooter with alt-right ideology, was reborn as 8kun. Parler has already moved its domain name registration to Epik. In a statement, the host company said it had not yet discussed the shift with Parler, even as it hit out against big tech "deplatforming" and ostracising the right.

With Parler down, Gab now claims to be clocking up its own surge in new users. But when the big platforms close their doors to such groups, they inevitably limit their reach, pushing them further to the fringes of the internet. Boichak says we shouldn't get too sidetracked by any one platform. Groups will migrate where they need to. "Parler is just the tip of the iceberg."

Members of alt-right groups such as the Proud Boys, as well as followers of the QAnon conspiracy cult, descended on the US Capitol on January 6. Credit: AP

Why was Trump banned from social media?

On January 8, after dozens of warnings (and more than a few international incidents), Twitter finally banned one of its most famous and prolific tweeters – for good. Trump had just tweeted that his followers were "great American patriots" who "will not be disrespected, or treated unfairly in any way, shape or form" and then announced that he would not attend Biden's inauguration.

Twitter said this could be a signal to Trump's base that he was disavowing the (begrudging) election concession and promise of an "orderly transition" he had made a day earlier – it could even be read by those already threatening more violence as a sign that the inauguration was now a "safe target" for a second attack.

Previously, Twitter had erred on the side of keeping Trump's tweets up, as matters of public interest given their use as official statements by the leader of the free world. His removal is now sure to cause headaches for those in charge of the administration's record-keeping.

Reflecting on the ban in a tweetstorm of his own on January 14, Twitter chief executive Jack Dorsey said he took no pride in the decision, seeing it ultimately as a "failure" to create a service that could sustain civil discourse and healthy conversations. He stood by the move as the right one for public safety but worried it now set a dangerous precedent.

Indeed, Facebook and other big social networks soon followed Twitter with bans and suspensions of their own against the President, though Dorsey rejected suggestions by Matze and others that the bans were coordinated. Some, such as Snapchat's, are permanent but most are temporary. In a series of fiery (and since removed) tweets sent from his little-used official POTUS account, Trump lashed the ban and said he is now considering creating his own platform.

"Trump and co could even team up with Parler to develop something new,"Courty says."Trump,it's a brand right? It's like the Kardashians. If he comes out with his own,or joins with Parler that would be something to watch."

So could he sue over the ban? At the Australian National University's College of Law, Dr Jelena Gligorijević says no – while the US has explicit free speech protections under its First Amendment, these are designed to stop censorship from governments, not private companies. If you signed the terms and conditions that say Twitter can kick you off, then there's no legal remedy, including in Australia, where the right to free expression is implied, rather than explicitly codified, in law.

'Social media doesn’t radicalise in a vacuum. There’s always other factors at play.'

Audrey Courty

But in Italy, it's a different story, Gligorijević says. Facebook was forced by the courts in 2019 to unblock a fascist political party spouting the beliefs of former dictator Mussolini and restore its content, even though the company had deemed them to be dangerous. "So, if Facebook were to ban the Italian President … it would likely be actually breaking the law," she says.

In the case of a figure as influential as Trump, still in the White House and likely to remain in headlines even after he leaves it, the social media blackout won't actually silence him.

But Courty, who studied Islamic extremism before turning her attention to the far right, says when groups are deplatformed it can nonetheless have a big impact on their ability to recruit, radicalise and spread their ideas.

"Islamic State had a very good information campaign so[taking them off] did have an impact. Of course,social media doesn't radicalise in a vacuum. There's always other factors at play."

Meanwhile, a group of Republicans and even some Australian conservative politicians have been quick to voice their unease at the bans on Trump. Treasurer Josh Frydenberg said he was "uncomfortable" with the move and acting Prime Minister Michael McCormack called it censorship, questioning why the company had not acted on other concerning posts, such as the fabricated image of an Australian soldier appearing to kill an Afghan child posted by a Chinese diplomat in November.

But others in the party room and across the aisle said Twitter was right to ban Trump as this was a question of inciting further violence. Technology Minister Karen Andrews wants social media companies to adopt consistent and transparent rules to both protect Australians from "vile" hate speech and justify bans, as more than 50 MPs join a new parliamentary group dedicated to reining in the technology giants.

How is social media regulated?

Most people get their news through social media, but such platforms are not bound by the same rules as the press. Professor Ingrid Volkmer, who heads up the University of Melbourne's new International Digital Policy Lab, calls it "wildly unchecked".

There are some exceptions. Germany has forced companies to crack down on antisemitic speech; Europe has rolled out tough privacy laws giving users more rights over their data; and Australia, along with New Zealand and France, has moved to penalise platforms hosting graphic criminal content in the wake of the Christchurch massacre livestreamed on Facebook. But mostly the tech giants take down and leave up content on their own terms.

In the US, one law passed in 1996, known as Section 230 (sometimes nicknamed "the 26 words that created the internet"), explicitly absolves digital platforms of liability for the content posted by their users so long as they are not aware of any crime being committed. Australia does not have an equivalent law, but, while our notably restrictive defamation laws hold the press liable for the comments made on stories they publish to social media, the companies that actually control the platforms are not.

'It’s a bit of a Wild West at the moment. I don’t think we should just be leaving it up to Silicon Valley to decide what’s acceptable and what’s not.'

Audrey Courty

Liberal MP Dave Sharma has called Twitter's ban on Trump the "right decision on the facts", but wants more transparency around how corporations make such calls. In fact, he said the thought of leaving free speech up to companies beholden only to shareholders sends a "chill" down his spine.

Experts and victims of online abuse agree that bans and removals seem inconsistent, more informed by politics and optics than any set threshold. Users can breach rules again and again without facing consequences while others can find themselves the target of "weaponised reporting" and booted off without any right of reply. Sometimes too much is removed, even evidence or historical records, Boichak says.

"It's a bit of a Wild West at the moment,"Courty adds."I don’t think we should just be leaving it up to Silicon Valley to decide what's acceptable and what's not. All it takes is someone to buy out Facebook who may have really dubious intent and we could really regret giving up that power."

Volkmer, who helped draft the first global framework on artificial intelligence for the OECD, agrees researchers – and the general public – need to be involved in developing new guidelines. But she says the solution required is now beyond any one country. "These platforms don't have borders. It should be an international regulatory effort [by] democracies."

Boichak says the big sites – Facebook and Twitter – have felt the wind shift. Having long argued they are neutral spaces of civil discourse, she says, "now we're seeing them try to self-regulate. That way they can avoid [external scrutiny]. It's the same way Uber says it's not an employer, it's just [a platform] in the gig economy. These [companies] don't even call themselves media now, they say social technology."

Twitter, under Dorsey's direction, is funding work into developing an open-source social media standard, noting other sites such as Mastodon, often heralded by progressives as the "Twitter without Nazis", already use an open-source model. In 2019, the platform also introduced more measures such as labels, warnings and distribution restrictions to reduce the need to delete content entirely.

But many, including Australian Liberal senator Andrew Bragg, say that in blocking the US President the tech giants have now "crossed the Rubicon" from neutral town square to publishers in their own right.

Gligorijević adds: "If [they] continue to behave like publishers (or editors) in regulating content and imposing sweeping bans, there is a chance the law will then follow to hold them liable."

And can platforms truly still call themselves "neutral" spaces when they are engineered to grab attention, harvesting data to sell on to corporations and using algorithms to feed users content based on what they engage with, for better or worse? YouTube, in particular, has been shown to lead people down radical rabbit holes. "You type in vaccine and the third suggested video will be an anti-vax conspiracy," Boichak says.

Courty admits she mostly engages with the stuff on Facebook that "makes [her] mad" and that means she keeps seeing more of it in her feed. Social media sites may be having something of an identity crisis right now and there's no easy answer to that, she says, "but what we do know is that they're not just neutral platforms any more".

What would happen if social media was held accountable?

While these big companies might be headquartered in the US under the purview of Section 230, Gligorijević says they would still be expected to abide by the laws of the countries in which they operate. That means the future of your feed could come down to where your government falls on the balance between free speech and reducing harm.

"We’re not in that utopian cyberspace of the 90s where everyone is equal as long as they have an internet connection,"Boichak says. “ The[web] has centralised[into] platforms...And we've seen the bad that can come from them now too."

Right now, the UK is weighing up imposing a specific duty of care on platforms in what Gligorijević calls the most protectionist stance considered so far. But even in the US, long the bastion of free speech, there is growing appetite among both Republicans and Democrats to repeal Section 230 and end the free rein of social media companies.

No one is suggesting democracies follow countries such as China and Russia in creating their own state-controlled social platforms, but many experts stress that the Capitol riot has made the case for at least some external regulation. Courty says Europe's privacy protections have shown it can work without squashing innovation.

"This is the moment,even more than[Facebook's] Cambridge Analytica[scandal],"Volkmer says."This has to be the wake-up call."

Australia's Parliament is already considering giving powers to the eSafety Commissioner to order companies to delete particularly abusive posts. While Julie Inman Grant, the commissioner, has said that allowing hate speech to flourish unchecked online can itself stifle free speech, she has also cautioned that the platforms will still be the first port of call for complaints if the laws pass – her agency will not be policing political speech. "It's not like we're going to be out there issuing rapid-fire judgments … because all of these have to stand up in the court of law," she said.

'It wasn’t just violence,they can destabilise democracies.'

Ingrid Volkmer

Meanwhile, in reaction to the Trump ban, Liberal MPs in Australia want social media companies bound by a code of conduct when they do take action to silence speech or shut down accounts.

Justifying such decisions can be a legal minefield for companies, Courty says. How do you show a clear link between a comment online and violence in the street?

Volkmer says the Capitol riot was the link and it unfolded right before our eyes. "And it wasn't just violence, they can destabilise democracies."

If platforms do face more accountability for the content on their sites, they will need to invest in more resources and more reliable technology to ensure harmful posts and accounts can be removed faster than they are now. No one wants to wait two days for their tweet to be approved by moderators.

So far, disinformation and fake news have proven at least as difficult to contain as the pandemic. Boichak describes the moderators hired to monitor social media as the cleaners of the internet. They face some of the worst content day in, day out and yet work largely in the shadows, sometimes under poor conditions. And still, due to the sheer scale of content, the industry must rely largely on algorithms trawling for key words and patterns.

Sometimes they get it wrong, as in the case of rare French masterpieces flagged as pornography. ("That almost started another revolution," laughs Volkmer.) Sometimes extremists and conspiracy theorists such as the fringe cult QAnon simply learn to outsmart the system, avoiding phrases that will draw unwanted attention.

"There's a reason it's called a blackbox society,"Boichak says."We still don't really know what's happening out there."

With Nick Bonyhady and David Crowe

Let us explain

If you'd like some expert background on an issue or a news event, drop us a line at explainers@smh.com.au or explainers@theage.com.au. Read more explainers here.
