In his videos, the creator of the clips calls Tame a liar and seeks to minimise the seriousness and impact of her sexual assault. He recently broadcast an interview he conducted with Tame’s abuser, as well as a video suggesting serial alleged abuser Bill Cosby was a victim of an “epidemic” of false allegations.
Most trolls operate with impunity due to the practical difficulty of holding them to account, even when a tech company is willing. Occasionally, a court will intervene.
Justice Lee’s order requires Google to provide the subscriber information relating to the YouTube account to enable the judge to identify the account holder and assess whether he may be charged with contempt of court.
What the judge may not have known is that others have been trying to identify the anonymous YouTuber for some time.
Among his hundreds of uploads, “Powers” left a trail of crumbs pointing to his real identity. On one occasion, he accidentally left open internet tabs revealing his partner’s name, job and business. In one video, the troll published a screenshot that included a tiny image of an email address containing his full name. Other posts offered more clues to his identity, with snippets that revealed his interests and the interests and employment of his family members.
When this masthead texted a mobile number linked to his partner’s account, it was a Melbourne man named Glenn who responded. Asked if he had recently posted videos of the livestream of the Lehrmann trial, Glenn repeatedly responded: “I can’t recall.”
But when told by a reporter that he would be identified as the YouTuber, Glenn said: “the reason I am making the videos” was to highlight what he claimed was the mainstream media’s failure to report on female perpetrators of violence.
Although this masthead has confirmed Glenn’s identity, we have chosen to withhold his surname to protect others with the same, relatively common, name from false accusations. For the same reason, we have not revealed the identities of his family members, as doing so could distinguish him from other Glenns with the same surname.
Many months before Lee’s judicial intervention, online researchers from the White Rose Society, an organisation usually devoted to exposing white supremacists, had started monitoring his posts. Glenn is not a neo-Nazi.
The society provided this masthead with a file of evidence it had collected.
If Glenn were to face legal consequences for rebroadcasting the Lehrmann trial, this would probably reignite debate over the ability of online trolls to anonymously abuse their victims, and pose fresh questions about the role the tech giants play in facilitating that abuse.
High-profile cases over the past year have highlighted instances of US-based tech giants being reluctant to remove abusive content until they have been forced to do so by the courts.
Last year the Federal Court ordered Google to pay former NSW deputy premier John Barilaro more than $700,000 for “relentless, racist, vilificatory, abusive and defamatory” videos published by YouTuber Jordan Shanks on his channel Friendlyjordies.
The court found that Google was a publisher ultimately responsible for the defamatory videos. Last year the Federal Court also ordered Twitter, now known as X, to reveal the name and email address connected to high-profile anonymous user PRGuy17 after a defamation claim from far-right activist Avi Yemini. A man named Jeremy Maluta later claimed to be behind the account.
University of Sydney professor David Rolph said the court system was often seen as a last resort for dealing with trolls and online harassment, given the global nature of the tech giants. X, for example, no longer has any Australian presence.
“The eSafety commissioner and other regulatory approaches becomes really important, because the nature of litigation means it’s often not the most effective mechanism for having something taken down,” he said.
Rolph said victims of online abuse also often did not have the resources to take a defamation case to court.
The office of the eSafety commissioner was established in 2015 and is Australia’s watchdog in charge of online safety. It doesn’t comment on specific cases but says that in instances of online harm it typically contacts the tech platform to remove the content, and can then issue a fine or take legal action if the platform fails to comply.
The office weighs the impact of its actions on free speech, requiring a high bar to be met before it uses its powers. It is also reactive, responding only to complaints.
“Platforms need to enforce their own policies and be more vigilant and proactive to identify and remove harmful content quickly,” acting eSafety Commissioner Toby Dagg said.
The federal government is currently grappling with how best to force the tech giants to police their own platforms. Communications Minister Michelle Rowland last month announced a review of the Online Safety Act, along with expanded rules forcing the tech giants to better protect against hate speech and publish regular reports on how they are helping keep users safe.
Most states and territories, including Victoria and NSW, have passed or will pass laws to prevent tech companies using the defence of innocent dissemination unless they set up complaints processes and take timely and reasonable steps to delete or restrict access to defamatory material.