Internet Watch Foundation analysts recently noted a number of very young children, aged between three and six years, being sexually abused. Companies have proven tools available to stop this horrific content being recirculated but publish little information about where or how these tools operate.
The grooming of children on popular online platforms also highlights the need for industry to put in place age assurance mechanisms, appropriate restrictions on adults contacting children, and robust policies and interventions to prevent banned users from creating new accounts.
The “on demand” streaming of child sexual abuse, and the implementation of end-to-end encryption offered by some online services, make identifying child sexual exploitation increasingly difficult for companies.
The limited information published by many in the industry also hampers our understanding of the true scale of these harms, exacerbated by a lack of transparency around the steps they are taking – and their effectiveness – across their services.
We are sometimes told that certain measures are not technically feasible, or infringe on privacy, while at the same time other services are already deploying the very technology that is deemed impossible.
More than two years ago, some of the largest companies committed to improve transparency through an industry group, the Technology Coalition. What we have seen since then has amounted to anonymised and aggregated information and more high-level commitments to improve in the future.
Industry must be specific and upfront about the steps it is taking now so that we can collectively focus on the real challenges.
Under the Online Safety Act 2021, eSafety has a range of world-leading powers to improve transparency and accountability of online services and encourage proactive and systemic change.
The act has established the Basic Online Safety Expectations, articulating the foundational steps that companies should take to protect users. The act also provides the eSafety Commissioner with powers to require digital services to detail how they are implementing the expectations, with financial penalties for companies that do not respond.
Today I am using those powers for the first time – now that a six-month grace period has elapsed – issuing legal orders to some of the biggest digital service providers so we can better understand the steps that companies are taking to protect young Australians from child sexual exploitation and abuse.
The insights gained through responses to these notices will help to identify best practice, shine a light on any gaps and vulnerabilities, and act as an incentive to companies to prioritise the safety of the people who use their platforms.
eSafety will use these powers regularly to progressively gain unique insights into a range of harms and start to track issues – and, I hope, progress – over time.
But industry does not need to wait for us to mandate transparency: I challenge tech companies to step up to the plate now and share more about the steps they are taking to protect their users from online harms. It is actions now, not words or future promises, that are needed.