Tech companies including Meta, Apple and Microsoft must disclose how they police online child sexual exploitation material within the next 28 days or face potentially hefty fines, according to demands from Australia’s eSafety Commissioner published on Monday.
The requirements are part of updated rules that came into force last year, and give the country’s online content regulator greater powers to coerce social media companies into publishing what steps they are taking to keep people, and most notably children, safe online.
“Every company should have a zero tolerance policy around having their platforms weaponized in that way, for either the proliferation, the hosting, or the live streaming of this material,” Julie Inman Grant, Australia’s eSafety Commissioner, told POLITICO about why her agency was asking for more details about how these firms policed such content. “If they’re not doing enough to proactively detect, prevent and remove this (content), then what else are they letting happen on their platforms?”
As part of the legal notices served to Meta, Microsoft, Apple, Snap and Omegle — a niche anonymous online chat service — the companies must provide detailed answers on how they are finding and removing child sexual exploitation material, as well as what steps they are taking to keep children safe online, within the next 28 days. If they fail to comply, the companies face daily penalties of up to 550,000 Australian dollars, or about €383,000.
Almost all of these companies publish granular information on these processes in regular transparency reports. But Inman Grant said these documents often did not stop Australians from falling victim to organized gangs, which are often spread across countries like the Philippines or Nigeria. The existing reports also did not give enough specifics on what steps the firms were taking to track the problem, or how many cases of online child sexual exploitation were happening on their platforms.
“We don’t really know the scale of child sexual exploitation material,” said Inman Grant, a former Microsoft executive. “Part of the problem is no one’s held their feet to the fire or had any tools to be able to say, ‘do you have any actual knowledge of how your platforms are being weaponized?'”
Representatives for Apple, Microsoft, Snap and Omegle did not immediately respond to requests for comment. Meta confirmed that it had received the legal notice.
Australia’s efforts form part of a wider push across the West to force companies to take greater responsibility for how their platforms are being used to spread online child sexual exploitation material. Countries including those of the European Union, Canada and the United Kingdom are all seeking to pass new rules aimed at pushing these firms to do more, including potentially scanning the encrypted messages of their users for such illegal content.
These plans have pitted child advocacy groups, who want companies to clamp down on such abuse, against privacy campaigners, who urge firms not to weaken so-called end-to-end encryption, a technology that makes it impossible for the platforms to read messages sent between individuals.
Inman Grant, the Australian regulator, said she was not in favor of watering down encryption. But she added that these companies already scan encrypted messages for harmful code and malware, and so should take further steps to protect children from being exploited online.
“I do see it as the responsibility of the platforms that are using this technology to also develop some of the tools that can help uncover illegal activity when it’s happening while preserving privacy and safety,” she added.