The notion of platform as publisher deserves consideration when determining who is responsible for the digital undermining of everything from personal safety and wellbeing to the pillars of democratic society.
By Jackie Pearson
It took Deputy Prime Minister Barnaby Joyce complaining about his daughter being abused on social media for the Prime Minister to raise the idea that the enormous platform operators – Facebook, Google and others – should be treated as publishers and made responsible for the content they distribute.
Minister for Communications Paul Fletcher hasn’t wholeheartedly committed to the Prime Minister’s thought bubble, which means it will probably disappear into the ether, but it is worth considering.
The recent High Court decision in the Dylan Voller defamation case took case law in the opposite direction by ruling that the operators of social media pages or feeds are responsible for what is published on them, even if it is a comment by an unrelated third party. If I go onto a News Ltd or Nine social media channel and make a defamatory comment about another individual, for instance, then according to the Voller judgement the news company could find itself sued for defamation – not Facebook, Twitter or Instagram. The aggrieved individual may also pursue the person making the comment, but the platform operator/owner is treated like the printer or the paper supplier in analogue news publishing terms.
Quite a few pages, groups, and conventional news organisations have stopped allowing comments on their social posts since that ruling – mainly because the cost of moderating comments on those posts would be too great an impost on their business model.
It has been argued that the ruling has consequently shut down public debate and undermined democracy.
Arguably, by not making the platform operators themselves – those multi-trillion-dollar global generators of wealth – responsible for the content they publish, we are permitting those who own and operate the businesses, and who control the algorithms that highlight certain content and bury other content, to keep all the power and wealth with none of the responsibility.
Australia’s solution, thus far, in addition to resorting to case law, has been self-regulation.
On February 22, 2021, DIGI launched a new code of practice that “commits a diverse set of technology companies to reducing the risk of online misinformation causing harm to Australians”.
The Australian Code of Practice on Disinformation and Misinformation has been adopted by Adobe, Apple, Facebook, Google, Microsoft, Redbubble, TikTok and Twitter.
“All signatories commit to safeguards to protect Australians against harm from online disinformation and misinformation, and to adopting a range of scalable measures that reduce its spread and visibility.
“Participating companies also commit to releasing an annual transparency report about their efforts under the code, which will help improve understanding of online misinformation and disinformation in Australia over time.”
The first set of transparency reports was published on May 22, 2021 and is publicly available.
Commenting on the new governance arrangements for DIGI’s Australian disinformation code of practice, including independent oversight and a public complaints mechanism, Reset Australia’s Director of Tech Policy, Dhakshayini Sooriyakumaran, said: “DIGI’s proposed new governance arrangements are laughable given the problem they seek to address: Big Tech’s fundamental threat to democracy.
“The DIGI code is voluntary and opt-in, with no enforcement and no penalties. Clearly, self-regulation does not work.
“As Facebook whistleblower Frances Haugen said last week: ‘until incentives change at Facebook, we should not expect Facebook to change’. The incentives have not changed. DIGI has pulled together some great minds for their proposed board, but their ability to effect meaningful reform will not be realised without proper regulation.
“DIGI’s code is not much more than a PR stunt given the negative PR surrounding Facebook in recent weeks. If DIGI are serious about cracking down on the serious harms posed by misinformation and polarisation then it should join Reset Australia and other civil society groups globally in calling for proper regulation. We need answers to questions like, how do Facebook’s algorithms rank content? Why are Facebook’s AI based content moderation systems so ineffective? The proposed reforms to the code do not provide this.”
DIGI developed this code with assistance from the University of Technology Sydney’s Centre for Media Transition and First Draft, a global organisation that specialises in helping societies overcome false and misleading information. The final code has been informed by a robust public consultation process.
The Code was developed in response to the Australian Government policy announced in December 2019, in which the digital industry was asked to develop a voluntary code of practice on disinformation, drawing on lessons from a similar code in the European Union.
The digital empires created by the likes of Mark Zuckerberg are now being referred to, by very credible commentators, as rogue states. They have surpassed the levels of power, persuasion and implied menace once attributed to Big Tobacco and Big Pharma, and are now compared to regimes such as North Korea.
Perhaps what is needed here is legislation and regulation.