Australian leaders react to Meta dropping fact-checking: ‘removing the muzzle will change the algorithm’


eSafety Commissioner Julie Inman Grant, Bench Media’s Anthony Fargeot, and Seven’s Sarah Greenhalgh shared their thoughts and opinions on Zuckerberg’s decision.

Mark Zuckerberg announced that tech giant Meta would “dramatically reduce the amount of censorship” across its family of apps by removing its fact-checking program and lifting restrictions on divisive mainstream topics, replacing it with a “community notes” feature.

Mediaweek gathered thoughts and opinions from across the industry about what this change means for people online, advertisers, marketers and brands.

‘Automated systems are not a substitute for real people who can evaluate content contextually’

Julie Inman Grant, eSafety Commissioner, said that in light of the changes to its community standards, as an entity operating within Australia, Meta must comply with Australian law, including the Online Safety Act.

“Under the Act, eSafety is empowered as the national online safety educator and coordinator to protect Australians from online harms, including child sexual abuse, terrorism, cyberbullying, image-based abuse and serious adult cyber abuse directed against individuals.

“We encourage Australians to report such material to the platform where it appears in the first instance. If the company does not respond, is not effectively enforcing its own policies or otherwise failing to tackle material that may constitute online abuse or harassment under the Online Safety Act, report to us at eSafety.gov.au.”

Inman Grant said that eSafety will continue to ensure compliance, noting Meta’s comments in relation to its ongoing focus on this area.


“eSafety will also continue to use its transparency powers to require or request answers from tech companies about what they are and are not doing to tackle a range of online harms and whether they are living up to the Government’s Basic Online Safety Expectations.”

She also noted that eSafety is aware of the tech giant’s plan to focus its automated systems on removing high-severity violations of its terms and illegal content.

“While we welcome any move by tech companies to make their platforms safer and encourage investment in innovative solutions that reduce risk, it is important to note that automated systems are not a substitute for real people who can evaluate content contextually,” she said.

Inman Grant added: “We would also welcome further information from Meta about the long-term efficacy of this automated approach. eSafety is driven by its core corporate values and takes principled, balanced and fair regulatory actions to protect Australians.

“We will continue holding all technology companies to account for online harms and safety inadequacies. For all the change that is happening in the world, our commitment to these values remains consistent.”

‘It then comes down to who is the loudest over who is correct, which is far from ideal’

Anthony Fargeot, Bench Media’s VP of growth, called Meta’s abandonment of its fact-checker program both a win for free speech and a potential risk for spreading misinformation.

“When you let a community moderate itself, you incur the risk of misinformation roaming more freely on a platform with a massive global reach. However, giving independent fact-checkers a lot of authority to decide what is accurate information and what is not is a huge hit for free speech.

“Ultimately, it then comes down to who is the loudest over who is correct, which is far from ideal.”

On whether there is a “community-driven” approach to fact-checking that is fair and unbiased, Fargeot said that it depends on the definition of “community” and if there is enough diversity present within the community for there to be a healthy mix of “views and scrutiny from different viewpoints.”

But he noted that such a mix doesn’t shelter one from bias and lack of expertise. “Ideally, you’d want a diverse community input mixed with expert oversight to achieve something close to fair and unbiased fact-checking,” he said.


When it comes to who is responsible for ensuring the accuracy of information online, Fargeot said everyone – from platforms and independent fact-checkers to users themselves – has a share of the responsibility.

“However, it must be acknowledged that it is a huge job to moderate social media content that is fully user-generated by billions of people daily.

“Independent fact-checkers can be helpful in that undertaking while adding a layer of credibility, but they also need some checks and balances to ensure they are not biased themselves.”

In terms of impact, Fargeot highlighted two potential outcomes of this shift. The pessimistic one is a “polarised public discourse with misinformation dominating sensitive topics, trust dwindling and public division increasing.”

The other, a more optimistic outcome, could see a “shift encouraging people to seek credible sources of information and engage in meaningful discussions over social media.”

But he noted that the most likely outcome could land “somewhere on that spectrum, we just don’t know towards which end just yet.”

‘It’s horrifying to see these guardrails being rolled back even further’

Sarah Greenhalgh, Seven’s Spotlight correspondent, said she thought news of Meta’s policy change was “possibly a parody.”

“It’s that farcical that, at a time when tech giants are under so much intense scrutiny for privacy and bad actors on these platforms, they would even consider—let alone implement—measures that roll back guardrails. It’s crazy.”

On accountability, Greenhalgh said: “I’ve spoken to someone who worked in content moderation at Meta. They told me the guidelines are very strict to avoid bias. There’s no real room for bias if the fact-checkers follow the guidelines.”

Greenhalgh said it was difficult to get in touch with anyone from Meta to answer questions and that they denied all requests for interviews during the making of the Spotlight documentary.


“The only time they act to make things better is usually after bad publicity. They’ve shown time and time again that they react only when there’s pressure from the media.”

She also expressed concern about the broader implications of Meta’s decision: “The fact that Mark Zuckerberg openly admits they are ‘going to catch less bad stuff’ but still proceeds is, quite frankly, unforgivable. Sure, it’s inconvenient if a post is flagged for review, but the alternative—where more children are potentially harmed—is unacceptable.”

Greenhalgh recounted her experience working on a documentary about sextortion and said: “It’s horrifying to see these guardrails being rolled back even further, especially when teenagers are so vulnerable to harmful content.”

She also emphasised the risk to young users: “Teenagers are impressionable, and now they’re going to be exposed to information that hasn’t even been fact-checked but could be sold to them as truth. It’s dangerous, particularly for those who rely on social media for their news.”

‘Removing the muzzle will change the algorithm’

Sabri Suby, founder of digital marketing agency King Kong and Shark Tank Australia judge, called Zuckerberg’s move a “reaction to a changing of the guard”.

“We will see a dramatic shift in content over the coming months, giving brands greater freedom in how they communicate and what they can promote – a freedom they haven’t really had for the past six years. Brands have the opportunity to push the boundaries and speak to a broader audience, and if they don’t they will miss out on cut-through while they are lumped in a stagnant sameness. They better move fast – money loves speed.

“Removing the muzzle will change the algorithm. All of us are going to see different content,” he said.


Suby noted there has been a shift since Donald Trump was elected president, with Trump using newer long-form media to avoid having his speech suppressed.

“Zuck sees the writing on the wall with the alignment between Trump and Elon, meaning that unless he jumps on board and gives the people what they want, there will be major recourse on the performance of his platform.

“This is a move to have the internet more aligned with what it was designed to be, which is to allow for free speech. And yes, there is certainly an ugly underbelly that will no doubt open up a whole lot of negativity, but that is also the world we live in. You can’t put guardrails on the internet,” Suby added.

‘Meta has a responsibility to ensure its platform doesn’t amplify hate’

Dani Maynard, communications lead for Orange Line, told Mediaweek it was an interesting move by the tech giant that has a lot of potential, “but fairness and accuracy aren’t a given.” 

“It really depends on how diverse the contributors are and how effectively Meta can prevent issues like groupthink. Without strong safeguards, there’s a real risk that popular opinion could outweigh actual facts.”

She noted that attention spans are shorter and people often take things at face value, asking: “How often do we see someone share a clickbait headline without even reading the article? This kind of dynamic makes it even more important to ensure the system is designed to prioritise accuracy over noise.”

Maynard said dropping fact-checking was both a win for free speech and a potential risk for misinformation. “On the one hand, it opens up the conversation and allows more people to have their say, which feels like a win for free speech.”

She said that personal opinions and experiences are incredibly valuable, and one of the reasons why platforms like Reddit are popular.


“But without proper checks, this could also mean misinformation spreads faster or harmful narratives take root. Meta’s challenge will be to balance free expression with safeguards that keep conversations constructive,” she added.

Much like fellow digital media agency leader Fargeot, Maynard said that ensuring accuracy of information online is a “shared responsibility.”

“Platforms need systems to catch misinformation early, and independent fact-checkers bring credibility and expertise. But it’s also about individual responsibility. People need to take the time to do their own research and avoid trusting everything they read on social media. But it’s fair to assume that most people don’t have the time or interest to do that, which is why good systems are so important,” she added.

On the impact of the shift on politics and social issues, Maynard said it could go two ways. “On the positive side, Community Notes could encourage people to engage more thoughtfully and add context without having their comment reported or flagged as misinformation. But on the flip side, if it’s not monitored well, it could become a platform for divisive or harmful content.”

Maynard concluded: “I think Meta has a responsibility to ensure its platform doesn’t amplify hate, particularly for younger users.”

‘There is nothing to stop tech oligarchs weaponising their platforms’

Emma Briant, associate professor, news and political communication at Monash University, said tech oligarchs such as Zuckerberg and Musk run “their companies to maximise profits and minimise costs, not to be society’s protector or mediate a neutral, democratic town hall.”

“There is nothing to stop tech oligarchs weaponising their platforms to suit political objectives when the moment is right.

“Fact-checking is only one small part of the solution to the problem of contemporary propaganda. Policymakers often put too much faith in labelling false claims, and in so doing they miss an opportunity to take on the larger problems of a manipulative technology infrastructure hiding behind claims of neutrality and free speech.”


Professor Briant added that with at least 13 billionaires involved in his administration, Trump has sent a powerful message to America’s wealthy right-wing elite that “now is your time, not theirs.”

“Clearly Mark Zuckerberg heard him loud and clear. Ordinary citizens should be very concerned.”

