Commentary: Someone needs to do something about Facebook — but what?
Legislating against fake news is insufficient. Media literacy education — where communities play a role in distinguishing what’s fake from what’s real — offers the best protection, says an observer from NTU’s Wee Kim Wee School of Communication & Information.
SINGAPORE: “Facebook cannot be trusted to regulate itself”, a Democratic US lawmaker, Representative David Cicilline, said last week.
A recent New York Times report “makes clear that Facebook executives will always put their massive profits ahead of the interests of their customers”, he said.
In Singapore, Senior Minister of State for Law Edwin Tong echoed similar sentiments when he said Singapore “cannot rely on the goodwill” of social media platforms to protect the nation from disinformation campaigns.
As calls to regulate social media grow worldwide, legislative options include censorship and changes to how the networks operate, while education offers perhaps the best inoculation against fake news.
CALLS FOR REGULATION
In September, the Singapore Parliament’s Select Committee on Deliberate Online Falsehoods recommended combating those falsehoods with both education and legislation.
This month, the will to regulate social media may have gotten a boost in Singapore. Completely unfounded claims linking Prime Minister Lee Hsien Loong to Malaysia’s 1MDB scandal circulated on the States Times Review’s website and its Facebook page.
The Infocomm Media Development Authority (IMDA) asked Facebook to take down a post on States Times Review’s page that repeated the allegations and shared the article. Facebook refused, saying it does not take down allegedly false material unless it contributes to imminent violence or physical harm, a policy announced in July.
Global calls to regulate Facebook first got loud in 2016, with allegations that Russians used ads and fake news to attempt to sway the US presidential election to Trump and to sway the Brexit vote to the exit side.
Calls to regulate were renewed last week when the New York Times reported that CEO Mark Zuckerberg and his number two, Sheryl Sandberg, were slow to respond to evidence of Russian election interference.
The report suggested that Sandberg saw it as a personal betrayal when the company’s security chief revealed to the board in 2017 that Facebook had not yet contained Russian interference.
Then, rather than simply focusing on fixing vulnerabilities, the report says Facebook hired a public relations firm to lash out against its critics and other tech giants.
Zuckerberg has acknowledged the company “stumbled”. Its reported foot-dragging and deflection are hardly surprising. Large corporations rarely err on the side of too much transparency or taking too much responsibility too soon for oversights.
It’s difficult to argue with the premise that a social media company’s incentives are not perfectly aligned with the public interest. The company profits from selling advertising, not from circulating truth or building communities.
One could counter that any private company has a strong incentive to maintain trust and keep customers.
But users are locked into social networks. It takes effort to build up your network on a platform. Even if you exit, you’re on your own unless you convince your friends to go with you. And if you leave Facebook, where do you go? Instagram? Maybe you’ll just send messages on WhatsApp? Facebook owns both.
To prod social media companies to focus beyond their need to grow profits, regulation seems reasonable.
Most calls to regulate social media stop at “someone has to do something”. There are two categories of legislative proposals: Those targeting certain kinds of content and those that would change how services operate.
America is paralysed when it comes to content-based regulations. You won’t see a government commission deciding what’s fake or real news. That would be seen as usurping the public’s job to make such decisions.
Singapore — less squeamish about regulating content — can reach local lawbreakers, but in cases where authorities need a foreign social media company’s cooperation to enforce the law, things get complex.
Nations other than the US and Ireland, where Facebook has headquarters, face obstacles when trying to control what happens on its platforms.
If authorities of a nation allege that content violates local laws, Facebook says it sometimes makes the content inaccessible in that country, even if it does not violate Facebook’s own community standards.
Facebook’s transparency report shows that it sometimes takes down content without a court order, but doesn’t detail how it decides whether to act.
Even when social media companies cooperate with law enforcement, Singapore’s Select Committee highlighted concerns from around the world about how speedily they act.
OPERATIONAL CHANGE TO PLATFORMS
More dramatic legislative proposals would change how social networks function. In July, Senator Mark Warner, Democrat of Virginia, released a white paper with around twenty such proposals.
Changes to how a service operates may be applied globally because of the difficulty of applying different rules to a global network. To comply with the European Union’s General Data Protection Regulation, many tech firms, including Facebook, made worldwide changes to their services.
One of Warner’s less controversial proposals is to force social media to flag accounts operated by bots, which have been used to spread disinformation. Singapore’s Select Committee similarly recommended empowering the authorities to take action against bots.
Another set of Warner’s proposals aims to authenticate online identities and locations. This would help combat anonymous and pseudonymous posts and impersonation.
It could also be part of a strategy to stop foreigners from creating political ads. If Americans had known that posts and ads were created by agents in Saint Petersburg, Russia — rather than voters in St. Petersburg, Florida — fake news would have fizzled in 2016. Singapore’s Select Committee also recommended curbs on anonymity.
Laws against anonymity raise potential objections, though. Dissidents around the world sometimes publish anonymously. Even a law that does not require users to post with their real names — but requires a social network to confirm their real names or locations — could chill public debate.
People may not post unless they trust social media companies not to reveal their identities to authorities.
A promising set of proposals in Warner’s paper is designed to promote competition by minimising the extent to which we are locked into a network. Competition would pressure existing networks to maintain trust in order to keep their users.
If a social network were legally required to allow us to download our data, in a machine-readable format, so we could transfer it to another network, this could help competitors emerge.
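To make the data-portability idea concrete, here is a minimal sketch of what a “machine-readable” export might look like. The schema, field names, and values below are entirely invented for illustration — no real platform’s export format is being described.

```python
import json

# Hypothetical social-graph export. Every field name and value here is
# made up to illustrate the concept of a portable, machine-readable format.
profile = {
    "user": {"id": "u123", "display_name": "Alex Tan"},
    "connections": [
        {"id": "u456", "since": "2015-03-02"},
        {"id": "u789", "since": "2018-11-20"},
    ],
    "posts": [
        {"timestamp": "2018-11-20T09:30:00Z", "text": "Hello, world"},
    ],
}

# Serialising to a standard format like JSON is what makes the data
# portable: a rival network that understands the schema could import it
# and rebuild the user's connections without their starting from zero.
export = json.dumps(profile, indent=2)
print(export)
```

The point is not the particular format but the principle: if users can carry a structured copy of their network with them, switching costs fall and competitors have a chance to emerge.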
Alternatively, a social media platform like Facebook could be required to make its network inter-operable with a rival like Twitter.
Granted, proposals to enable competitors to arise are long-term solutions, of little value to the victim of fake news spreading virally today.
MEDIA LITERACY EDUCATION
Fake news is a moving target that will, to some extent, elude law enforcement and technological fixes. Artificial intelligence is making it possible to create videos of people appearing to say things they didn’t say. “Legislation cannot be a silver bullet in itself”, Singapore’s Select Committee concluded in its report.
Minister for Communications and Information S Iswaran said:
You and I remain Singapore’s first and most important line of defence against deliberate online falsehoods.
It’s difficult to find opponents of media literacy education. But at a recent talk I gave to junior college students, they were quick to point out that media literacy education must be done well.
When compared to law, a media literacy programme doesn’t diminish the need for messy conversations about definitions of truth and how to identify it. It just changes who is participating in the conversations — not just legislators and judges, but students, parents, teachers, and concerned citizens.
In the US, some parents echo Donald Trump and teach their children that CNN is fake news. Others teach that Fox News is fake. Devising media literacy programmes to satisfy both remains elusive.
Singapore may be less ideologically divided, but agreeing how to distinguish truth from falsehood requires answering questions about who gets the last word.
Try this: List a few mainstream news sources and a few alternative ones, local and foreign. Think about what you’d tell a student about how to evaluate their credibility. Imagine how you’d teach students to evaluate messages of political candidates. Consider which cases you’d select to illustrate your approach. Would your choices get universal agreement?
Media literacy education can backfire, warns Microsoft researcher danah boyd (who spells her name in lowercase). If we question everything, and trust nothing, we may stop trusting the best available sources of facts.
Donald Trump’s refrain that respected media outlets are fake news and “enemies of the people” appears to be an attempt to undermine trust in sources of authority that conflict with him.
Media literacy training must build trust as well as scepticism, helping us detect not only what’s fake but also what’s credible.
The authorities can punish purveyors of falsehoods, and legislators may be able to change how the platforms operate, but building media literacy is every nation’s best hope for building a discerning citizenry with high resistance to disinformation.
It takes practice to evaluate information. Schools and community programmes everywhere can provide it. In the process, families, communities, and nations must engage in difficult conversations about truth, falsity, and trust.
Dr Mark Cenite teaches communication law at the Wee Kim Wee School of Communication & Information at Nanyang Technological University, where he is Associate Chair (Academic).