Facebook CEO Mark Zuckerberg—whose company has blundered its way into controversies over everything from user privacy and data breaches to amplification of extremist content and literal genocide as of late—responded to growing criticism of the tech sector by calling for more outside regulation in an op-ed in the Washington Post (and on his own personal Facebook page) on Saturday.
Zuckerberg broke down the areas where he is now saying regulation could be helpful into four sections: harmful content, election integrity, privacy, and data portability. Somewhat more surprisingly, he offered specifics of what that might look like.
On the first, harmful content, Zuckerberg wrote that platforms face a “responsibility to keep people safe on our services” and that “internet companies should be accountable for enforcing standards on harmful content.” He also said that to do this effectively, Facebook needs to be able to identify and eliminate violent or hateful speech, but also called for “a more standardized approach” across the industry that includes third-party oversight:
One idea is for third-party bodies to set standards governing the distribution of harmful content and to measure companies against those standards. Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.
Seeing as it took Facebook years to concede that white nationalism and white separatism are actually the same thing as white supremacy, independent oversight of content decisions is probably not that bad an idea.
As to election integrity, Zuckerberg said that the company has already taken steps, such as requiring political ad buyers to verify their real-life identities and creating a political ad database, but suggested that what is really needed is a… total overhaul of the campaign finance environment:
… Deciding whether an ad is political isn’t always straightforward. Our systems would be more effective if regulation created common standards for verifying political actors.
Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.
These are all things that are probably true, but they also dodge the question of why Facebook is vulnerable to political machinations in the first place, as well as whether the whole Facebook information economy is in fact the problem. Notably, at least in the U.S., these kinds of changes would entail a massive overhaul of campaign finance and disclosure laws that is unlikely to emerge for years, if it does anytime in the foreseeable future.
There’s also the fact that the company has historically tried its best to be exempt from ad disclosure rules, and has repeatedly been mired in ethical controversies around ads, like how the Department of Housing and Urban Development just slapped it with charges of enabling housing discrimination.
As to privacy, Zuckerberg called for the U.S. to pass legislation similar to the European Union’s sweeping General Data Protection Regulation, which he said he would prefer to become a “common global framework” (as opposed to a patchwork of laws in each nation). He also called for data portability, which he described as the free flow of information between services—though he alluded to Facebook Login as an example, which is really more a way the company has extended its tracking tendrils across the web than a safeguard of user rights:
If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.
This is important for the Internet — and for creating services people want. It’s why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who’s responsible for protecting information when it moves between services.
(As TechCrunch noted, Facebook itself has been dragging its feet on data portability, only allowing users to export friends lists in a way that makes it difficult to find them on other social networks.)
Still, this is a big switch from a year ago, when Zuckerberg was publicly on the fence about whether regulation was necessary at all, described the GDPR as good in principle but only for Europe, and suggested self-regulation was the better approach. What seems to have changed in the meantime is that the external political pressure on Facebook has continued to mount: It and other tech companies have faced an increasingly hostile reception from the public and elected officials, including threats of regulation and talk of antitrust action. One example: In the wake of the Facebook-livestreamed Christchurch massacre, the Australian government is threatening to pass laws that would land platform execs in jail and impose significant fines if they fail to act quickly to remove terroristic content.
In other words, Zuckerberg et al. perhaps now believe that GDPR-like regulations, as well as others on topics like content moderation, are inevitable, and that it’s best for Facebook to get out in front of them.