Mark Zuckerberg is really, really sorry.

Last year he dismissed as "crazy" the critics who said “fake news” delivered by Facebook might have given the election to Donald Trump. Last week he said he regretted it.

On Yom Kippur, the Jewish Day of Atonement, he apologized for what Facebook has wrought.

On Monday, a senior Facebook executive repented some more, reporting that $100,000 in spending by Russian-sponsored troll farms had bought 4.4 million page views before the 2016 election. "We understand more about how our service was abused and we will continue to investigate to learn all we can," said Facebook Vice President Elliot Schrage.

The Facebook leadership, like the U.S. government and the rest of us, is belatedly facing up to what Zuckerberg once denied: the social harms that can be inflicted by digital platform monopolies. The contrition and the voluntary remedies, notes Quartz, are “designed to head off looming regulations.”

What Is To Be Done

Facebook came to dominate social media with an ingenious interface that enables users to escape the Wild West of the open internet and join a sentimental community of family and friends, knitted together by likes, links, timelines, photos and videos.

Along the way, the company employed a scalable and amoral business model: apply algorithms to people's personal data to mix "promoted posts" in with family messages and friendly mementos. It's an automated system that is profitable because it requires relatively little human intervention and can be used by anyone who wants to influence the behavior of Facebook users.

When the Russian government wanted to use the platform to confuse and demoralize Democratic voters and promote its favored candidate, Donald Trump, Facebook was ready, willing and able to monetize the opportunity. As sociologist Zeynep Tufekci has explained, "Facebook's Ad Scandal Isn't a 'Fail,' It's a Feature."

The question is, what can government and civil society do to protect the public interest from a $300 billion monopoly with 2 billion users? “Facebook is so gargantuan,” says Siva Vaidhyanathan, director of the Center for Media and Citizenship at the University of Virginia, “it’s exceeded our capability to manage it.”

One tool is traditional antitrust laws, created in the late 19th century and early 20th century to control railroads, the oil industry and electrical utilities. The reformers, in the Progressive era and the New Deal, passed legislation like the Sherman Anti-Trust Act and the Glass-Steagall Act to prevent and break up concentrations of economic power.

The problem is that since the 1970s, antitrust law has been interpreted through the lens of University of Chicago “free-market” economics. In this view, the test of a monopoly is the short-term harm it does to consumers; i.e., does it raise prices?

If a monopoly doesn’t raise prices, the Chicago School claims, it's not doing any harm. As a result, most of the legal precedents in antitrust law, developed over the last 40 years, are ideologically hostile to the notion of a “public interest.”

To deal with 21st-century platform monopolies, antitrust law needs to be revitalized or reinvented. A host of new monopoly critics, including economist Barry Lynn, journalist Matt Stoller, law professors Jonathan Zittrain and Frank Pasquale, and elected officials such as Sen. Elizabeth Warren (D-Mass.), propose to do just that.

As Pasquale, a law professor at the University of Maryland, said, “We need to have institutions that guarantee algorithmic accountability.”

Six Remedies

1. FCC Regulation

Jeff John Roberts of Fortune compares Facebook to the highly regulated TV broadcast networks: "At a time when Facebook has become the equivalent of a single TV channel showing a slew of violence and propaganda, the time may have come to treat Facebook as the broadcaster it is."

In the immediate aftermath of the Las Vegas shooting, a Facebook search yielded a page created by a chronic hoaxer who calls himself an investigative journalist for Alex Jones' Infowars. “To Facebook’s algorithms, it’s just a fast-growing group with an engaged community,” notes Alexis Madrigal of the Atlantic.

Roberts:

"Just imagine if CBS inadvertently sold secret political ads to the Chinese or broadcast a gang rape—the FCC, which punished the network over a Super Bowl nipple incident, would come down like a ton of bricks."

This would require rewriting the Federal Communications Act to include platform monopolies. Not impossible, but not likely, and probably not the right regulatory regime for diminishing Facebook's monopoly power over information.

2. Mandatory FEC Disclosure

One solution is to use existing institutions to force full disclosure of buyers of political ads, a requirement Facebook successfully resisted in 2011.

Last week, Democrats in the House and Senate sent a letter to the Federal Election Commission urging it to "develop new guidance" on how to prevent illicit foreign spending in U.S. elections. The letter was signed by all of the possible 2020 Democratic presidential aspirants in the Senate, including Warren, Sherrod Brown (Ohio), Cory Booker (N.J.), and Kamala Harris (Calif.).

Another Democratic proposal floated in Congress would require digital platforms with more than 1 million users to publicly log any “electioneering communications” purchased by anyone who spends more than $10,000 in political ads online. The FEC defines electioneering communications as ads “that refer to a federal candidate, are targeted to voters and appear within 30 days of a primary or 60 days of a general election.”

But such measures probably would not have prevented—or called attention to—the Russian intervention in 2016, because the Russian-sponsored ads usually played on social divisions without referencing a federal candidate, and buyers could have evaded the reporting requirement with smaller payments.

Such measures address the symptoms of Facebook’s dominance, not the causes.

3. Empower Users

Luigi Zingales and Guy Rolnik, professors at the University of Chicago Booth School of Business, have a market solution: empower Facebook users to take their friends and their “likes” elsewhere. They propose giving Facebook users something they do not now possess: “ownership of all the digital connections” that they create, or a “social graph.”

Right now Facebook owns your social graph, but that is not inevitable.

“If we owned our own social graph, we could sign into a Facebook competitor — call it MyBook — and, through that network, instantly reroute all our Facebook friends’ messages to MyBook, as we reroute a phone call.”

The idea is to foster the emergence of new social networks and diminish the power of Facebook’s monopoly.

Such a reform alone isn’t going to undermine Facebook. In conjunction with other measures to create competition, it could be helpful.
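To make the proposal concrete, here is a minimal sketch in Python of what a user-owned, portable social graph could look like: a simple export document listing a user's connections that a competing network (Zingales and Rolnik's hypothetical MyBook) might ingest. The class and field names, and the idea of a JSON export, are illustrative assumptions, not a description of any real Facebook or MyBook interface.

```python
import json
from dataclasses import dataclass, field
from typing import List

# Hypothetical, user-owned social graph: the connections a user could export
# from one network and import into another. Names are illustrative only.
@dataclass
class Connection:
    friend_id: str       # stable identifier for the friend
    display_name: str
    networks: List[str]  # networks where this friend can currently be reached

@dataclass
class SocialGraph:
    owner_id: str
    connections: List[Connection] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize the graph to a portable JSON document the user controls."""
        return json.dumps({
            "owner_id": self.owner_id,
            "connections": [c.__dict__ for c in self.connections],
        }, indent=2)

# Example: export the graph once, then hand the same document to a competitor.
graph = SocialGraph(owner_id="user-123", connections=[
    Connection("friend-456", "Alice", ["facebook"]),
    Connection("friend-789", "Bob", ["facebook", "mybook"]),
])
portable_export = graph.to_json()  # a MyBook-style competitor could ingest this
print(portable_export)
```

The point of the sketch is the design choice, not the code: once the graph is a document the user controls, switching networks becomes an import rather than rebuilding one's social connections from scratch.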

4. Make Data Ephemeral

Facebook’s data collection is a form of surveillance that endangers dissent, says internet entrepreneur Maciej Ceglowski.

Last January, opponents of President Trump organized the Women’s March on Facebook, and several million people participated.

"The list of those who RSVP’d is now stored on Facebook servers and will be until the end of time, or until Facebook goes bankrupt, or gets hacked, or bought by a hedge fund, or some rogue sysadmin decides that list needs to be made public."

To ensure privacy and protect dissent, Ceglowski says, “There should be a user-configurable time horizon after which messages and membership lists in these places evaporate.”

Again, this is a small but worthwhile step. If Facebook won’t implement it voluntarily, it could be compelled to do so.
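As a rough illustration of what a user-configurable time horizon could mean in practice, here is a minimal Python sketch of a message store that drops entries once they outlive their author's chosen retention window. The class names, the in-memory list and the 30-day default are assumptions made for the example, not how Facebook actually stores or expires data.

```python
import time
from dataclasses import dataclass

@dataclass
class Message:
    author_id: str
    text: str
    created_at: float  # Unix timestamp of posting

class EphemeralStore:
    """Toy message store that honors a per-user retention window."""

    DEFAULT_RETENTION = 30 * 24 * 60 * 60  # assume 30 days if the user never chose

    def __init__(self):
        self.messages = []           # all stored Message objects
        self.retention_seconds = {}  # user_id -> how long to keep that user's messages

    def set_time_horizon(self, user_id, days):
        """The user-configurable horizon after which messages evaporate."""
        self.retention_seconds[user_id] = days * 24 * 60 * 60

    def add(self, msg):
        self.messages.append(msg)

    def purge_expired(self, now=None):
        """Drop every message older than its author's retention window."""
        now = time.time() if now is None else now
        self.messages = [
            m for m in self.messages
            if now - m.created_at < self.retention_seconds.get(m.author_id, self.DEFAULT_RETENTION)
        ]

# Example: an RSVP record that evaporates after the organizer's 7-day horizon.
store = EphemeralStore()
store.set_time_horizon("organizer-1", days=7)
store.add(Message("organizer-1", "RSVP list entry", created_at=time.time() - 8 * 86400))
store.purge_expired()
print(len(store.messages))  # 0; the eight-day-old record is gone
```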

5. Break Up Facebook

But Ceglowski has a more audacious idea: break up Facebook into different companies for social interaction and news consumption.

The problem, he said in an April 2017 talk, is the algorithms Facebook deploys to maximize engagement and thus ad revenue.

“The algorithms have learned that users interested in politics respond more if they’re provoked more, so they provoke. Nobody programmed the behavior into the algorithm; it made a correct observation about human nature and acted on it.”

When a monopoly controls the algorithms of engagement, commercial power is converted into political power.

“Decisions like what is promoted to the top of a news feed can swing elections. Small changes in UI can drive big changes in user behavior. There are no democratic checks or controls on this power, and the people who exercise it are trying to pretend it doesn’t exist.”

So government has to step in, he says.

“Just like banks have a regulatory ‘Chinese wall’ between investment and brokerage, and newspapers have a wall between news and editorial, there must be a separation between social network features and news delivery.”

Just as the government broke up the Standard Oil monopoly in the early 20th century and the Bell telephone monopoly in the 1970s and 1980s, splitting up a monopoly firm to reduce its power is a time-tested remedy.

6. Think Big

Most important is political imagination. The ascendancy of free-market thinking since the heyday of Ronald Reagan and Margaret Thatcher has transformed citizens into consumers and weakened civil society in the process. The rise of income inequality is one result. The emergence of unaccountable platform monopolies is another.

Facebook, the website, is the creation of Zuckerberg and clever programmers. But their enormous power is the result of a selfish and short-sighted ideology that privatizes public space at the expense of most people.

With the Democrats incorporating anti-monopoly ideas into their "Better Deal" platform and right-wing nationalists such as Steve Bannon talking about regulating internet giants "like utilities," the free-market ideology has lost credibility and there is a growing demand for action. As the Roosevelt Institute puts it, "Let’s Reimagine the Rules."

Reining in Facebook is urgent because, if the public does not control its surveillance and engagement technologies, those technologies will be used to secretly manipulate, if not control, the public sphere, as they were in the 2016 election.

“Either we work with government to regulate algorithmic systems,” says Pasquale of the University of Maryland, “or we will see partnerships with governments and those running algorithmic systems to regulate and control us.”

Controlling Facebook, in other words, is a matter of self-protection.

(This article has been made available by the readers of AlterNet. It first appeared here).