Earlier this month, Facebook Inc. announced it would be changing its terms and conditions for users, effective October 1. The company continues to face flak around the world over privacy concerns, freedom of speech violations, and its seeming inability to control hate speech on the platform.

One of the changes will allow Facebook to censor content that could prove legally adverse to the company. According to Aman Taneja, senior associate at IKIGAI Law, a technology-focused law firm, “There has been speculation that this change is due to Australia requiring FB to share its revenues with news publishers.”

The company had earlier stated that it would be willing to block news from being shared on the platform in Australia if revenue sharing were made mandatory. “The terms of use as they currently stand would not allow Facebook to block news sharing,” said Taneja, “so it does appear that the changes have been made to give legal protection to Facebook.”

Divij, an independent researcher and Mozilla fellow, believes these changes are likely a response to potential legal claims over Facebook’s censoring of content that is not expressly in violation of its current terms and conditions. “These changes are an abundant caution against such potential claims,” he said.

In India, the company recently landed in controversy after reports broke that its top management in the country had gone easy on hateful content that violated its terms and conditions because such content came from members of the party in government. After the news broke, the Facebook page of Tiger Raja Naval Singh, a BJP MLA from Telangana, was taken down.

While the company has often publicly stated its commitment to letting users exercise free expression, Taneja believes that legal challenges to Facebook on the grounds that it denied free expression will hold no water.

“Private companies have no legal obligation to uphold freedom of speech and expression. Neither should they be expected by the law to determine what amounts to free expression and what is a reasonable restriction,” he told The Citizen.

Shivangi Nadkarni, CEO of the cybersecurity consultancy firm Arrka Consulting, points out that despite being a business, Facebook has to operate according to the law.

“Like all entities, these entities too are governed by various laws and regulations of the countries they operate in. Therefore, by definition, acts mandated by laws and governments are required to be implemented by these entities. In the process, there definitely exists a possibility of curbing of freedom of speech being a fallout,” she said.

However, many users of the social media platform would agree with Divij, who doesn’t think the new changes will make much of a difference, since the company already censors and amplifies content in response to existing pressures.

“It is possible that this additional contractual privilege gives them some more leeway to take down otherwise legal content, but on the whole, I do not think it will make much of a difference to the way FB governs its platforms. It already takes down and censors what it wants and amplifies what it likes.”

But aren’t big businesses, in effect, part of the state? Divij believes that “private entities” like Facebook are not bound by the same guarantees of free speech and expression as a public agency or government.

“Facebook should make the whole process [of content restriction] more transparent, and providing users with more information is one way to do it,” says Taneja. If it fails to do so, it will lose users, provided those users are properly informed of what is happening to their content.

“No platform is ever going to be perfect and it is impossible to be viewed as fair and just in the eyes of every user. Eventually market forces should be able to self-correct and allow users to choose which platforms they remain on, which they trust and which they ignore.”

The only way users will be able to make that decision, he points out, is if they are given more information about how content decisions are made, which means greater transparency.

Transparency brought about by legal obligations is the way forward, Divij agrees. “I would say at the minimum more transparency and accountability in content moderation practices, which should be backed by clear legal obligations and independent regulatory oversight.”

Facebook recently set up an independent ‘oversight board’ of 20 members, who will review controversial content on the platform, which has 2.5 billion monthly active users.

There is also the opposite problem: giving the state power to censor social media. As Taneja points out, “Giving states power to censor seems far more dangerous. If market forces demand it, platforms will be compelled to do a better job at content moderation. The challenge is that we are only beginning to understand what can happen when certain content is allowed to stay up or if certain content is taken down. It can affect society in powerful ways.”