Facebook says it is updating its Terms of Service (TOS) to enable it to remove or restrict access to content, services, or information on its platform that may expose the company to legal trouble.
In a notification to users, the technology giant announced that the new policy takes effect on October 1.
“Effective October 1, 2020, section 3.2 of our Terms of Service will be updated to include: ‘We also can remove or restrict access to your content, services or information if we determine that doing so is reasonably necessary to avoid or mitigate adverse legal or regulatory impacts to Facebook.’”
Offences that may result in content being removed or restricted include content considered unlawful, fraudulent, or discriminatory, or content that infringes on other people’s rights or intellectual property. Content that violates Facebook’s Community Standards may also be affected by the new policy.
Affected users will be notified of such actions and of the review options available to them to contest the removal.
“If we remove content that you have shared in violation of our Community Standards, we’ll let you know and explain any options you have to request another review, unless you seriously or repeatedly violate these Terms or if doing so may expose us or others to legal liability; harm our community of users; compromise or interfere with the integrity or operation of any of our services, systems or products; where we are restricted due to technical limitations; or where we are prohibited from doing so for legal reasons,” the notification reads.
These changes may also affect Facebook’s Community Standards, Commercial Terms, Advertising Policy, Self-Serve Ad Terms, Pages, Groups and Events Policy, Facebook Platform Policy, Developer Payment Terms, Community Payment Terms, Commerce Policies, Facebook Brand Resources, and Music Guidelines.
Read full terms here.
Has Facebook already started rolling out the pending changes?
Facebook has already begun implementing restrictive measures on its platform in a bid to stem misinformation and infringements.
On September 3, 2020, Facebook introduced a forwarding limit on Messenger, stating that the move is intended to help slow the spread of viral misinformation on the platform.
“As a part of our ongoing efforts to provide people with a safer, more private messaging experience, today we’re introducing a forwarding limit on Messenger, so messages can only be forwarded to five people or groups at a time. Limiting forwarding is an effective way to slow the spread of viral misinformation and harmful content that has the potential to cause real-world harm,” the company said.
Earlier in the year, Facebook also introduced other features to curb misinformation, using artificial intelligence to identify unusual behavioural patterns that may be linked to scams or other harmful activity.