How to Spot Coordinated Inauthentic Behavior on Facebook, According to Snopes
“Coordinated inauthentic behavior,” a phrase coined by Facebook, is the use of networks of social media accounts or pages to mislead or influence people for political or financial ends. As part of the deception, these pages hide the identities and motives of the people behind them.
Take the examples presented here: pages that spread misinformation about the coronavirus, “blue lives” propaganda, and other pro-police and pro-war content. The Facebook pages and groups involved have more than 2 million American followers, but they are mostly run from Kosovo. They racked up followers and likes by promising giveaways of free products and free mobile homes, but entering the bogus contests ultimately required users to hand over credit card information to an outside site based in Cyprus.
This is a big problem. According to a report by Jeff Allen, a former data scientist at Facebook, content from troll farms reached around 100 million US users per week in 2019.
There probably isn’t much you can do to protect yourself from this kind of thing, short of giving up social media for good. But you can learn to recognize it for what it is. Here are some tips on how to spot this particular brand of fake news, according to Snopes.com.
Check the Facebook Page Transparency section
Every Facebook page has a section called Page Transparency that shows which countries the page’s managers post from and what name changes the page has undergone. On desktop, the Page Transparency section appears to the right of the page; on mobile, you’ll find it by scrolling down past the posts.
Check for verification
Verified pages and profiles display a blue badge next to their name. If you see one, the page or profile most likely represents who or what it claims to represent. If it’s missing, something may be off. (An unverified page claiming to be, say, Lorenzo Lamas is probably not actually Lorenzo Lamas.) To get verified, page administrators must submit documentation to Facebook proving their identity.
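If you want to check verification programmatically rather than by eye, Facebook’s Graph API exposes an “is_verified” field on Page nodes. The sketch below is a minimal illustration, not part of Snopes’s advice: the access token and page ID are placeholders you must supply yourself, the API version is an assumption, and field availability can vary with your permissions.

    # Minimal sketch: ask the Graph API whether a Page is verified.
    # ACCESS_TOKEN and the page ID are placeholders, not real credentials.
    import requests

    GRAPH_URL = "https://graph.facebook.com/v19.0"  # version is an assumption
    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

    def check_page_verification(page_id: str) -> bool:
        """Return True if the Page carries Facebook's verification badge."""
        resp = requests.get(
            f"{GRAPH_URL}/{page_id}",
            params={"fields": "name,is_verified", "access_token": ACCESS_TOKEN},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        print(f"{data.get('name', page_id)}: verified={data.get('is_verified')}")
        return bool(data.get("is_verified"))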
“Like and share” posts may indicate a problem
If a page is filled with images and memes that urge readers to like and/or share them, that could indicate coordinated inauthentic behavior. It isn’t definitive evidence on its own, but Snopes staff note that they often see this pattern on inauthentic pages.
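As a rough illustration (a hypothetical heuristic, not anything Snopes or Facebook publishes), you could scan a page’s post text for common engagement-bait phrases and see what fraction of its posts are begging for likes and shares:

    # Hypothetical heuristic: what share of a page's posts are engagement bait?
    import re

    BAIT = re.compile(
        r"like and share|share if you agree|tag a friend|type amen",
        re.IGNORECASE,
    )

    def engagement_bait_ratio(posts):
        """Return the fraction of posts whose text matches a bait phrase."""
        if not posts:
            return 0.0
        return sum(1 for p in posts if BAIT.search(p)) / len(posts)

    sample = [
        "Like and share if you agree!",
        "Tag a friend who needs a free mobile home!",
        "Community meeting tonight at 7.",
    ]
    print(f"{engagement_bait_ratio(sample):.0%} of posts look like engagement bait")

A high ratio doesn’t prove anything by itself, but combined with the other signals in this article it’s a reasonable reason to dig further.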
Check the page creation date
You can tell a lot about a page from when it was created. Snopes cautions readers to be wary of highly politically charged pages with very recent creation dates. If a page is dedicated to hot-button American political issues but was created two days ago and is run by people in another country, be suspicious. To find the creation date, click the “Page Transparency” link on a page, or the “About” section in a group.
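To make that rule of thumb concrete, here is a hypothetical check that combines the two signals Snopes mentions: creation date and manager location. The inputs are values you’d copy by hand from the Page Transparency panel, and the 90-day threshold is an arbitrary assumption:

    # Hypothetical rule of thumb: newly created AND run from abroad is suspect.
    from datetime import date, timedelta

    def looks_suspicious(created, manager_countries, audience_country="US",
                         max_age_days=90):
        """True if a page is both recent and run from outside its audience's country."""
        recent = (date.today() - created).days <= max_age_days
        foreign_run = audience_country not in manager_countries
        return recent and foreign_run

    # A ten-day-old "American politics" page managed entirely from Kosovo ("XK"):
    print(looks_suspicious(date.today() - timedelta(days=10), {"XK"}))  # True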
Check Facebook Group Admins and Moderators
Facebook Groups (unlike Pages) list their admins, moderators, and members, so click Members in a group to see who is running it and judge whether the admins seem legitimate.
Does any of this really work?
It’s hard to say how effective this advice really is. Facebook says it has taken “aggressive enforcement action against similar foreign and domestic inauthentic groups,” but others, such as former Facebook employee Allen, argue that Facebook could do more.
“Adding even some simple features like Graph Authority and deprecating a feature set based purely on engagement will probably pay off a ton in both integrity and … probably engagement as well,” Allen wrote. (Graph Authority, roughly, is a PageRank-style measure of a page’s credibility based on who interacts with it.)