Facebook Announced Moves Toward Platform ‘Integrity’ – Here are 5 You May Have Missed

Facebook announced a sweeping list of changes to combat misinformation. Here, we highlight five crucial, but largely overlooked, measures.

Since 2016 and the aftermath of Facebook’s many misinformation scandals, the social media giant has maintained a mantra of what they call “remove, reduce, inform.” In their own words, they describe the practice as

[r]emoving content that violates our policies, reducing the spread of problematic content that does not violate our policies, and informing people with additional information so they can choose what to click, read, or share.

Last week, Facebook convened a gathering in Menlo Park with a number of journalists to outline the latest ways they’re enacting these principles. A few parts of the process have been heavily reported on, including the new “click gap” metric (lessening the impact of sites that are optimized to spread virally on Facebook); a Group Quality metric designed to de-prioritize groups that “repeatedly share misinformation”; and Trust Indicators, an initiative from The Trust Project dedicated to assessing the credibility of linked or cited news sources.

An overall benefit of the announced moves: in what seems to follow a lead set by Pinterest, “freedom of reach” is being limited for those who aim to use the site to sow discord and toxicity.

While these measures got several mentions in coverage of the summit, a number of additional changes will also shape the platform’s efforts to curb misinformation and abuse. The full list is available in Facebook’s News Room, but here are five moves worth mentioning:

1. Applying Facebook’s Verified User badge to communications in Messenger

The Verified User badge on Facebook proper is designed to let authenticated, highly visible users post and participate on the site without fear of being impersonated by clone accounts. However, the badge didn’t “carry over” into Messenger, leaving communications in the app vulnerable to impersonators. Now that the badge applies in both places, the “tool will help people avoid scammers that pretend to be high-profile people by providing a visible indicator of a verified account.”

2. Expanding the Context Button to images

The Context Button debuted last year to provide additional details about publishers and article contents, in hopes that the information would help people decide whether or not to share a news item. In expanding this functionality to images, the platform hopes to prevent doctored or otherwise falsified images from similarly spreading misinformation.

3. Expanding the role of The Associated Press as part of the third-party fact-checking program

After Snopes ended its partnership with Facebook, citing a need to examine “the ramifications and costs of providing third-party fact-checking services,” the platform has turned to a coalition of academics, journalists, and others to take up the charge of verifying as much content shared on the site as possible. Given the sheer volume of information being shared there, this is no easy task. But the reputation of a partner like the Associated Press will hopefully provide some sorely needed credibility to Facebook’s fight.

4. Transparent tracking of updates to Facebook’s Community Guidelines

Technically, users have had open access to the often-updated Community Guidelines since 2018. But in the spirit of transparency, this new version will include monthly notes on what has changed, making it easier to understand which behaviors are being policed, and why enforcement actions that weren’t previously taken now fall within the platform’s purview.

5. Forward Indicator and Context Button make the move from WhatsApp to Messenger

Despite its own major misinformation problems, WhatsApp was curiously absent from Facebook’s list of announced advances. One announcement, however, drew on a measure originally deployed there: the Forward Indicator, a mechanism that lets someone know if a message they received was forwarded by the sender, is making its way to Messenger. The Context Button, which provides more background on shared articles, now also works in Messenger.

As The Verge’s Casey Newton correctly notes, these sorts of measures are a step in the right direction for any platform aiming to rein in misuse and regain the trust of its most vulnerable users, who have been burned by past inaction. But hopefully, these small steps will collectively move Facebook (and other websites struggling with similar challenges) toward an identifiable goalpost, one where success in these measures can be clearly articulated and its impact clearly measured. It’s perfectly valid to commend Facebook for this sweeping slate of changes, but it’s also okay to ponder the question: “to what end?”
