How Facebook is Controlling the Spread of Misinformation Ahead of the 2020 Election

In the coming months, Facebook’s new Oversight Board will add 20+ new members and begin handling content moderation cases.

Approaching 3 billion monthly active users across its platforms, Facebook has become a leading provider of news and information for much of the global population. Managing that influence is more critical than ever as we navigate the uncertainty of COVID-19 and prepare for life after the pandemic.

For some additional context, Facebook alluded to an Oversight Board project last year and formalized it this past January, in direct response to calls for increased action from the company on potentially dangerous or harmful content ahead of the November election. Several months later, the first members of the Board have been announced; they will help the platform decide what content should be allowed, what content should be taken down, and, most importantly, why.

“These decisions often are not easy to make – most judgments do not have obvious, or uncontroversial, outcomes and yet many of them have significant implications for free expression,” wrote Nick Clegg, VP of Global Affairs and Communications, in the official announcement.

The process

Selecting this group began with a global consultation process of workshops and roundtables that brought together more than 650 people in 88 countries. Ultimately, these conversations resulted in:

  • The unveiling of a final charter, outlining the structure, scope, and authority of the board
  • Setting up the Oversight Board Trust to safeguard members’ ability to make independent decisions and recommendations
  • Publication of the Board’s bylaws
  • The hiring of the Board’s director
  • The launch of a recommendations portal where the Board can accept nominations and applications from those interested in becoming a member

With these foundations in place, the selection process got underway and the first group of 20 members was announced.

Meet the board

Facebook helped kick off the member selection process by choosing four co-chairs, who worked alongside the platform to select the 16 additional members recently announced. Membership selection will continue in this way until the board reaches up to 40 members, at which point it alone will be responsible for selecting future members. An important criterion for the board’s long-term success is onboarding members who bring different perspectives and expertise to the table, which is essential for making holistic, informed decisions.

The list of 20 individuals includes lawyers, journalists, human rights advocates, and academics with insights into religious freedom, content moderation, digital rights, internet censorship, civil rights, and more. The announcement also shared that the members have lived in over 27 countries and speak at least 29 languages. “We expect them to make some decisions that we, at Facebook, will not always agree with – but that’s the point: they are truly autonomous in their exercise of independent judgment,” Clegg added.

Making decisions

The Board will handle appeals through a case management system tied to Facebook’s own platforms. Given the volume of appeals, it will handpick the content moderation cases that most need attention and then convene as a group to make the final decision on whether the content stays up or comes down. As more members are onboarded, the platform hopes to expand the Board’s scope so more cases can be handled. On the reporting side, the Board will publish annual transparency reports and track what Facebook does with its recommendations, adapting its approach based on that feedback.

The future of content moderation

“It’s one thing to complain about content moderation and the challenges involved; it’s another thing to actually do something about it,” said Jamal Greene, co-chair of the board, in a recent statement. While content moderation issues have existed since the dawn of social media, Facebook is taking the lead on a solution through a first-of-its-kind initiative.

Arguably the biggest issue the Board will face in the coming months is political advertising.

“It is our ambition and goal that Facebook not decide elections, not be a force for one point of view over another, but the same rules will apply to people of left, right and center,” said Michael McConnell, another co-chair of the board.

Whether this effort will serve as a springboard for similar approaches to content governance online remains to be seen, but it is a step in the right direction. With consumer behavior changing dramatically due to COVID-19, this will likely be not just a “nice to have” but a necessity as digital content evolves and communities engage in new conversations.
