Social media giant Facebook published its “Content Distribution Guidelines” (CDG) on Wednesday, detailing the types of content the platform actively demotes on its News Feed, which drives traffic for news publishers and other media outlets.
While the site’s public “Community Standards” are currently used to justify the removal of misinformation, violent videos, and other problematic content, the company had thus far remained mum on its policies for promoting or demoting certain types of content. The release of the CDG is part of an effort to improve the company’s transparency around its ranking decisions.
Hinting at the release of the CDG in March, Facebook Vice President of Global Affairs and former British politician Nick Clegg published a blog post titled “You and the Algorithm: It Takes Two to Tango.”
“Other measures coming this year include providing more transparency about how the distribution of problematic content is reduced; making it easier to understand what content is popular in News Feed; launching more surveys to better understand how people feel about the interactions they have on Facebook and transparently adjusting our ranking algorithms based on the results; publishing more of the signals and predictions that guide the News Feed ranking process; and connecting people with authoritative information in more areas where there is a clear societal benefit, like the climate science and racial justice hubs,” Clegg wrote in the article.
The guidelines published Wednesday spell out the types of content that receive reduced distribution in the News Feed.
“Our Content Distribution Guidelines outline some of the types of content that receive reduced distribution in News Feed. As these guidelines develop, we will continue to provide transparency about how we define and treat problematic or low quality content,” Facebook announced. “Our enforcements to reduce problematic content in News Feed are rooted in our commitment to the values of Responding to People’s Direct Feedback, Incentivizing Publishers to Invest in High-Quality Content, and Fostering a Safer Community.”
The guidelines, which are used to determine the demotion of content, are separated into three categories: “Responding to People’s Direct Feedback,” “Incentivizing Creators to Invest in High-Quality and Accurate Content,” and “Fostering a Safer Community.”
The first category, which is largely based on user feedback, covers the demotion of “clickbait links,” which “lure people into clicking on an included link by creating misleading expectations about the post or article’s content”; “engagement bait,” which seeks out “votes, shares, comments, tags, likes, or other reactions” for bad-faith purposes; and “low quality” posts. What constitutes a “low quality” post, a label that can apply to comments, events, and videos, is subjective, but users are able to submit feedback indicating that the quality of a piece of content is poor.
For the second category, Facebook intends to encourage content creators to produce “interesting, new material” by demoting domains with limited original content, or domains that have published articles marked as fake news by fact-checking organizations. The category also demotes content from “untrusted” news publishers and from outlets that do not offer transparent authorship, such as author bylines.
One item worth highlighting in the second category is the demotion of “Links to Domains and Pages with High ‘Click-Gap,’” which refers to websites that receive traffic primarily from Facebook and little from anywhere else. The implication is that these websites are not naturally popular and do not enjoy organic engagement; instead, they derive the bulk of their traffic, which is often heavily monetized, from Facebook shares.
The third and final category, “Fostering a Safer Community,” addresses so-called “borderline” content, content from Facebook Groups and Pages associated with “Violence-Inducing Conspiracy Networks” such as QAnon, and other content that approaches violating Facebook’s community standards.
“The Content Distribution Guidelines outline what content receives reduced distribution in News Feed because it’s problematic or low quality — things like misinfo, clickbait, and ad farms,” wrote Facebook’s policy communications director Andy Stone. “Our Community Standards describe what we remove because we don’t allow it on the platform. The Content Distribution Guidelines focus on what we reduce via ranking.”