Facebook is taking a major step to appease its mostly liberal post-election critics, who charged that disinformation that proliferated on its platform affected the election outcome (read: helped elect a candidate they oppose). Buzzfeed reports:
Facebook today announced several initiatives to help reduce the spread of fake news, and a major element involves giving fact checking organizations unprecedented prominence in the News Feed.
The largest social network in the world is partnering with organizations that have signed on to the International Fact-Checking Network fact-checkers’ code of principles to enable them to verify selected links being shared on Facebook and have those fact checks attached to the original link. This is the first time Facebook has given third parties special placement in the News Feed, which is the biggest single referrer of traffic to news websites in the United States, and a huge traffic driver in other parts of the world.
The company’s leadership is presenting this as a kind of technical tweak that will simply weed out transparent scams. Facebook already enforces various content standards for its site; it could be that the new protocol will affect only ostentatiously fabricated items—“Pope Francis endorsed Donald Trump,” for example—and nothing else. That kind of limited system could run into difficulties—what to do about parody sites, for example?—but would probably not be fatal to Facebook’s mission of free and open communication and debate.
But conservatives are already raising concerns that the new regime will go far beyond its stated aims, and for good reason. In the wake of the election, Clinton supporters eager to blame ostensibly less enlightened people for her loss and media mandarins distressed about the collapse of their authority expanded the definition of “fake news” to include any content they found politically objectionable. The Washington Post published a hysterical report decrying the supposedly vast influence of fake news that relied on a now-discredited report that used broad and opaque criteria to dismiss partisan news sites as “Russian propaganda.” The anti-fake news crusade, in other words, has gathered momentum in part by exploiting all of the same human impulses that can make actual propaganda so potent in the first place—tribalism, hysteria, and confirmation bias.
And then there is the fact that some of the fact-checkers Facebook has enlisted to help with its effort—most notably, PolitiFact—have a clear record of bias against conservative viewpoints, rating as “true” or “false” statements that are essentially expressions of opinion and then casually mixing their own predispositions with objective facts in a way that tends to subject the Right to greater scrutiny. Not that these fact-checkers favor the far-Left—their partisanship is more often one of “sober-minded Democratic centris[m],” as Nathan Johnson has written.
It’s still not entirely clear how much leeway Facebook will give to fact-checking organizations to designate news and opinion items as “disputed”—Facebook says it will turn things over to fact-checkers based on “reports from our community, along with other signals”—but if groups like PolitiFact had their way, it’s almost certain that conservative-leaning publications would face a non-negligible disadvantage in Facebook’s regulated marketplace of ideas.
As its political influence grows, Facebook is increasingly finding itself at the center of partisan firefights. The company is now gambling that it can respond to the latest round of criticism in a way that is muscular enough to placate its legions of post-election left-wing critics without so antagonizing the Right that it loses credibility even further and any content checks it does implement are simply disregarded. This is a delicate tightrope to walk. If elites in business, government, and media respond to reports about the real but limited problem of “fake news” by de-legitimizing non-liberal opinions more broadly, they will simply undermine their authority further and reinforce the problem they are hoping to solve.