Facebook’s Dystopian Definition of ‘Fake’

For the social-media platform, a doctored video of Nancy Pelosi is content, not a phony.

This line of thinking seemed to perplex Cooper, and rightly so. Why would an immediate impact, such as inciting violence in an acute conflict, be wrong, but a deferred impact, such as harming the reputation of the woman who’s third in line for the presidency, be okay?

Once the content exists, Bickert implied, the company supports it as a tool to engender more content. “The conversation on Facebook, on Twitter, offline as well, is about the video being manipulated,” Bickert responded, “as evidenced by my appearance today. This is the conversation.” The purpose of content is not to be true or false, wrong or right, virtuous or wicked, ugly or beautiful. No, content’s purpose is to exist, and in so doing, to inspire “conversation”—that is, ever more content. This is the truth, and perhaps the only truth, of the internet in general and Facebook in particular.


Some journalists, commentators, and observers seemed to empathize with Bickert’s position. “Think about the implications if they did delete it,” the venture capitalist Kim-Mai Cutler said on Twitter. “Would you want this company being the arbiter of truth of billions of videos a day?” The University of California at Irvine law professor David Kaye noted that it’s not so easy to “draft the rule that prohibits [the] doctored Pelosi video but protects satire, political speech, dissent, humor, etc.” And the New York Times technology journalist Farhad Manjoo invited suggestions for a “specific policy Facebook should adopt to remove this video but not other edited videos,” suggesting that the answer was hardly obvious.

These interventions are telling, because they take for granted that a simple or juridical process is necessary or desirable for Facebook to operate. They seek general rules rather than specific actions. But Facebook is not a court, or a state, or—by its own insistence—even a media company subject to defamation or libel law. That means Facebook can do whatever it wants, whenever it wants. It can take down breastfeeding posts if it thinks they contain nudity it has decided it doesn’t want on its platform. It can take down pages for alleged copyright infringement, no matter the veracity of those claims, because the Digital Millennium Copyright Act’s safe-harbor provisions protect that kind of corporate overreach. And yes, it can continue to disseminate a video that dangerously misrepresents the speaker of the House just because it feels like it.

If it chose to do so, Facebook could also remove the Pelosi video for no reason whatsoever, or for an official reason that might make as little sense as the rationale for retaining it. Facebook is a private company in the business of capturing and harnessing public attention. Absent a compelling reason to remove popular content, Facebook would rather benefit from the exchange of symbols and ideas about that content. That’s hardly a novel insight into how Facebook operates, but Bickert confirmed the matter in an official way during her CNN appearance. That’s the conversation.
