Facebook does not control what news people see

(Written by Lawrence Krubner; indented passages are often quotes.) You can contact Lawrence at lawrence@krubner.com, or follow me on Twitter.

If Facebook stops promoting viral content, the company will quickly disappear, just as MySpace did:

    First, there is no incentive for Facebook to do any of this; while the company denies this report in Gizmodo that the company shelved a change to the News Feed algorithm that would have eliminated fake news stories because it disproportionately affected right-wing sites, the fact remains that the company is heavily incentivized to be perceived as neutral by all sides; anything else would drive away users, a particularly problematic outcome for a social network.

    Moreover, any move away from a focus on engagement would, by definition, decrease the time spent on Facebook, and here Tufekci is wrong to claim that this is acceptable because there is “no competitor in sight.” In fact, Facebook is in its most challenging position in a long time: Snapchat is stealing attention from its most valuable demographics, even as the News Feed is approaching saturation in terms of ad load, and there is a real danger Snapchat will beat the company to the biggest prize in consumer tech: TV-centric brand advertising dollars.

    There are even more fundamental problems, though: how do you decide what is fake and what isn’t? Where is the line? And, perhaps most critically, who decides? To argue that the existence of some number of fake news items amongst an ocean of other content ought to result in active editing of Facebook content is not simply a logistical nightmare but, at least when it comes to the potential of bad outcomes, far more fraught than it appears.

    That goes double for the filter bubble problem: there is a very large leap from arguing Facebook impacts its users’ flow of information via the second-order effects of driving engagement, to insisting the platform actively influence what users see for political reasons. It doesn’t matter that the goal is a better society, as opposed to picking partisan sides; after all, partisans think their goal is a better society as well. Indeed, if the entire concern is the outsized role that Facebook plays in its users’ news consumption, then the far greater fear should be the potential of someone actively abusing that role for their own ends.

    I get why top-down solutions are tempting: fake news and filter bubbles are in front of our faces, and wouldn’t it be better if Facebook fixed them? The problem is the assumption that whoever wields that top-down power will just so happen to have the same views I do. What, though, if they don’t? Just look at our current political situation: those worried about Trump have to contend with the fact that the power of the executive branch has been dramatically expanded over the decades; we place immense responsibility and capability in the hands of one person, forgetting that said responsibility and capability is not so easily withdrawn if we don’t like the one wielding it.

Post external references

  1. https://stratechery.com/2016/fake-news/