Facebook has announced some major changes to how content will be placed in the newsfeed. Why is this news? Facebook has routinely changed the process by which content is placed in the feed. What is new is that the process used to be opaque: no one really knew what Facebook did or why. For example, just two months ago Facebook decided to allow its closest publisher partners to begin deploying sponsored content in the feed. What a difference ten weeks make. From double dipping to banished.
Facebook’s new position on friends and family marks an important change, as they are actually letting on what they plan to do with the platform. Mike Isaac and Sydney Ember, writing in the NYT, do a fine job covering the main storylines: Facebook changes the formula regularly, many businesses are dependent on a totally unstable platform, nearly half of all referral traffic on the web comes from the platform, and Facebook can choose at any time to shut off that traffic (see Zynga, Upworthy, and whoever else). Publishers are irate. Each time Facebook improves the feed, their traffic seems to decrease. After all, Facebook works “to understand what kinds of stories people find misleading, sensational and spammy, to make sure people see those less.” You know what gets clicks: the sensational, the misleading, and the spammy. It was common advice after the first big change in the newsfeed to write even spammier headlines. An entire cluster of spam companies rose out of that algorithm: those golden days of Upworthy, ViralNova, Buzzfeed, and HuffPo. Remember “filter bubbles”? Eli Pariser turned around to deploy clickbait for good with his startup Upworthy, which went on to be relegated from the feed as a superspammer.
Facebook is Old Media
Facebook isn’t “new media.” My go-to example is CBS: Facebook is about the same size as CBS. They also act like CBS. Would Moonves allow you to run free commercials on the diamond network? Of course not. Why would Zuckerberg? For a number of years, the easiest way to write a new media research article was to find some way in which the new media was in fact old. This became such a common move that professors steered their grad students away from making the argument. After Gitelman and Pingree, the shelf life on this argument was short. Yes, FB is the dominant channel of social media in the United States. It is also done growing. The people who want Facebook have it. They are also increasingly skilled in the use of Facebook; they don’t click willy-nilly. If international news doesn’t regularly cross your feed, blame data sovereignty laws and increasing nationalism worldwide. Silicon Valley is looking smaller and more provincial all the time.
The era of “Free Clicks from Facebook” is over. Facebook wants to control the placement of ads in the mobile feed. Publishers are right to fear Facebook, just as authors feared publishers. The publishers have been reduced to mere authors on the FB publishing platform. With Twitter decaying, FB is increasingly the only game in town.
Facebook wants to get out of the Editorial Game
Just days before the new newsfeed, Facebook announced that existing bias-training courses would be changed to include “political” bias as a category. This is an interesting idea, but what would that really mean? Would Facebook editors be involved in deciding what is factual? In a world with active conspiracy theory communities in fields as diverse as climate science, virology, economics, sociology, and civil engineering, how would Facebook even manage to keep straight what the “truth” is, especially when some of these truths could easily be political constructs? Would they build an internal version of Snopes? Could they sell this as an API to other firms? Or would they be better off filling the feed with their bread and butter: content from friends and family? How would the public react to knowing the list of authorized facts from King Zuckerberg? Would that list of facts finally catalyze the formation of a “Conservative Alternative” to Facebook? Would it be called Limbaugher? Could Congress regulate that list of facts? Does Facebook really want these questions answered? No, they really don’t.
Worse for Facebook, the revelation that human editors were quite involved in trending topics presents a nightmare scenario for litigation. Section 230 of the Communications Decency Act provides immunity from suits for services transporting third-party content. If the process of writing the headlines was automated, that is one thing; if Facebook is involved in writing the headlines, its liability profile may change. Airbnb is already planning to force the Section 230 issue in another case. This is not to say that liability is a straightforward matter here; of course there would be plenty of conventional defamation defenses, but a move to muddy the water with Section 230 would be much more difficult.
As a mature company, it is far simpler for Facebook to control costs than to grow new revenue. Employing human editors, even as third-party contractors, is expensive. The more Facebook can automate processes and strip away entire subsets of potentially offensive content, the more it can cut wages in human-staffed roles. This is not just a matter of cutting wages in the editorial division, but also in content moderation. A severely restricted feed could decrease the number of content moderation flags. Facebook employs many people to check those flags, in what Adrian Chen covered for Wired as a horrible job. This is not to say that these jobs would cease to exist, but a happier, friendlier, family-oriented feed would likely see fewer flags attached. And those Facebook reviewers also pose problems: remember those years when they objected to breastfeeding photos?
Affective Modulation is Critical to Social Network Success
In other places I have described the idea of affective modulation: the idea that the affect of users/viewers of media changes over time, and that the task for flow in a post-network television/social media world is to precisely modulate advertising content to the affect of the public and to cautiously attempt to modulate the affect of that public. Stories from friends and family were already prioritized in the feed because they are interesting. Facebook wants you to come to Facebook and be happy. A happy world might be a good world? At least Facebook is consistent.
You might say: Dan, people need alternative viewpoints, the echo chamber is destroying democracy, the filter bubble, etc. Aside from the alternative to the filter bubble being the spam sham Upworthy, there is no reason to believe either that people have access to fewer opinions today or that access to more opinions would do anything. First, the network news era was incredibly homogeneous. Just because you had only three options, and likely really one, didn’t mean you believed them all. As more options became available, people chose to take them. Conservative talk radio and Fox News didn’t appear by accident; they, and the Internet as a whole, were a response to an unserved need for an alternative viewpoint. Don’t make out the 1980s to be a time of heavy reading and intense, informed public debate. It wasn’t. People then had access to less information with less variety. There is no reason to assume that less news on the feed would have any impact on democracy, especially since the rise of populism doesn’t seem to have been stemmed at all by the current FB newsfeed. Second, opinion polarization is real. Americans don’t respond well to challenging information. Confronting them with your chosen “correct” viewpoints may drive them further into their current positions. If your model of the public sphere and persuasion is based on the idea that everyone else is simply misinformed and you are right, reconsider. They likely think the same thing about you. It doesn’t pass the mirror test.
Don’t Fear the Feed
It is good for people to experience ambient social support. It is highly likely that this feed change will make Facebook better and facilitate more positive interaction between people. Sure, this is good for Facebook’s bottom line. But then again, why should I care more about the Washington Post’s? The biggest challenge would seem to be the claim that publisher content is good for democracy, which is specious in the first place. If friends and family talk politics, something interesting and productive might happen. Remember: Facebook never said it was banning content from the feed, just that friends and family would be prioritized, and to be honest I trust their judgment far more than the editors of The New Republic. Facebook is clear: your feed is “subjective, personal, and unique.” The idea some publishers had that the feed would be a free-for-all mistook Facebook for Twitter. Why do they care about FB? Because Twitter was awful. Why should FB be like TWTR even a little bit?
Don’t build a business assuming that FB will give you free publicity. If you are a reader, consider reading magazine or newspaper websites directly. It is a few extra clicks, but it will get you out of FB’s closed system and likely make you the friend who adds new stories to the feed. Finally, enjoy your social media.