Facebook is Already Conservative
I recently published a book, Selling Social Media. A major point of that book is to offer a theory of medium specificity in the value proposition of each major social network platform.
President Trump seems to believe that there is some sort of legal default that says all networks should run with no management, or at least that there is a conspiracy of Left Liberals to make him look bad on the internet. A very small number of Facebook employees seem to agree with this theory of bias, complaining about Facebook’s Fortune 500 monoculture. Why do I say a small number? More than 25,000 people work at Facebook; the memo is signed by dozens.
Let’s make this quick and to the point: there is no conservative version of Facebook. There is no open internal culture at Facebook that would produce such a result. Most organizations actively suppress unpleasant workplace behavior and align themselves with liberal pluralism. They do this because it helps them succeed. Every employee who wishes to join an organization must meld with that organization and its culture; that is how jobs work.
What is Facebook and why does it work?
Facebook is an affectively modeled social network that literalizes a user’s network. The primary reason individuals use the platform is to experience ambient awareness of that network and to give and receive social support. In short, people use Facebook because it helps them feel entertained and supported by a network of friends and family.
Medium specificity, as a theory, holds that different interface and network choices resonate with different affective positions. Twitter is designed to produce kairos, the sensation of a point in time; Pinterest is extremely enjoyable and doesn’t make folks anxious; the list could go on, but you get the idea. Many of the choices a social network company makes are mutually exclusive: as you flip the switches, the feeling of the network modulates in particular ways.
Facebook works because it provides supportive, interesting content from people you have approved, through friending, to look at your stuff. The central concerns for Facebook as a company hinge on helping folks post more things that people want to see and on encouraging supportive interactions. Facebook fails when the wrong content is delivered or boundaries are not maintained.
The central tension and the death of the network
Selling advertising requires some sense of audience. For years Facebook promised a model of market segments that asymptotically approached the individual; these days, not so much. The boundaries are made somewhat blurry by the advertising model. The balancing act is how much other material to let in, how to decide what is allowed and what is not, how much to share with advertisers about users, and so on.
To thrive, Facebook needs to take an active hand in managing the feeling of the feed. Its core product is not just a programmatic advertising method but a form of televisuality built from automatically curated user content. Facebook needs to work from a theory of how to make something nice, which also means not doing things that aren’t nice. Lunches and babies and social support: nice. Bullying, conspiracy, trolling: not nice. It really is that simple.
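To make the idea of affective modulation concrete, here is a toy sketch, not Facebook’s actual ranking system, which is proprietary and far more complex. Every signal name, label, and weight below is invented purely for illustration: a feed that sorts on raw engagement alone surfaces whatever provokes, while a modulated feed boosts supportive content and suppresses bullying, conspiracy, and trolling.

```python
# Toy illustration of affective modulation of a feed.
# Not Facebook's system; all labels and weights are invented for this sketch.

from dataclasses import dataclass, field

# Hypothetical affect labels an upstream classifier might assign to a post.
NICE = {"social_support", "family", "food", "celebration"}
NOT_NICE = {"bullying", "conspiracy", "trolling"}

@dataclass
class Post:
    author: str
    text: str
    affect_labels: set = field(default_factory=set)  # assumed classifier output
    engagement: float = 0.0                          # e.g., predicted likes/comments

def modulated_score(post: Post) -> float:
    """Score a post so the feed 'feels nice': reward supportive affect,
    heavily penalize not-nice affect, regardless of raw engagement."""
    score = post.engagement
    score += 2.0 * len(post.affect_labels & NICE)
    score -= 10.0 * len(post.affect_labels & NOT_NICE)
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    """An unmodulated feed would sort on engagement alone;
    the modulated feed sorts on the affect-adjusted score instead."""
    return sorted(posts, key=modulated_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("aunt", "Look at the baby!", {"family", "social_support"}, engagement=3.0),
        Post("troll", "Wake up, sheeple", {"conspiracy", "trolling"}, engagement=9.0),
        Post("friend", "Lunch today", {"food"}, engagement=1.0),
    ]
    for post in rank_feed(feed):
        print(f"{post.author}: {post.text}")
```

The point of the sketch is the asymmetry: on engagement alone the troll post wins, while the modulated feed pushes it to the bottom. That downranking choice is exactly what the objections discussed below want to undo.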
Why did they let it get this bad?
For years Facebook acted as if it were entitled to a quality flow of user content; it allowed bad actors to mine data and relied on a theory of the public sphere as self-organizing. For the most part the underlying logic was a Hayek-lite, cyber-utopian blend. If you assume that the better argument always wins, you can allow all arguments to collide and the best should rise to the top. Thus Zuckerberg’s older theory: add more voices. Even if the bad guys are winning and poisoning the platform, you can always add more voices, and eventually you will find the right one to beat them. You can see why Zuckerberg was blindsided: this theory is mutually exclusive with the affective modulation thesis. And if the stock market loves you enough, it just doesn’t matter if you do a bad job. This is why overcapitalization is so dangerous for markets: it drowns out the signaling mechanisms that made markets work in the first place.
Twitter, unlike Reddit, Facebook, and YouTube, has tried to take a lighter touch. So far it is not working that well. There is also a certain magical appeal to this approach: don’t do anything and everything will work out. Or, as Homer Simpson put it in his campaign for sanitation commissioner: “Can’t someone else do it?”
Facebook is Already Conservative
If Facebook were to actively abandon affective modulation, the platform would be destroyed. I argue this in my book, and I think it is largely an empirical question at this point. We know that affectively well-modulated, self-organizing systems do not emerge from online publics. It makes sense, as the NYT reports, that the author of the Facebook conservative memo is a fan of Ayn Rand; this is someone who really, truly, theologically believes in self-organizing systems.
The last ten years have been a natural experiment — if we create massive swarm systems, does collective intelligence emerge spontaneously? No. It does not. Responsible new media firms are actually developing meaningful theories and moving beyond magical thinking. This is good and productive.
At the same time, this is already profoundly conservative. Facebook is not trying to change the world; at its best, it is maintaining the affective worlds of its users. The affective model of the friends-and-family network seems to reify many conservative ideas. Political speech happens on the platform, but Facebook is generally interested in liberal democracy. Facebook is taking the truly conservative position: preserving as much existing structure as possible.
Reactionary objections to Facebook’s operations are literally calls to demodulate the affect of the platform, which is to say, to destroy it. Facebook’s leadership, then, should make the conservative choice: continue with the status quo.