This Wasn’t a Breach

Dan Faltesek
4 min read · Mar 19, 2018

You may have noticed the excellent story by Carole Cadwalladr and Emma Graham-Harrison on the mass collection of Facebook profile information by Cambridge Analytica. If you have not: read it here. Basically, by using a quiz app, researchers were able to collect vast troves of Facebook user data; when that data was combined with default public information, they were able to produce two million high-quality psychographic files. This is not a new idea: the psycho-demographic hypothesis is an older idea from advertising, particularly geographic research on magazine sales. As this story has unfolded, the issue is not the idea of psychographic research, but the degree to which Facebook was a partner in such enterprises.
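
To make the mechanics concrete, here is a minimal sketch of that joining step, written in Python with pandas. Every name and number in it is invented for illustration; this is not Cambridge Analytica's actual pipeline, only the general shape of combining quiz answers with default-public profile fields.

```python
# Hypothetical sketch: quiz responses keyed by user ID are merged with
# default-public profile fields to yield one "psychographic" record per user.
# All field names and values below are invented for illustration.
import pandas as pd

quiz_responses = pd.DataFrame({
    "user_id": [101, 102, 103],
    "openness": [0.81, 0.42, 0.65],      # scored from quiz answers
    "neuroticism": [0.30, 0.72, 0.55],
})

public_profile = pd.DataFrame({
    "user_id": [101, 102, 103],
    "hometown": ["Portland", "Columbus", "Austin"],
    "page_likes": [312, 88, 540],        # default-public at the time
})

# One merged row per user: personality scores plus public attributes.
psychographic_files = quiz_responses.merge(public_profile, on="user_id")
print(psychographic_files)
```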

It has been standard operating procedure for Facebook to ignore bad news and act as if it can do no wrong.

Facebook is the Mr. Burns of Social Networks
(An image, so Facebook is happy.)

How do you do research with Facebook?

You don’t. At least, not without Facebook’s involvement. There was, at one point, more access to Facebook, either through the company itself or through the Netvizz app. It was a great starting point for the quarter in Cultural Analytics to walk students through downloading their basic social graph from Facebook and visualizing it: a relevant, fun way to open the quarter that made computational research seem less daunting. After the Cornell mood manipulation debacle, Facebook began to clamp down on public tools. The requirement for internal support was framed as user protection, but it was also supposed to slow research that could be negative for the company. It is clear that Facebook was aware of Cambridge Analytica and simply believes that they exceeded their agreement. In both cases, Facebook was aware.
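
For a sense of what that classroom exercise looked like, here is a minimal sketch, assuming a plain edge-list export of the kind Netvizz-style tools once produced. The filename and file format are assumptions; no such export is available from Facebook today.

```python
# A minimal sketch of the exercise described above: load a friend graph from
# an edge list and draw it. "my_social_graph.edgelist" is a hypothetical file;
# each line is "friend_a friend_b", meaning the two are connected.
import networkx as nx
import matplotlib.pyplot as plt

G = nx.read_edgelist("my_social_graph.edgelist")

print(f"{G.number_of_nodes()} friends, {G.number_of_edges()} connections")

# A force-directed layout makes clusters (school, work, family) visible.
pos = nx.spring_layout(G, seed=42)
nx.draw(G, pos, node_size=30, width=0.5, with_labels=False)
plt.savefig("social_graph.png", dpi=150)
```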

Facebook’s Culture

One of the themes I have noticed in Facebook’s discourse is a tendency toward solipsism. Long-running threads of research (like public sphere theory), and even the company’s own bad choices, are ignored, as if they were not part of the authorized world. All there is, is Facebook. At least, that is Facebook’s attitude. Poor crisis and network management follow. In the opening hours of this crisis, Alex Stamos, Facebook’s chief security officer, went out of his way to argue that this wasn’t a data breach. It would have been better if it had been. A breach is one thing; it involves sneaky hackers. Allowing an ultra-right-wing research firm to scrape all the data is much worse. Did Zuckerberg just get back from a listening tour?

Should Zuckerberg testify?

Absolutely, 100% yes. He should tell the truth: Facebook’s leadership has no idea how Facebook works, what it is doing, or why it is successful. At this point the company is coasting on network effects. He should also note that their default position is to assume that laws in the United States don’t apply to them because the Internet is new and special. Four years ago, the United States Supreme Court held in Alice v. CLS Bank that adding a computer to an existing business process does not make that process patentable. The area of law is different, but the point is the same: these are mature firms in a mature industry that is well understood. There is no reason why these firms should be so recklessly managed.

How do the other companies do it?

Twitter takes a lot of heat for the degree to which they have allowed bots to run wild on the platform. I have been skeptical of their entirely laissez-faire attitude. At the same time, there is a benefit to allowing a great deal of access to the platform: researchers working in the public interest are present. Allowing many interested researchers to access the platform for research would…

Open the Platform

It is possible that opening the platform would compromise user privacy or embarrass the company. But it is already clear that the lack of institutional control at Facebook has compromised user privacy and damaged the company. Users now take a public view of their interactions with Facebook. They will not become trusting friends again.

As it stands, Facebook is secretive, and now, by threatening The Guardian, they are taking a distinctive turn toward control that reveals how little they understand boundary coordination. A cascade of research into what Facebook really is and how it operates could restore trust in the platform. More access would allow users to understand how they interact with the system and to see real, interesting research done in the public interest.

It could be that Facebook would require researchers to be validated in some way. Access to private posts or restricted friendships could be limited.

Research is not the problem. Opacity at Facebook is. As it stands, you can’t extract a copy of your own social network, but ultra-right-wing researchers can do basically whatever they want.

Dan Faltesek

Associate Professor of Social Media, Oregon State. These are my opinions, not theirs. Read my book: Selling Social Media (Bloomsbury Academic, 2018).