Decentralized Censorship

The consequences of the privatization of the public sphere

As public discussion and commentary shift from physical gathering places to social networks on the Web, the public sphere—if there is anything left of it—becomes privatized. In this new privatized public sphere it is no longer just the laws of the state that control and censor what can be said and done there—now this prerogative falls to the companies that run the networks. Some will object that nobody is forcing people to use these private networks. This is irrelevant, for the simple reason that people are using these networks in the manner of the public gathering place, that is, they are using them as a public sphere—whether they use them under compulsion or voluntarily is of no importance. The result is that the public sphere has been privatized.

The biggest problem with this is not that companies can now censor discussions, but that censorship becomes invisible. Owners of private physical places open to the public can enforce censorship and rules over how those places are used, but these rules are obvious to all who go there because, unlike messages on digital networks, you cannot simply mute someone’s words in broad daylight. If someone breaks your rules, you have to confront them and, if they fail to comply, eject them from your property, an action that everybody present will be able to see. Social networks, on the other hand, can censor messages without anyone seeing, and can even go as far as to ban everyone else from seeing what a user is posting without that user even being aware that they have been banned. What this sort of censorship resembles is not a set of rules on private property, but repression in tyrannical regimes, in which books and neighbors simply disappear, never to be seen again.

What is lost in the privatized public sphere is not only control over censorship, but even the awareness that one is being censored. People’s minds are funneled through a managed simulacrum whose existence they do not know of and whose parameters they have no control over. Perhaps worse still, the people running the simulacrum are not external to it, not isolated from it, but themselves funnel their own minds through it daily; and because the system is such that no one mind controls the whole, that no one algorithm filters the flow, they themselves become the victims of a system of censorship for which nobody is responsible, a system that evolves by itself: an independent process stuck in a positive feedback loop.

To have a regime ban a list of books or media organizations is one thing; it is quite another to live under a system of censorship algorithms whose extent nobody knows and for which no individual can be held responsible. In the former case, even if the regime censors in secret, there still remains a limited number of people who manually control the process, which means that the process can be changed once those people lose power. In the latter case, the decentralization of the process (the shift from centralized state censorship to the decentralized censorship of private companies), coupled with its automation (the shift from manual censorship to algorithms), takes the process out of the sphere of individual control; and since it affects the very individuals who manage it (since they use their own networks), it conditions the minds of the censors themselves in ways they no longer fully control.

September 2016