All Roads Lead to Rogan

Originally published in GRAPHITE Journal

It is becoming common to take for granted that social media platforms like YouTube contribute to reactionary political views. The radicalization pipeline, that conspiracy of algorithms that pushes viewers toward more extreme content, is decried for directing our attention, appropriating it against our will for the benefit of ideas we may have never considered otherwise.

But the theory of radicalization pathways on YouTube paints over, with broad strokes, the experience of using the platform. YouTube's content environment does not appear to be unilaterally directed—every rabbit hole leads to another one, and there is no bottom. We encounter less of a directed pathway or pipeline than an intermixing of special-interest topics, some of which appeal to us more than others.

If algorithmic filter bubbles have a goal besides the optimization of attention and profits, it is not to orient people toward particular views, but to provide them with content that resonates with their existing values and dispositions. In the case of politics, this is content that substantiates subjective values and refutes others that oppose them.

This type of content manifests most visibly through the language of "common sense." Common sense reasoning captivates a large, apparently active audience, as on popular channels such as Joe Rogan's. It appears apolitical by its very nature; that is, to anyone who already shares its political presuppositions. If YouTube algorithms have an influence on political rationality, it is through delivering content that appeals to a viewer's specific sentiments, such that it can manifest as common sense.

From this perspective, insulation and inculcation aren't effects of the algorithm itself, so much as the appearance of its application to a politically fragmented world. Rather than prisoners inside of filter bubbles, we find partisans who weaponize them. The filter bubble gives them the opportunity to smash the like button, frequently, and with as much emotional intensity as possible.

The alt-right political figure of modern consternation is less a radical, pushed into extreme views by algorithms, than a sophist. This is a person obsessed with collecting rhetorical arguments and statistical examples that support their views. They have little regard for other ways of being, thinking, and acting that are illegible to the logic they already consume. For this sophistry, an appeal to rationality and a definitive truth is not the enemy, but its bread and butter. This is evidenced by the popularity of "gateway" channels to otherwise radical content: they display a strong commitment to "facts and logic" that refute opposing values.

We should interrogate this language of "common sense," as well as how the form of the filter bubble leverages—and is designed to support—its aesthetics to captivate an audience. The idea of a radicalization pathway presupposes a spectrum of left and right political values, skewed to either extreme. But YouTube's algorithms are much more effective at demonstrating to viewers the utility of sophistry. They do not need radicalization pathways to do this; they simply help viewers collect arguments that support their views.

It is not controversial to say that political media takes advantage of its viewers' dissatisfaction, resentment, and confusion to gain their attention. But social media's real innovation is to tailor this mechanism to multiple political perspectives simultaneously, driving all viewers to content that supports their views. Here there is some truth to the consternation that social media foments political polarization. But that is less because people have forgotten how to talk to each other, than because recommended content has made them too good at doing so.

Pick-up artist tutorials on YouTube provide us with a clear illustration of this logic. These videos, often disseminated through ads, leverage the viewer's insecurities and frustrations to provide them with rhetorical and behavioral strategies for improving their circumstances, ostensibly on the way to overcoming their sexual dissatisfaction. Here the intent to persuade is more candid than in "common sense" language—which must, in order to appear as such, act as if no persuasion is taking place. But the aesthetics taken up by pick-up "artists" are the same: providing viewers with objects of attack, a repertoire of offensive rhetorical techniques, and a focus on the effects of language instead of empathy.

The idea of a radicalization pathway is still useful for theories of deliberative decision-making, which assume that exposure to diverse arguments and fora for discussion is the substance of advanced political thought. Only then can algorithms be at fault for denying us alternative perspectives. But if people already gravitate toward views that support their material and affective interests, then a diversity of views does little to introduce them to new ways of thinking. And an algorithm that appears impartial to this way of thinking will be decried; which is to say, it will be politicized.

The trick is that an algorithmic recommendation system can be impartial to a left-right political divide, while still amplifying particular political aesthetics that influence this divide, even when they transcend its dichotomy. The strategic entertainment of conspiracy theories by Russia Today and China's "50 Cent Party" of online commentators attests to the ability of alternative views to undermine confidence in a political position. But while these methods can be seen to frustrate political perspectives at the margins, they also radicalize political views at the norm, which come to manifest as the only salient ones.

As long as the algorithm appears responsible for driving viewers to extreme content, there is no need to challenge this content in the first place; instead the solution to enforcing democratic values lies in algorithm design. This works out such that both progressives and conservatives view YouTube as partial to the opposing side. In response, the algorithm should simply be designed to uphold free speech—with the routine exception of targeted censorship—which works to its financial advantage.

To this end, any promising solution would lie in reframing the way we conceptualize the influence of algorithms. One way would be to acknowledge YouTube as a repository of tailor-made ideas instead of a system for homogenizing them. To contest its effects, we have to acknowledge the opportunities for thought it provides to people by responding to their dispositions, rather than locating the origin of this thought in the platform itself.