Glowing white dots fade across a black screen. Is it a screensaver? One emerges with a slightly different behavior, swallowing a dot whole and then disappearing with it. "Sometimes, these turn up. At first, they look like the other dots," says Harue, pointing to the screen. "What are they?" asks Ryosuke.
"I don't know."
This is a scene from the 2001 Japanese horror film Pulse (回路). The movie's characters are haunted by interfaces that present them with incomplete snapshots of reality, glimpses into the lives of unknown strangers, and uncanny images of their own bodies seen from behind. In a world where everyone is stalked by digital sensors, Pulse gives us the interface, and then something else. Something neither here nor there, which eludes total comprehension but manifests as the trace of a forgotten logic.
"What's it for?"
"A miniature model of our world... But only the grad student who designed it understands it."
Beneath the surface of what we see on screens, we can tell that something is changing, but we're not always sure how or why. It's easy to rationalize this feeling as the work of algorithms, designed to influence us according to some proprietary plan from Silicon Valley. At the same time, we can't prove any of this firsthand – all we have is a feeling of anticipation, excitement, or prying eyes that persists despite any and all intentions to evade it. There is a sense here of what the media theorist Eugene Thacker calls dark media: objects that present otherworldly phenomena to us in a broken way, evoking a feeling of estrangement through their incompleteness. The question arises: if we could open up the algorithmic black box, would we find anything there? Or is it precisely through its sensible manifestations that we begin to comprehend the effects of algorithmic logic in the first place?
All the while, our explanations of algorithms remain far more mechanical. Algorithms are described as sequences of rules or recipes, codes of law that regulate and normalize what we can see, do, and think. They are seen to enact their normalizations either through a determinable code, obscured behind a black box, or through operations on human subjectivity, obscured behind the original black box of human consciousness. This sense of being regulated and controlled by computational protocols motivates many to argue that we can think of algorithms as governance systems. If it's an algorithmic code that's governing us, then perhaps the path to emancipation is to take control of it, repurposing it for good...
But how emancipatory is it to take control of a system which haunts, entrances, invokes suspicion? Is a paranormal interface made out of codes of law that we could legislate to better ends, or something more sinister? Could we harness its power of abstraction, while exorcising its ghosts?
What is unique about algorithms, especially in contrast to other media, is their capacity not only to synthesize phenomena, but to arrange these phenomena so that they correspond to material circumstances, as well as to optimize these arrangements according to certain heuristics – quantifications of engagement, belief, or trust. This faculty of paramediation presents problems not only for understanding algorithmic governance, but also for thinking governance as such. We are dealing with media that are simultaneously true and false, and definitively neither; phenomena derived directly from data collected about material circumstances, but also parametrically designed to deviate from them according to certain heuristics. Paramedia evoke what Jean-François Lyotard called paralogy: the production of meanings that are evaluated on account of their practical efficacy rather than their ontological accuracy or epistemological generalizability. Paramediation enables the arrangement of phenomena that correspond to reality, but which can't be said to have ever been located there – a phantom epistemology.
While we continue to think of algorithmic media as regulatory protocols, a more generative analogy might be found in interactive media, especially in the horror genre. The uncanny fluctuations of the "hypernudge" – sending you push notifications at opportune times based on your data – express less a "regulation by design" than what the developers of Amnesia: The Dark Descent realized about horror: it is far more effective to send someone vague signals based on their behavior than to control their behavior outright. Enemies should affect the semiotic environment, but if they control behavior directly, they ruin the sense of fear they can evoke. We should take care to recognize that this is something different from a procedural rhetoric, which uses a sequence of rules to construct an argument that appeals to logos or pathos. It is something much more direct, even visceral: a procedural horror where the medium is less a message than a sense of dread.
But the full implications of paramedia are difficult to track without considering their relationship to the truth. Here, paranormal investigation apps called ghost boxes give us a sense of how crime prediction algorithms mediate the truth, even more so than the mathematical formulas that underpin them. Ghost boxes make no definitive statements about whether ghosts exist; they simply sample radio waves, annotate them based on their form, and leave it up to users to decide. And they are no less influential. Suggestive mediations of this kind have been weaponized in the crime hotspot, which is supposed to reveal places where crime is likely to occur. The hotspot is less a decisive regulation or representation of reality – that is, something that makes a definitive decision about the truth – than an equally lethal composition of data, parametric framing, and loose congruence with the built environment, deployed to evoke a sense of suspicion in certain places. For the ghost box and the hotspot alike, statistical inferences tailored to data optimize, distribute, and concentrate sensations of anticipation and dread. The charge that these systems fail to represent the truth is met with the response that this is not their purpose at all – they merely aim to suggest something about it, given the resources available...
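The ghost box's logic can be made concrete in a few lines. What follows is a minimal sketch, with every name, label, and threshold hypothetical (no actual ghost-box app works exactly this way): noise is sampled, windows whose form crosses an arbitrary parametric threshold are tagged with suggestive labels, and no claim is ever asserted – interpretation is left entirely to the user.

```python
import random

# Hypothetical labels: they suggest, they never assert.
SUGGESTIVE_LABELS = ["presence?", "voice-like", "anomalous energy"]

def sample_noise(n, seed=None):
    """Stand-in for sampling radio static: n random amplitudes in [-1, 1]."""
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(n)]

def annotate(samples, window=8, threshold=0.5):
    """Tag windows whose mean absolute amplitude exceeds the threshold.

    The threshold is a parametric choice, not a fact about ghosts:
    tuning it up or down changes how often "anomalies" appear.
    """
    rng = random.Random(0)  # fixed seed so label choice is repeatable
    annotations = []
    for i in range(0, len(samples) - window, window):
        chunk = samples[i:i + window]
        energy = sum(abs(x) for x in chunk) / window
        if energy > threshold:
            annotations.append((i, rng.choice(SUGGESTIVE_LABELS)))
    return annotations

if __name__ == "__main__":
    signal = sample_noise(256, seed=42)
    for position, label in annotate(signal):
        print(f"sample {position}: {label}")
```

The point of the sketch is that nothing here decides whether anything paranormal exists; the `threshold` hyperparameter alone governs how much "anomaly" the user is shown, which is precisely the parametric framing the hotspot shares.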
The likeness of the hypernudge and the crime hotspot to horror media lies in the fact that all are concerned with designing computation to elicit particular perceptions, leveraging procedural and parametric modulations of algorithmic semiotics to induce sense without regulating activity. This capacity suggests that, against calls to open up the black boxes of algorithms to inspect their code or ontologies – their declarations about what exists or ought to exist in the world – we would do better to understand perception as a target of algorithm design all along: how developers consider the ways that algorithms will be perceived, and how these considerations are encoded in the arcane logic of hyperparameters. Such a shift from ontology to phenomenology, and namely to a post-phenomenology of the material practices that induce perceptions, would concern how algorithms are designed to figure and moderate phenomena according to data collected about events, activities, and reactions.
As algorithmic techniques become more advanced at balancing between normal data collected about material circumstances, and paranormal modifications designed to elicit particular responses, there still remain moments of rupture where something feels off. When paramediation goes wrong, it may manifest as the algorithmic paranormal, or uncanny figurations that betray their artificiality. From synthesized human faces with mutated edges, to the creepy feeling that recommended content is a little too relevant, we catch glimpses of what is at stake in the calculated arrangement of appearances. Attending to these parametric aesthetics, or paraesthetics, can help us to identify when paramediation is at play, and to discern what algorithms are designed to do to us, beyond sorting our information or controlling our activity. Rather than identifying whether data or interfaces represent the world accurately (mimesis), or insisting that they should augment human actions with tailored abstractions and automations (prosthesis), we examine how algorithms enable the arrangement of appearances according to data, as well as the optimization of these appearances in order to evoke particular sensations (paraesthesis).
We need to take account of these paramediations, understanding their contours and threats for epistemic justice: how they can be mobilized to influence behavior and drive perceptions, and by whom. Such a project might take the form of a bestiary of algorithmic dark patterns, documenting how specific arrangements of phenomena are designed to entice and seduce behavior. It might look more like Gilles Deleuze's creative taxonomy of cinematic semiotics, where every relation between images evidences an underlying architecture of signs, capable of evoking more or less alien affects. But it also might go beyond the canon of taxonomies, to implement creative appropriations of paraesthetics. It could take the form of what Ezekiel Dixon-Román calls Black techno-conjuring: the intensification of discrepancies between data and algorithmic inferences that result from an attempt to force "the creative indeterminacies of Blackness" into code. In any case, these experiments suggest that there is no one algorithmic mechanism that lends us the prospect of seizing control and emancipating ourselves from its logic. Instead there are many, each one producing various types of perceptions and disjunctures that we need to learn how to be sensitive to; that is, to learn and share experiences about their effects, on the way to a more widespread recognition of their politics.
Paramediation is more than a question of biased data, automation, misrepresentation, quantification, or even prediction – questions which concern the fealty of data to human judgment, to ideal configurations of the status quo, and more generally to the truth. But the truth is always behind the data; the question is how its appearance can be systematically arranged to have certain effects. This does not lend itself to being thought in terms of longstanding values of equal representation, freedom of choice, accountability, participation, and transparency. While these principles have operated as guarantors of human liberties, given a specific understanding of the relationship between individuals and the state, they are beholden to specific assumptions about how technical media operate in relation to human subjects. While heuristics of representation and transparency are suited to identifying whether media represent reality truthfully or accurately, they are inadequate to addressing the modes of perception and interaction that algorithms facilitate.
As more and more critics take aim at the deceptive logics and dissimulations of algorithms that misrepresent the world (mimesis), and others seek to repurpose these mediations for their capacity to abstract the world and augment human ability (prosthesis), what we require is a break from the idea that algorithms exist primarily to represent the world in the right way or not. Indeed, designers have already set their sights beyond ontological accuracy: on the design of incentives or algorithmic experiences. In light of these experiments, we must learn to be sensitive to the effects of technical practices that can evoke sensations by coordinating phenomena in relation to one another. It is now widely understood that algorithmic power is informed by exploitative relations of class, race, gender, and ability, while it also reinforces them. At the same time, the technical capacity of algorithms for behavioral discrimination and phenomenal arrangement is irreducible to thinking these relations in terms of algorithm logic. As the black box is exorcised of its demons, we will find that the problem was never to be found there at all.
This notion, that social relations are actually embedded in algorithm code, obscures precisely how code is designed to operate on the order of appearances: to produce effects that are designed with human perception in mind. The materiality of algorithms is irreducible to their code; we must recognize human perception as a material aspect of algorithm operations as well. Without this regard for paramediation, we are left with a desire to reprogram the code, as if this would reprogram our social relations as well. But algorithms are not merely codes of law that regulate activity – they are dynamic media that participate in our capacity to make sense of the world. Far from enclosing real phenomena in a black box of quantification, deception, or ideology, paramedia inflect their appearances in ways that are diverse, carefully calibrated, and not beholden to any single regime of regulation – and no less violent. It is only by recognizing this diversity of practices that we can acknowledge their power, and overcome the desire to take control of them once and for all.
"It's something we programmed here...
I wouldn't suggest staring at it for too long."