John Cheney-Lippold, A New Algorithmic Identity - kredati/media-theory-encyclopedia GitHub Wiki

A New Algorithmic Identity

Soft Biopolitics and the Modulation of Control

Imogen B.

SUMMARY

Cheney-Lippold breaks his argument down into four distinct sections. In each, he outlines definitions of significant terms, as adopted from various theorists, and uses these as a foundation for his perspective on the concepts surrounding algorithm and identity.

Code and Categorization

He begins by establishing code as a spatial concept, drawing on Lawrence Lessig’s notion of code as a kind of infrastructure that builds the environment in which we navigate online. In this school of thought, code is seen as an “architecture,” whose intentional and unintentional rules have a profound effect on the ways we are able to behave online (166). In this architectural sense, code builds pathways we can walk down, which open up our online experience but, equally, put boundaries upon it.

Bringing in the concept of control, which is further explored later in the piece, Cheney-Lippold approaches it not only as an environmental force – the structuring of online space – but as a semantic one too. Code “[gives] meaning to digital artifacts” – the websites, images, videos, and objects we observe and/or engage with online – defining certain “values” which trigger the specialized targeting of content to users (166). He posits the example value, X = male, which is used throughout the essay to elucidate further points. The content presented to a presumed male user is found based upon this core value.

Cheney-Lippold turns to Fuller’s simplified rubric for how code interacts with users. This suggests code speaks in two distinct ways. First, there is a “delimiting action.” Users are given a finite number of choices of how to behave. The example provided is a drop-down menu in which the user has a choice of sex: male or female. This stage is objective, collecting quantitative data that code can assimilate into patterns based on value. Second, code is influenced by “cultural discourse” (167). It utilizes real-world understandings of categories such as gender to define its own categories. This stage is subjective, collecting and digesting qualitative information by looking for discernible patterns.

In this section, Cheney-Lippold concludes that codes are “cultural objects” that have been incorporated into our societal structure, by both constructing and assimilating the meaning of categories. These are the categories that arrange populations of people. Code simply translates this meaning into mathematical algorithm.

Categorization

Cheney-Lippold then examines categorization more closely, concentrating on the process of defining ‘X’ from the perspective of a consumer marketer. He states that there has been a shift in marketing strategies, from categorizing consumer populations demographically to categorizing them “psychographically” (167). In the past, demographics would look at the geographical arrangement of users and suggest advertising content based on one’s address and the concordant post code value of that address – that is, the average house prices and incomes of its residents. Now, however, marketers have started using data from search queries to create “databases of intention” (168). These help companies understand what users intend to buy, increasing the effectiveness of targeted advertising, with the overall goal of generating greater revenue. Our actions are codified based on their profitability, forging an algorithmic taxonomy of human behavior.

To ensure this process is consistently efficient, companies collect data in real-time. Code updates itself as quickly as society does. When users make changes to their behavior online, the content they are suggested also changes to align with the algorithm’s new version of their perceived identity. Here, Cheney-Lippold refers to the Deleuzian theory of the changing model of societal control, which suggests that control is a “modulation,” constantly shifting and changing with each new piece of data. Thus, the “architecture” of code previously posited is constructed with an element of elasticity. It can continually update its own boundaries as users move around the open environment (169).
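The modulation described above can be sketched as a toy program. This is an illustrative sketch only, not any real system’s design: the tracked actions, their weights, and the threshold are all hypothetical stand-ins for the kind of behavioural signals Cheney-Lippold describes, using his running example value X = male.

```python
# Toy model of "modulation": an inferred category is recomputed with
# each new piece of behavioural data. All names and weights here are
# hypothetical, chosen only to illustrate the idea.

# Hypothetical weights: how strongly each tracked action is taken to
# signal the category X = male (positive) or not (negative).
SIGNALS = {
    "sports_site_visit": +0.3,
    "cosmetics_search": -0.4,
    "tech_review_click": +0.2,
}

def update_confidence(confidence, action):
    """Shift the inferred-category confidence with one new data point,
    clamped to the range [0.0, 1.0]."""
    shift = SIGNALS.get(action, 0.0)
    return min(1.0, max(0.0, confidence + shift))

def categorize(confidence, threshold=0.5):
    """The algorithm's current, always-revisable identity claim."""
    return "X = male" if confidence >= threshold else "X = not male"

# Each action modulates the perceived identity in real time: the
# category's boundary is redrawn with every datum, never fixed.
confidence = 0.5
for action in ["sports_site_visit", "cosmetics_search", "cosmetics_search"]:
    confidence = update_confidence(confidence, action)
    print(action, "->", categorize(confidence))
```

The point of the sketch is the loop at the bottom: the category is not assigned once but recomputed per action, which is the “elasticity” of the architecture in miniature.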

The key to this form of control is the effect it has on users’ lives, on- and offline. Code controls each individual’s life by “tailoring its conditions of possibility” (169). A user’s online behavior in the present, which is analyzed by code to derive its capitalist utility, is very much harnessed to the possibilities that they will be presented with in the future. This is because algorithm is governed by the generation of capital. The way a user is identified and categorized by code is first and foremost linked to methods of encouraging user spending. Thus, identity online has its foundations in consumption. The search results and advertisements a user is presented with are based on the algorithm’s assumptions of that user’s categorization, and concomitant predictions about their future purchases.

Soft Biopower and Soft Biopolitics

In this section, Cheney-Lippold begins to explore the Foucauldian concepts of biopower and biopolitics. He focuses on biopower in terms of the “construction of identity” (172). Biopolitics, he explains, was introduced in the eighteenth century as a way for the government to have more control over the population through categorization. This is ‘hard biopower’. It “regulate[s] life through the use of categorizations,” such as gender, age, race, and occupation. Cheney-Lippold characterizes the new biopower of the digital age as ‘soft biopower.’ This concerns the how of the process of regulation: how “biopower defines what a population is” and how the categories of that population are established in cultural discourse (175).

He explains that category construction is biopolitical because, as previously established, it affects the opportunities users are given: the search results and advertisements they are presented with, based on the associations built into algorithm regarding, one, their search query and, two, their presumed identity categorization. The modulating architecture of code has afforded an “elasticity” of power, which has created a shift in the relationship between the biopolitical authority and us, the users (174). Cheney-Lippold describes this as a regulation of randomness, because information is so tailored to the individual. This has a twofold effect on the relationship, biopolitically speaking. He uses the example of male-targeted medical advertisements. First, such advertisements can “manage life” at a physical level, reducing STD transfer as just one example. Second, they can manage life at a psychological and intellectual level, by positioning maleness, like any other category, to control the information that presumed male users are exposed to (174). Thus, in the online space, nothing is random. Everything is tailored and controlled.

Control

The final section considers categories as “mechanisms of regulatory control” (175). Conceptually, control is understood as a force of regulation that “opens and closes” the opportunities of data and digital objects a user can come into contact with. The subsequent feedback mechanism functions in this sequence:

  1. A user enters a search query.

  2. The system suggests results and advertisements based on this query and the user’s presumed categorization of identity.

  3. The user makes a selection.

  4. The presumed categorization of identity is reinforced or modulated in response to this action.

Step 4 in this model indicates the system in place for remedying mistakes made in the categorization of users.
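The four-step sequence above can be reduced to a toy loop. This is an illustrative sketch only: the category names, content pools, and the `suggest` and `modulate` functions are hypothetical stand-ins, not any real system’s implementation.

```python
# Toy model of the four-step feedback loop: query (1), suggestions (2),
# selection (3), reinforcement or modulation of the presumed
# categorization (4). All data here is hypothetical.

CONTENT = {
    "X = male": ["razor advert", "sports news"],
    "X = not male": ["cosmetics advert", "fashion news"],
}

def suggest(query, category):
    """Step 2: suggestions reflect both the query and the user's
    presumed categorization of identity."""
    return [f"{item} (for query: {query!r})" for item in CONTENT[category]]

def modulate(category, engaged):
    """Step 4: the presumed categorization is reinforced when the user
    engages with the suggested content, and revised when they do not.
    This is the loop's mechanism for remedying miscategorization."""
    if engaged:
        return category
    return "X = not male" if category == "X = male" else "X = male"

# One pass through the loop.
category = "X = male"                          # current presumption
results = suggest("birthday gifts", category)  # steps 1-2
engaged = False                                # step 3: user ignores results
category = modulate(category, engaged)         # step 4: category revised
```

The design choice worth noticing is that the user never sets the category directly: it only ever changes as a side effect of step 4, which mirrors the indirectness of control the essay goes on to describe.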

Historically, the relationship between surveillance and the surveilled was “direct.” Devices like CCTV cameras were explicit indicators of surveillance, of which the public were at least somewhat aware. The current mode of surveillance, however, is indirect. Cheney-Lippold cites the definition offered by Nikolas Rose: “government at a distance” (176). Users cannot interact with the algorithm as such, or observe the tangible body as it observes them.

Cheney-Lippold comes to a concluding definition of what control is in the online environment. Deleuze conceptualizes the current ‘society of control’ as one which is open, unlike the enclosed societies of the past. While this openness is a kind of freedom, it also leaves users susceptible to “surveillance and manipulation” (177). These “open mechanisms of control” value populations catalogued by category more than individuals themselves (177). Control is something that the algorithm is gaining at the same rate the users are losing. The meaning of categories is becoming something defined by code, and users exist within these modulating definitions. Thus, the new algorithmic identity is “free” but “constantly conditioned.”

INFLUENCES

Cheney-Lippold refers to two key figures who shape his thinking: Michel Foucault and Gilles Deleuze.

Michel Foucault: Biopower and Biopolitics

Cheney-Lippold turns to French theorist and philosopher, Michel Foucault, primarily in his references to the concepts of biopower and biopolitics. Foucault gave meaning to these terms in a post-structuralist discourse around power, knowledge and control. Cheney-Lippold references Foucault’s ‘Ethics: Subjectivity and Truth’ which defines biopolitics as a mode of governmental control that began in the eighteenth century in order to overcome certain problems associated with the management and regulation of a growing population, such as initiating and maintaining practices for health (Foucault 73).

The term biopolitics encompasses the wider theoretical framework, while biopower refers to the Foucauldian shift from “power of sovereignty” to “power over life” (Cheney-Lippold 172). Sovereign power is “repressive,” controlling the population through legal means (Lilja & Vinthagen 109). “Power over life” is, comparatively, realized through biopower. This focuses on the governance of the population as a whole, “steering the general behaviour” of society (118). A model of control under biopower uses statistical data to analyze and categorize the population, and make predictions about future behaviour that reduce randomness. Biopower entails a certain freedom of society, in which the authority is less interested in individual behavioural data than in the “overall productivity” of a population, which can in turn be utilized for capitalist gain. Cheney-Lippold looks specifically into algorithm as a technological means for biopolitically controlling the online-saturated population, ultimately in service of guided consumerism.

Cheney-Lippold refers to Foucault’s theory of the “conduct of conduct” (174). Foucault posits that ‘conduct’ is a useful term for describing the way power relations work in the “power over life” society he envisages. To “conduct” means to control and organize behaviour, but “within a more or less open field of possibilities” (Dreyfus & Rabinow 220-21). Thus the form of control Foucault refers to is not so much a “confrontation” between the theoretical authority and the population as a more covert relationship, based on the shepherding of knowledge, which in turn controls the possibilities and opportunities individuals are exposed to.

Gilles Deleuze: Postscript on the Societies of Control

Cheney-Lippold structures his exploration of the architecture of code and the environment it has created around Deleuze’s theory of disciplinary societies and societies of control.

In Deleuze’s essay ‘Postscript on the Societies of Control’, he argues disciplinary societies dissolved after the Second World War. This model of society was structured around the “organization of vast spaces of enclosure,” in which everyday life is made up of a series of “closed environments,” such as those of family, school and work, between which individuals move as time progresses (3). When one environment has been sufficiently expended, the individual moves on to the next environment in the structured sequence. Space and time are very much ordered into strict “molds,” and individuals are organized as a collective mass because each mold is identical for each person (4). Deleuze uses the factory as a symbol for the disciplinary society. This is a space comprehensively controlled by the demands of capitalism, where individuals are simply cogs in a larger machine. The capitalist owns both the professional and private spaces of the worker – the factory and the home (6).

Since the dissolution of this structure, societies of control have come to the fore. These differ from disciplinary societies in their more abstract organization of space. According to this model, spaces are not enclosures but rather, “modulations” (4). These have a degree of flexibility, adapting to changes in society to maintain a constant level of control. These environments coexist at once, and as such are never ‘complete’. Individuals move freely between the different stages in the model. Capitalism is no longer production based, but concerned more with the end product, with a greater focus on marketing practices (6). Cheney-Lippold uses this model as a structural basis for the way the new algorithmic online world has created an environment of suggestive control, as Deleuze describes. Code modulates, rectifying its boundaries with each piece of data it accumulates, in order to maintain the free, yet governed space in which the user can explore (Cheney-Lippold 169).

Deleuze describes the inhabitants of the society of control as ‘dividuals’. If an individual is a thing that, by definition, cannot be divided or broken down beyond its basic form, a ‘dividual’ is a thing that is endlessly divisible. Cheney-Lippold brings in this concept as a way to understand the constant categorization of the population, seen through the eyes of the algorithm as a homogeneous mass, represented only by the presumed classification based on a select series of mathematically-derived options. These ‘dividuals’ have become the “axiom of control” in the algorithmic environment – the arbitrary, objective value that shapes the modulations as they occur (Cheney-Lippold 169).

ANALYSIS

The structure of Cheney-Lippold’s essay emphasizes the importance of the foundational code at its most basic level. Each section builds upon the previous, to logically elucidate the way in which each stage of the ultimate control garnered by coded algorithm – culminating in the final ‘Control’ section – is constructed by virtue of the other elements. The first section focuses on code, the building blocks of the online environment. Referred to as “architecture,” this is the material language of everything that exists in the digital world. The second section explores categorization, and how this code is utilized to create and then exploit these categories of the online user population. If code is the building blocks, categories are the walls that these construct. The third section, which applies the Foucauldian interpretation of biopolitics into the discourse, articulates the flexibility of these walls, able to bend and transform to the changing behaviours of the users. Continuing with the architectural analogy, if a user begins to walk off the path, the walls will bend around them, containing them in an open – as they have the freedom to move wherever they choose – yet continually modulating environment, one which updates itself to preserve a level of control and authority. The fourth and final section, ‘Control’, solidifies the function of this environment to maintain a Deleuzian ‘society of control’.

Cheney-Lippold’s notion of “open mechanisms of control” appears to have underpinnings in Mark Hansen’s discourse on ubiquitous computing. Hansen describes this, referred to simply as ‘ubicomp’, as computing that exists at any time in any place (67). Devices, such as personal computers and smartphones, are by nature portable and, as such, function as a key element in the daily lives of their users. The key characteristic of ubicomp is that it is invisible. It has become so normalized by society that it is now at the “periphery” of our perception (69). The “ubiquitous sensibility” that Hansen defines is “non-conscious,” by the very fact of its familiarity in our daily routines, as members of a society saturated by technology (73). It addresses us at a microsensational level, which, phenomenologically, is imperceptible to human consciousness (72).

The code that Cheney-Lippold explores works in conjunction with ubiquitous computing. It has the same covert impact on our lives: omnipresent yet just at the edge of our conscious perception. Hansen suggests that the media has become a setting that “we experience simply by being and acting,” unaware of our behaviour within it (73). Code and categorization elicit this response through the herding of online activity. Through categories, we are shepherded towards particular content and advertising, and this targeting of content often goes unnoticed. Cheney-Lippold describes a “homeostasis” that is achieved when we are being regulated through statistics, creating an equilibrium between the herder, the capitalist-driven code, and the herded, the users (172). This creates a “seemingly seamless” online experience, in which we are provided with content that we can either accept or reject. Either response stems from an unconscious simultaneous acceptance of and disregard for the coded mechanisms that make up services like search engines. Code has become ubiquitous and thus invisible, despite its constant alertness to changes in activity which, in turn, modify our experiences online.

INTERPRETATION

Cheney-Lippold’s essay contextualizes his analysis with the real-world example of Quantcast, a web analytics firm. The algorithm the company employs is used here as a model for the way algorithm works in general in the online environment. He explains that the more a user searches Quantcast’s tracked sites, the more developed their online identity becomes, and the more they are understood in terms of consumerist intention. Based on this process, and as explored in the Mark Hansen analysis, the algorithm and the user do not interact. Taina Bucher, however, examines the situations in which the algorithm and user do interact, to some extent. Focusing her discussion around several simple case studies, Bucher looks specifically at the social media platform, Facebook, to analyze what happens when users “become aware” of the algorithms that are generating the specialized data before them, and how this may affect their overall experience (30).

Bucher references a study by Eslami et al., which looked at the level of ‘algorithmic awareness’ in Facebook users (31). To contextualize, Facebook uses specific algorithms to tailor users’ News Feeds to their interests and consumer traits. This content is made up of, among other things, news articles and adverts. This tailored content is intermingled with posts from that specific user’s family and friends, with whom they have granted a connection through the act of ‘adding’ them as a Facebook friend. The study found that more than fifty percent of the sample was not aware that their Facebook feed is curated (31). Thus, in general, people have low algorithm awareness. The researchers argue this lack of knowledge could lead to serious consequences, if users infer that it is their Facebook friends that are bringing this content to their attention, rather than understanding that companies are targeting them individually. This highlights just how covert the coded herding of users is, bringing into question whether users really are as “free” – even if existing in an online ‘society of control’ – as Cheney-Lippold suggests.

One of Bucher’s own case studies indicates some users are aware of Facebook’s individualized algorithm. They “intuitively understand” that their online activity is being tracked, and Facebook connects this activity to the advertisements and content they are offered on its platform (34). For example, if the user browses an online shopping site and clicks on a certain item of clothing, this specific item is likely to appear amongst, or in the advert banner bar to the side of, that user’s feed. The participant in Bucher’s case study became aware of what Elmer dubs ‘profiling machines’ when Facebook began suggesting dating sites, after she had started using dating apps and had posted about being single. The participant was left confused by this algorithmic occurrence. Bucher notes that while Facebook’s algorithm was correct in inferring she had become single and was looking for a potential partner, the presentation of these inferences inherently “feel[s] wrong” (34). Cheney-Lippold examines the biopolitical control algorithm has, in its managing of life at a physical and intellectual level, by opening and closing the doors of what we are exposed to on a biological level. He uses the example of adverts encouraging the practice of safe sex to prevent the transfer of STDs. In Bucher’s example, biopower is having a phenomenological effect on the user. Her experience of the online world is negatively tainted by her awareness of the code’s presence just beneath the parameters of what she can see at surface level. It highlights the way she is observed by the algorithm, as a “dividual,” broken down and categorized into fragmented pieces of data.

Another of Bucher’s examples brings to the fore the question of how accurate algorithmic identification is in relation to reality. This second case study sees a woman in her mid-40s miscategorized as a younger person as a result of her own blog post about Taylor Swift (34). While the person in question found the incident amusing, she was equally offended by the manner in which Facebook “make[s] assumptions” about her identity, based simply on her digital presence (34). She does not agree with these assumptions, rejecting the ‘new algorithmic identity’, to reference Cheney-Lippold, that algorithm had presented her with. Cheney-Lippold indicates that algorithms have a fallback stage in their feedback loop which adapts user identity presumptions when new behaviours indicate mistaken categorization (176). However, as Bucher’s case study demonstrates, this mechanism is not infallible. Miscategorizations can lead to a discord between how one identifies based on their own understanding and relationship with themselves, and how algorithm identifies them as one user in a larger categorized group. Though not consistently throughout, Cheney-Lippold does provide a somewhat positive reading of code, in the way it “de-essentializes” categories of populations, such as gender and age, from “corporeal and societal forms” (170). This description invites an interpretation of code going against social stereotypes, even if only to “re-essentialize” these categories as objective markers of consumerist associations. Bucher’s case study negates this understanding of freedom from stereotypes, illustrating how they can, in fact, be reinforced by code through capitalist interest. This only becomes clear when a miscategorization occurs, as it draws attention to the categories a user is definitively not part of, allowing them to be observed from an outside perspective.

Works cited

Bucher, Taina. “The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms.” Information, Communication & Society 20.1 (2017): 30-44. Taylor and Francis Online. Web. 12 Dec 2018.

Cheney-Lippold, John. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society 28.6 (2011): 164-181. Sage Journals. Web. 6 Dec 2018.

Deleuze, Gilles. “Postscript on the Societies of Control.” October 59 (1992): 3-7. Jstor. Web. 8 Dec 2018.

Dreyfus, Hubert L., and Paul Rabinow. Michel Foucault: Beyond Structuralism and Hermeneutics. Chicago: The University of Chicago Press, 1983. Web.

Foucault, Michel. Ethics: Subjectivity and Truth. New York: The New Press, 1998. Web.

Hansen, Mark. “Ubiquitous Sensation: Toward an Atmospheric, Collective, and Microtemporal Model of Media.” Throughout: Art and Culture Emerging with Ubiquitous Computing. Cambridge, The MIT Press, 2012. Web. 6 Dec 2018.

Lilja, Mona, and Stellan Vinthagen. “Sovereign power, disciplinary power and biopower: resisting what power with what resistance?” Journal of Political Power 7.1 (2014): 107-126. Taylor and Francis Online. Web. 8 Dec 2018.