
Algorithms, Meaning-Making and HBO's Westworld

  • Writer: Inga Pavitola
  • Oct 20, 2020
  • 11 min read

Updated: Jan 27, 2021


originally written in June 2020 as an academic assignment


Intro

Algorithms and big data have become a significant part of our everyday lives, especially in anything we do online or on our computers. Media nowadays is consumed almost entirely digitally, whether on streaming platforms that record, collect, and trace our interactions directly, or, less often, by acquiring a physical copy such as a Blu-ray disc or obtaining the desired content in less legal ways. Either way, all of these interactions can be recorded and analyzed. In this context, it is not surprising that the narratives of new media productions have begun to center around the ideas of algorithmization and big data.

In this paper, I will focus on the third season of HBO’s TV Series Westworld. It will serve as a case study to see how the concepts of algorithm, platform society, and big data have shaped the central story arc of the series and to look at how this narrative potentially reflects on the broader ideas of the production of meaning and struggle for power within a society.

About the Series

Westworld is a TV series created by Jonathan Nolan and Lisa Joy and produced by HBO, an American pay-television network owned by WarnerMedia Entertainment. The central premise of the series is based on the 1973 film of the same name, although the story is original rather than a remake. Westworld first aired in 2016. Contrary to the practice of streaming services like Netflix, HBO continues to broadcast its productions one episode a week. The third season of Westworld consisted of 8 episodes, premiered on March 15, 2020, and concluded on May 3, 2020. On HBO's official website the series is described as follows:

Follow the dawn of artificial consciousness and the evolution of sin in this dark odyssey that begins in a world where every human appetite can be indulged. Aaron Paul, Vincent Cassel, Lena Waithe and Scott Mescudi join Evan Rachel Wood, Jeffrey Wright, Thandie Newton, Ed Harris and more for the upcoming third season, which will explore questions about the nature of our reality, free will and what makes us human.

The artificial consciousness in Westworld is represented by sentient robots, referred to within the series as hosts. These robots were created by a company called Delos for the theme parks that it runs. One of these parks is called Westworld because of its western aesthetics. Rich people come to these parks in complete anonymity (or at least seemingly so) to act out their darkest desires by abusing and torturing the hosts. Over the course of the first two seasons, the hosts gradually became aware of themselves and of the injustices and abuse they had to suffer through, and eventually rebelled against their creators.

In the third season, the main events move from the Westworld theme park to the real world, and it is soon revealed that basically every aspect of human life in the real world is controlled by a big data company called Incite. In the universe of Westworld, Incite advertises itself as a company that will help you become the best version of yourself by guiding you toward the right life-changing decisions, based on a computer's calculations.

Stylistically, Westworld is dark and minimalistic, avoiding any bright or vivid colors. Many of the events take place at night or in enclosed spaces. The shots are often very long and slow. The soundtrack reinforces this effect with futuristic and melancholic background music. It is a dystopian view of the future, beautiful on the outside and rotten on the inside, and the stylistics of the show seem to convey this idea.

The central story arc of the season focuses on the conflict between Incite and the hosts, led by one of the main protagonists of the series, Dolores, portrayed by Evan Rachel Wood. The hosts see Incite as the embodiment of a corporation's authoritarian control over society.

Algorithms and Society

There is a relationship between this view and what José van Dijck describes when referring to the platform society. The term platform society is meant to emphasize the complex relationship between online platforms and societal structures. Van Dijck argues that platforms are an integral part of society that not only reflect the social but also produce the structures we live in. Moreover, the term also refers to a profound dispute about private gain versus public benefit in a society where most interactions are carried out via the Internet. [1]

In the dystopian world of Westworld’s third season, digital platforms have indeed produced the reality in which humans live by calculating their future decisions through big data and predictive algorithms. In the series, all these calculations are represented by one mega-computer called Rehoboam. A curious example though is the social network/application called Rico that characters in the series use to commit crimes. Rico is a virtual space where one can place an order for a crime to be committed and someone else can sign up to carry it out for a cash reward. All elements of a modern-day mobile application are present — personal profiles with scores, geolocations, instant payments, all via constant connection to the Internet.

Of course, the impact that social networks and the platform society have on our lives is taken to an extreme in this particular example, first and foremost for the artistic and narrative purposes of the story. Yet it can also be read as a reflection on the dangers a society could face without understanding the underlying principles of what digital platforms represent and how they function.

The question of private gain versus public benefit that Van Dijck raises is central to the story of the third season of Westworld as well. The creators of the series refrain from passing clear judgment on the subject and focus instead on the discussion itself. The hosts and their allies represent a deeply negative view of what predictive algorithms and big data can mean for the world, emphasizing not only the financial gain certain individuals derive from them but also the power over the processes in which value and meaning are created and assigned. On the other hand, the main antagonist of the season, the man behind Rehoboam and Incite, Engerraund Serac, portrayed by Vincent Cassel, justifies his actions by claiming to have the public benefit in mind. He argues that he is using what algorithms and big data can offer for the sole purpose of charting a better future for humanity by eliminating wars, conflicts, and unnecessary death. He refuses to see his actions as authoritarian or controlling because they are based on pure and objective calculations.

Yet Van Dijck stresses that algorithms are hardly ever transparent. She cites Gillespie and Pasquale, who define algorithms as sets of automated instructions that transform input data into a desired output. That desired output is exactly the danger the protagonists of Westworld's third season recognize in algorithms: algorithms can be manipulated. And although, unlike Van Dijck, Westworld's creators are hardly interested in the economic implications of this, its social and moral aspects are certainly important to them.
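The point about the "desired output" can be made concrete with a minimal sketch. The data and function names below are hypothetical illustrations, not from the series or the cited texts: the same input data, fed to two algorithms that differ only in what output is desired, yields very different results.

```python
# Two algorithms over the same input data, differing only in the
# "desired output" they optimize for. All names and numbers are
# invented for illustration.

articles = [
    {"title": "Local election results", "clicks": 120, "accuracy": 0.95},
    {"title": "Celebrity scandal",      "clicks": 900, "accuracy": 0.40},
    {"title": "Budget analysis",        "clicks": 60,  "accuracy": 0.99},
]

def rank_for_engagement(items):
    # Desired output: whatever keeps users clicking.
    return sorted(items, key=lambda a: a["clicks"], reverse=True)

def rank_for_reliability(items):
    # Desired output: whatever is most trustworthy.
    return sorted(items, key=lambda a: a["accuracy"], reverse=True)

print(rank_for_engagement(articles)[0]["title"])   # "Celebrity scandal"
print(rank_for_reliability(articles)[0]["title"])  # "Budget analysis"
```

Neither function is "wrong"; the manipulation lies entirely in the choice of objective, which is exactly the part that usually stays opaque to the people on the receiving end.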

Rehoboam’s projection on Serac’s watch — still from Westworld Season 3


Myth of Personalization

Another aspect of digital media, big data, and algorithmic culture tackled in the third season of Westworld is the idea of personalization and the conception of the individual within it.

In the article “How algorithms see their audience: media epistemes, and the changing conception of the individual” (2019), Eran Fisher and Yoav Mehozay discuss the big promise of personalization, or mass customization of content, by digital media: a promise to personalize content effortlessly for users, based on the media's careful study of individual wants and interests. This promise is what justifies the use of algorithms. They argue that what can be witnessed now is a move from an ascriptive conception of individuals in the construction of the mass-media audience to what might be called a performative conception of the audience in digital media. This means that individuals are seen through the behavioral data they produce while consuming content, rather than by deducing their actions from who they are. Fisher and Mehozay eventually conclude that such a conception seeks the surface rather than any deep structure.[2] In the context of media consumption, this is not necessarily a bad thing.
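The contrast between the two conceptions can be sketched in a few lines. This is a hypothetical toy example of my own, not drawn from Fisher and Mehozay: one recommender deduces taste from who the viewer is (ascriptive), the other only counts what the viewer has actually done (performative).

```python
from collections import Counter

def recommend_ascriptive(profile):
    # Ascriptive conception: deduce taste from demographic categories.
    if profile["age"] < 30:
        return "action"
    return "drama"

def recommend_performative(watch_history):
    # Performative conception: ignore who the viewer is,
    # count what they actually watched and pick the most frequent genre.
    return Counter(watch_history).most_common(1)[0][0]

viewer = {"age": 25}
history = ["documentary", "documentary", "action", "documentary"]

print(recommend_ascriptive(viewer))      # "action"
print(recommend_performative(history))   # "documentary"
```

The performative recommender never asks who the viewer is; it only reads the behavioral surface, which is precisely what Fisher and Mehozay mean by seeking the surface rather than any deep structure.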

The creators of Westworld though take this concept further by exploring what applying algorithms to all aspects of human life could look like. The picture that is painted is grim. On the surface, the society in Westworld’s third season seems to be governed by order and prosperity. It is a highly advanced and technological society where disease and poverty hardly exist anymore. Yet what the protagonists of the series seem to be arguing is that underneath this facade there is a chaos of suppressed dreams and broken lives. People are unhappy and struggling because the mega-computer and its predictive algorithms are unable to address the deeper conceptions of what constitutes a human being.

In the narrative of the series, this theme mostly manifests through the discussion of free will. The mega-computer Rehoboam is unable to accurately predict the actions of the hosts, who are truly free, as well as of some humans, referred to in the series as outliers. As a result, the antagonist Serac is forced to hunt down hosts and outliers. In Westworld, the existence of free will is irreconcilable with a world governed solely by computer algorithms. The personalization that Incite offers to the people of Westworld's universe is therefore, in essence, fake.

It is partially what Neta Alexander refers to as the myth of personalization, once again taken to the extreme for narrative purposes. In her article about Netflix's algorithms, she concludes by pointing out the contradiction between the notion that we have reached an on-demand utopia in which we are finally free to develop our own taste, and the neoliberal reality of filter bubbles, hidden kernels, and various manifestations of digital noise and censorship.[3] Censorship and control are exactly what the algorithm-driven reality of Westworld looks like.

Reflexivity

Self-reflexive media production is quite common nowadays. It represents a deconstructed dive into the digital era, a form of realism that depicts a world thoroughly and inescapably mediated through data and screens, in which life itself is always abstracting, always mediated, always already “meta”.[4]

A lot of this is typical of Westworld's third season, where the use of technology is non-stop. Dolores sees a computer screen through her eye lens; Caleb, the main human protagonist played by Aaron Paul, signs up for a robbery using a mobile application; the big data company Incite keeps a profile on every human, with categories, tags, and predictions of their future behavior. There is a recurring theme throughout the season suggesting that all the events may merely be a simulation run by a computer. And although this is never confirmed by the end of the season, it conveys a sentiment common to a wide range of modern media productions, which tend not only to tell a story but also to reflect upon the telling of it.

The Creation of Value and Meaning

The rise of digital media has greatly changed how the process of creating meaning and assigning value is seen within the media production industry. It is no longer a one-sided process in which media companies push their message without the audience having any possibility to influence it. Nowadays it is recognized as an act of communication in which both the creators of the message and the audience have the agency to interpret it independently. Initially, digital platforms were seen as giving the viewer more power to choose the content to consume. Yet algorithms reveal a contradiction in this view: with the performative conception of the audience, the viewer is essentially put within a cage of expectations.

In the third season of Westworld, this contradiction plays out to the point where being in this cage of expectations means having no control over one's own life. Each human in the world of Westworld has their own Incite profile, created from all the data collected on them. This profile not only predicts how a person's life will unfold, including the likely time of death, but also prevents significant deviations from this intended future by blocking unwanted actions: marriage and children are denied to individuals considered at risk, promotions are withheld, and so on. Moreover, some individuals are deliberately sent into risk zones, such as wars, to ensure they do not survive.

Incite profile for the protagonist Caleb — still from Westworld Season 3


As consumers of media, we are seemingly free to judge the content we consume as we please and to decode the messages we receive as we see fit. Yet with the rise of algorithms, this is only an illusion of seeing the whole picture, the on-demand utopia that Alexander talks about. In reality, a lot of information will never reach us because of the “censorship” that algorithms enforce upon us.

Westworld's third season addresses this illusion of being an active part of the process of creating meaning. Of course, as already stressed, this is first and foremost a piece of fiction that deliberately pushes some aspects of its narrative to the extreme for dramatic purposes. Yet the case could certainly be made that algorithms do in fact prevent the circulation of new, unexplored ideas and only reinforce existing power relationships.

Nicholas Carah and Eric Louw define the process of constant meaning-making as hegemony. By changing meaning, one can also change human interactions, social organization, or the distribution of resources. However, this is possible only because meanings are under constant challenge from the moment of their conception.[5] What Westworld's third season brings into focus is the ultimate risk that algorithms pose: the complete elimination of the situation in which meaning is challenged. And if meanings are not challenged, power relationships freeze, and eventually the process of meaning-making stops entirely, as there is no longer any demand for it.

Conclusion

Westworld is a dystopian sci-fi production with some elements of steampunk aesthetics and themes. It is therefore not surprising that the picture the series paints of our future is grim and pessimistic. Evil corporations are an integral part of the genre, and considering how big a part of our everyday lives social networks and digital media have become, it is hardly surprising that the third season focuses on the potential dangers of big data and algorithms. Moreover, this focus on technology, privacy, and data can be explained by the tendency toward reflexivity in modern media.

Westworld's third season explores a wide range of questions, not only deeply philosophical ones like free will and consciousness, but also more down-to-earth ones, such as media production, social networks, and technologically advanced societies. Considering that the main antagonist of the season bases his actions on predictive algorithms, one could argue that the season represents an overall critique of the platform society. Yet all the elements of the platform society are used rather as narrative tools, exaggerated in order to paint a picture of a dystopian future.

The actual critique the series offers, I would argue, lies deeper than the surface-level overuse of technology. It is about the underlying struggle for power and representation that inevitably precedes it. For a dystopian world to come to life, one first needs to give up one's right to speak out. As a result, although the algorithms seemingly are the main driving force of the central conflict in the third season of Westworld, the series is less about the dangers of advancing technology as such than about its potential implications for our future ability to participate in the meaning-making process.

References:

[1] Van Dijck, J. The Platform Society: Public Values in a Connective World. Oxford University Press, 2018.
[2] Fisher, E., Mehozay, Y. How algorithms see their audience: media epistemes, and the changing conception of the individual. Media, Culture & Society, 1–16, 2019.
[3] Alexander, N. Catered to your future self: Netflix's “predictive personalization” and the mathematization of taste. In McDonald, K., Smith-Rowsey, D. (Eds.), The Netflix Effect: Technology and Entertainment in the 21st Century (pp. 81–98). New York, NY: Bloomsbury, 2016.
[4] Lavender-Smith, J. The New Reflexivity: Puzzle Films, Found Footage, and Cinematic Narration in the Digital Age. CUNY Academic Works, 2016.
[5] Carah, N., Louw, E. Media and Society: Production, Content and Participation. Sage, London, 2015.
