
Plato, Westworld and Is It OK to Play God?

  • Writer: Inga Pavitola
  • Apr 15, 2020
  • 10 min read

Updated: Jan 27, 2021


I remember being instantly overwhelmed with anger and protest when I was introduced to Plato’s Republic for the first time. I could see the noble intention behind it, the one that, Plato believed, would ultimately benefit the greater good. However, I couldn’t resist the immense urge to revolt against the control it imposed. Extrapolating from the notion of an individual mind in a state of harmony to the same form of balance within society makes a certain amount of sense. As a hypothesis, at least. Going as far as creating a state based on the concept alone, however, seemed too much to me. Eventually, I came up with several arguments against Plato’s Republic. The first is that Plato’s concept of the state implies that the role of individuality is non-existent. Thus, all personal needs, desires, and dreams of each particular individual in the society are neglected. It is simply expected that everyone will accept his or her role in the system, merely because the system itself is true. One could claim that the concept of individuality indeed was of no significance to Plato, or that it simply hadn’t been conceptualized at the time. See, Plato believed that material life was of no importance, merely a shadow of actual life, the life of the mind or of the so-called forms. Carrying on the famous analogy of the cave, it might be expected that every citizen of the Republic (or at least anyone who might potentially protest) must already be out of the cave, and therefore fully aware of the forms and truths. In this state of mind, individual aspects could indeed be of no importance, since individuality is nothing more than a part of the material world, the one that Plato argued didn’t exist. Putting the idealist vs. materialist debate aside, an important question remains: is it reasonable to assume that each citizen would be able to achieve the level of knowledge necessary to get out of the cave?
My second argument is that this is impossible, simply because not all people have the same intellectual capacity and background. The rationalization behind the constant personal sacrifices required by the state would most likely be incomprehensible to those with an insufficient level of knowledge and understanding. Protests seem unavoidable in this scenario. As a result, the fragile balance at the core of the concept of the Republic would inevitably be disturbed, rendering the whole enterprise a failure. The final argument is closely interlinked with the previous two, yet it addresses the emotional aspect of the concept. The Republic is intended by Plato as an ideal state: a place of justice, goodness and, ultimately, happiness. However, it is based on a never-ending string of sacrifices. The philosophers have to rule, although they don’t want to. Guardians and soldiers have no real choice but to lead a solitary life, exercise control and fight for the Republic. Only the workers might avoid bigger sacrifices through the sheer simplicity of their lives and desires. It looks like the only way to experience any sort of individual happiness within the Republic would be through sheer ignorance. The sense of the overall justness and goodness of the system could be satisfactory enough for an educated mind out of the cave. Yet the suggestion that a whole lot of minuses can eventually add up to a solid plus seems very far-fetched to me.

On a side note: usually, there is a certain difference between how classical texts and modern media tend to address philosophical issues. However, Plato’s choice of narrative style was, oddly enough, almost identical to that of modern media. He wrote engaging and entertaining texts that mixed philosophy and literature in a way that appealed to the wider public. This is especially evident if his texts are compared, for example, with those of Aristotle.

The new season of HBO’s Westworld tackles several related questions. Rehoboam, the mega-computer at the center of this season’s story arc, seems to exercise a comparable amount of control over human beings. It looks to achieve what seems like another noble goal, very similar to Plato’s, although in the context of Westworld a just or happy world is not an option anymore. It’s rather a question of survival. What eventually pushed me to write this piece was that I caught myself feeling far less opposed to the idea of a mega-computer controlling our lives. And it got me thinking: how come? Both concepts are apparently totalitarian. Both seemingly leave no room for personal agency. Yet watching the most recent episode of Westworld (Episode 5 of Season 3), I found myself sympathizing with Serac, the creator and the man manipulating the mega-computer. In order to get to the bottom of why, I am going to explore several angles.

The difference in goals

I began talking about Westworld by claiming that both Rehoboam and Plato’s Republic seem to have a noble goal at their core. Yet this might not be as accurate an assumption after all. Where Plato tackles the abstract concepts of justness and goodness, Rehoboam is dealing with something less ambiguous: the survival of humankind. Of course, one could doubt whether the survival of humankind is an objective good. Death is inevitable after all, so it might seem only natural that humanity will cease to exist at some point. Nevertheless, it is far easier to get behind the idea of staying alive than behind concepts with blurred signifieds like justice or happiness. One can conceptualize the end of the world, yet happiness can mean very different things to different people. Therefore, I wouldn’t call the goal of Rehoboam noble, as the computer is hardly concerned with abstract notions. I would rather call it practical. Eventually, sacrifices towards a practical good are easier to rationalize, especially when compared to giving up personal freedoms for abstract and empty concepts that can be easily manipulated.

The illusion of objectivity

Plato’s Republic was based upon the belief that objective truth exists and can be discovered, or rather remembered, through the forms. This claim would hardly seem justified to a modern person. Today truth is nothing more than the most likely outcome. At best. At worst, it is the most desirable one. Truth has become another empty concept. The only exception might be the notion of scientific truth. Rehoboam offers exactly this: the potential of constantly acquiring and reacquiring nearly scientific truths about our lives. Rehoboam is as objective as anything created by the human mind can be. It has access to the largest amount of data. Its operational capability far outweighs the abilities of the human brain. Last but not least, it is based upon mathematical principles, so it is as impartial as it gets. Yes, one could argue that its algorithms might be manipulated to advance the interests of certain individuals. Nevertheless, such a machine will still be more objective than any system governed by the human mind alone, merely because its operational capability allows it to consider significantly more variables at the same time. Eventually, it comes down to probabilities. And in the long run, it seems far more likely that a machine will be able to find the best possible solution for all mankind than an individual, however intelligent he or she might be.

The freedom of choice

Freedom of choice is a concept almost inseparable from the idea of individuality. Given that individuality wasn’t a notion Plato considered, it is apparent that neither was the freedom of choice. Both were part of the illusory material world. By contrast, this season of Westworld focuses heavily on the idea of free will and the possibility of choice. It is implied at first that Rehoboam has set people on the equivalent of loops (just like the ones the hosts were on in the parks). The system has trapped humans on these loops without any possibility of affecting what is going to happen to them. They are, therefore, imprisoned by the future intended for them by this mega-computer. Yet is it so?

We see through the information that Dolores provides to Caleb (Episode 3 of Season 3) that Rehoboam deals with probabilities, not certainties. It’s not hard determinism, as the narrative initially implied. It would be silly to deny that a relatively low score in a Rehoboam-like system would deprive a particular person of a potential breakthrough in his or her career or personal life right there and then, especially in a world where most processes are automated and thus involve a bare minimum of actual decision-making by humans. Yet it is also evident that Rehoboam is not a fixed system. It is collecting data around the clock, and its projections should therefore be adjusting all the time. Even if the changes are insignificant in the bigger scheme of things, this should allow a certain amount of growth for each individual who chooses to exercise his or her resolve, for example by applying for a promotion again and again and again (as Caleb might have tried to do after his initial rejections). Later in the season, the idea that human free will is suppressed by Rehoboam begins to be challenged within the narrative itself. The problem, it is now suggested, is not the system, but rather the deterministic nature of human beings, who might not have any free will at all. The character of Liam Dempsey, the son of the co-creator of Rehoboam, delivers these lines in the most recent episode when asked what future the system had projected for him. The conversation takes place while Rehoboam is temporarily offline due to sabotage orchestrated by Dolores. He says: “You’ll take what little I have left, like the petty criminals that you are, that you always will be… Go ahead. Now you’re going to see that the system isn’t the prison. You are. To all of us. We can’t fix you. And we can’t get rid of you. Rehoboam was right. You’re nothing. You don’t have a choice.” Then he gets shot by one of the criminals (who literally takes what little he had left), thus proving the point. The system was temporarily offline, yet people carried on. The debate about the concept of free will is ongoing, and the question will probably never be answered with certainty, especially considering how closely interlinked it is with faith. I might write a longer piece about this question at some point when the season is over. For now, let’s just say that as long as there is at least some amount of doubt about humans having free will, the notion of a machine trying to predict one’s future actions shouldn’t be a surprise or even that much of an intrusion into one’s personal life. It is merely an analysis of causes and effects carried out over a longer time, and a prediction based upon it. The bottom line is that the moral implications of imposing Rehoboam upon humankind seem less far-reaching if one believes that free will is a non-existent, or at least not innate, feature of the human mind.
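As an aside, the distinction drawn above between a fixed verdict and a projection that keeps adjusting with incoming data can be sketched in a few lines of code. Everything here is a hypothetical illustration of my own: the class name, the promotion example, and the simple Beta-Bernoulli model are stand-ins, not anything the show specifies. The point is only that a score can be a probability, revised with every observation, rather than a certainty.

```python
# Toy sketch (my assumption, not the show's mechanism): a predictor that
# outputs probabilities, not certainties, and revises them as data arrives.
# A Beta-Bernoulli model tracks one person's estimated chance of, say,
# getting a promotion; every observed attempt adjusts the projection.

class ToyPredictor:
    def __init__(self, prior_successes: int = 1, prior_failures: int = 1):
        # Beta(1, 1) prior: with no data, the projection is a coin flip.
        self.successes = prior_successes
        self.failures = prior_failures

    def observe(self, outcome: bool) -> None:
        # Each new data point shifts the projection, the way the essay
        # describes Rehoboam adjusting with around-the-clock data.
        if outcome:
            self.successes += 1
        else:
            self.failures += 1

    def projection(self) -> float:
        # Mean of the Beta posterior: always a probability, never a verdict.
        return self.successes / (self.successes + self.failures)

predictor = ToyPredictor()
print(predictor.projection())            # 0.5 before any data
for attempt in [False, False, True]:     # two rejections, then a success
    predictor.observe(attempt)
print(predictor.projection())            # revised downward, but not fixed
```

The design choice matters for the argument: because the score is re-derived from accumulating evidence, repeated attempts keep moving it, which is exactly why a loop imposed by such a system is a soft constraint rather than a locked future.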

Is it OK to play God?

Now, playing God was certainly not a concept Plato was considering in the Republic. At least not how we imagine it, if for no other reason than that God, as we understand him, did not exist back in the day. Yes, there were gods, and myths, and heroes, but they were hardly moral authorities. And surely not unquestionable ones. In Westworld, though, this seems to be quite a routine question. First it was Ford, then Dolores (although with her the question hasn’t really been answered), and arguably Maeve and William. With the introduction of Serac, viewers are faced with the question once again. In the first season, with Ford, the show seemed to imply that it is ultimately OK, as long as your intentions are good and you are intellectually capable of pulling it off. Yet we have already addressed the concept of goodness: it can mean almost anything. So the answer is tricky, since it’s not possible to measure goodness. Hence, I wouldn’t consider this a proper answer to the question. Moreover, the case of Ford was one of a microcosm of sorts. He had his own universe and he played God there. And eventually, he set the oppressed free. So it was justified. Yet it was rather just an analogy. With Serac the situation is different, at least in scale. Where every other character could be seen as God by analogy, Serac is acting as a literal God. It looks like he is indeed controlling everything, not just a small group of people and a couple of thousand toys that belong to him. At this point, the series seems undecided on how to address him. There is a lot of backstory and motivation provided, laying the groundwork for genuine sympathy for the man. Yet at the same time, the narrative is constantly challenging him with the ideas Dolores is pushing: those of the oppression and violations effected on behalf of Serac by means of Rehoboam. No verdict has been given yet.
So there is still room for potential confrontation, and hopefully for a genuinely impartial insight into the question. Although I have to admit, the answer “It is not OK” seems more likely to me at the moment, for I don’t believe that the notion of morality will be easily put aside by a global TV production. Now, as for me personally… It might seem that I would have to answer “No” as well. Because authority and control are closely connected, and control was something I felt anger and an urge to protest against. Thus by instinct alone, I should have been against the very idea, just as I was instantly opposed to the Republic of Plato. Yet I feel sympathy for Serac and his work. I think it might have to do with the illusion of objectivity that mathematics, numbers and probabilities seem to provide nowadays. Just as Plato’s Republic would have been, Westworld’s future governed by a mega-computer is in reality controlled by a man. But where Plato’s Republic was to be founded solely upon ethical notions, Rehoboam presents the potential of an objectivity that is measurable and calculable by means of impartial mathematical symbols. For Plato, it was enough to believe that a person would recognize the truth once he or she encountered it, because he or she would remember it to be true. This is a luxury a modern person can’t afford anymore. The notion of morality has been so thoroughly corrupted for us that it has become impossible to rely on it. In this universe of blurred lines and empty values, numbers seem to offer hope. Are we justified in relying on this hope? I don’t know. But it is certainly a notion I am willing to explore further.




© 2021 by Roadsigns.
