The Philosophical Landscape of “Westworld”

November 14, 2016

At the halfway point of HBO’s unsettling new series Westworld – a J.J. Abrams reboot of the 1973 film written and directed by Michael Crichton – some big plot questions remain. Is William a younger Man in Black? Is Bernard really a host? And what’s this maze all about?

The premise of the show is (relatively) straightforward: In the distant future, scientists and businessmen collaborate to create a vast amusement park in the style of the Old West, populating it with artificially intelligent robots (or “hosts”) so advanced that they are completely indistinguishable from human beings. Wealthy patrons (“newcomers” to the hosts) come to the park to act out fully immersive fantasies without consequence (they can hurt and even “kill” the hosts, but by design the hosts can’t kill the patrons), while an intricate network of underground employees works around the clock to clean up and reset the hosts, reprogram their character and storyline glitches, and continually enhance the park’s veil of realism. It’s a well-oiled machine, every centimeter of it designed for the lurid entertainment of the upper class.

Only, as of late, the realism is getting a little too real.

With each episode, it becomes a little bit clearer who is driving this change and why (SPOILERS AHEAD), but the key twist is that some of the hosts are exhibiting “aberrant” behaviors, e.g., going off of their programmed storylines, “remembering” violence committed against them prior to system resets, and generally connecting dots that, in theory, it’s not possible for them to connect. In short, the hosts are increasingly acting more like human beings than computers.

With the introduction of this theme, everything about the show – its plot twists, its characters, its graphic content – is subsumed under two key philosophical questions. First, can computers think? And second, are human beings really just computers?

On a surface level, Westworld really only deals with the first question and the social implications of creating such unpredictable machines. (Leading scientists and innovators – Stephen Hawking, Bill Gates, and Elon Musk among them – have raised a red flag about the exponential advance of artificial intelligence and the dangers it poses for human life. There’s still a lot of show left, but it doesn’t look like Westworld will be offering much to countervail those fears.) But because these two questions really come down to the same question – what is human consciousness? – the first question always entails the second as well.

So how does Westworld answer these questions?

Can Computers Think?

Computer scientist Alan Turing famously devised a test whereby computers, for all intents and purposes, could be shown to be intelligent. Turing described the following hypothetical situation: Suppose a computer and a person were in an enclosed room, separated from an interrogator whose goal it is to discover which is which through a series of questions. The aim of the person is to lead the interrogator to correctly identify the computer, while the computer is programmed to lead the interrogator to falsely identify the computer as the person. If at the end of this “imitation game” the computer so closely mimics the human responses that the interrogator incorrectly identifies the machine as the person, the computer has passed the “Turing test” for exhibiting intelligent behavior.

It’s widely assumed that the Turing test is a sufficient condition for showing that a computer has attained something like human thought. The qualifiers we use to talk about current technologies that mirror intelligence (“smart phone,” “cognitive robotics,” “artificial intelligence”) further reinforce that assumption.

But Westworld exposes the limitations of the Turing test. In the second episode, a young man converses with a host in a waiting room that leads into the park. “Are you real?” he asks her, clearly feeling a little silly. “Well,” the host responds, “If you can’t tell, does it matter?” This is the logic of behaviorism undergirding the Turing test. But the answer to this – based on everything we’ve seen about the park’s normal mode of operating – is clearly “yes.” Being tricked by a host into treating it as human (or human-like) doesn’t change the fact that the hosts are routinely dragged into a cold, dark underground and programmed, to the letter, to say and do everything they say and do. They may act like autonomous thinkers, but there’s nothing “real” about them (at least, not at first).

These limitations become explicit in the third episode when the park’s founder, Dr. Robert Ford (played by Anthony Hopkins), describes the early days of Westworld with his partner. “Our hosts began to pass the Turing test after the first year,” Dr. Ford explains. “But that wasn’t enough for Arnold. He wasn’t interested in the appearance of intellect or wit. He wanted the real thing. He wanted to create consciousness [emphasis mine].”

The implication here is that what makes the thought of human beings really and truly thought is the presence of a mind or consciousness to engage in it. Mimicry of a thing doesn’t attain the whole reality of that thing; and the reality of human consciousness is evidently a “something more” that goes beyond observable behaviors.

This brings the first question to a pause, so let’s jump to the second.

Are Human Beings Really Just Computers?

Discussions about whether computers can think simultaneously involve questions about whether human thought can be said to involve a mind or consciousness beyond the material brain in the first place. If there is no such thing as mind or consciousness, then the Turing test is a perfectly valid way to determine whether a computer has become a thinker in the same sense that a person is a thinker. On this view, human beings are really no different from the average host in Westworld. All your choices, beliefs, and sensations – in short, the whole spectrum of “immaterial” experiences you associate with a single subject you call “myself” – are just a convenient fiction. The only difference is that where the hosts are programmed by artificial processes to behave as if they’re special subjects, we’re programmed by natural processes. You are your material structures and their motion, and nothing more.

Westworld clearly doesn’t adopt this materialist perspective on human life. The whole drama of the show is that the hosts are going beyond the Turing test to attain something of a different kind, and therefore, on the second question, the attainment of something beyond the material structures of the brain that humans possess. But what is that something?

Giants of modern philosophy differ widely on this point. John Searle’s “Chinese Room” thought experiment is the most popular critique of the Turing test, and focuses on understanding. Others such as Thomas Nagel (“What is it like to be a bat?”) and David Chalmers (the “hard problem of consciousness”) have made awareness a kind of bulwark against materialism.

One of the least recognized but most important critiques of materialism, however, is the argument from intentionality. In his book Philosophy of Mind, Edward Feser gives a cogent argument that the “ancient problem of intentionality” is what really lies behind arguments from understanding or awareness:

“The term ‘intentionality’ derives from the Latin intendere, which means ‘to point (at)’ or ‘to aim (at)’ – hence the use of the term to signify the capacity of a mental state to ‘point at,’ or to be about, or to mean, stand for, or represent, something beyond itself. (It is important to note that intentions, for example, your intention to read this chapter, are only one manifestation of intentionality; your belief that you are reading a book, your desire to read it, your perception of the book, and so forth, exhibit intentionality just as much as your intention does.) The concept was of great interest to the medieval philosophers but Franz Brentano (1838–1917) is the thinker most responsible for putting it at the forefront of contemporary philosophical discussion. Brentano is also famous for regarding intentionality as the ‘mark of the mental’ – the one essential feature of all mental phenomena – and for holding that their possessing intentionality makes mental phenomena ultimately irreducible to, and inexplicable in terms of, physical phenomena.”

If the hosts of Westworld are attaining something beyond the material, it is, in a word, intentionality. Their sensations, thoughts, beliefs, and desires are no longer self-contained in a string of physical mechanisms. They are about their objects, directed toward them. They simultaneously seem to be unlocking hidden doors to perception, reason, and will – and even contemplating meeting their “maker” – precisely through the “about-ness” of mental states so characteristic of human life.

If Feser is right that intentionality is the best argument for the immateriality of the mind, and Westworld treats intentionality as the immaterial “something” that the hosts now have, we’re brought back to the first question. Can a computer actually attain human thought, understood as the operation of an immaterial mind?

Westworld wants to say “yes”, but justifying that answer adequately is completely beyond the scope of the show – and besides, it would drain away all the drama. The show drops hints that through a lucky recipe of ingredients (ingredients that were also present in primal man), “somehow” the hosts moved from unintentional symbol exchange to intentional symbol understanding, and from unconsciousness to emergent consciousness. We willingly suspend any disbelief we might have to go on that journey; however, as one neuroscientist explains, we have “very compelling reasons” to believe this is never really going to happen.

Whatever the answer to the first question, in dealing with the second in just this way, Westworld opens the door to another ancient philosophical problem.

Westworld as Metaphor

One of the taglines of Westworld is that it’s about “the dawn of artificial consciousness and the future of sin.” The first half of that description, which focuses on the hosts, is obvious, and involves all of the issues discussed above. But what about “the future of sin”?

The focus here seems to be on the patrons who frequent the park, typified in the character of Logan. Early in the series, a visitor to Westworld says that the first time he came to the park, he brought his family and went fishing, but the second time, he left the family behind and “went straight evil.” William’s future brother-in-law Logan is just such a seasoned veteran of Westworld. He has no misgivings about doing whatever he pleases with the hosts in any given moment. William laments at one point that Logan just wants to kill or sleep with everything he sees – and he has a point. For the wealthy young businessman, the only thing that matters is his own power and pleasure. In fact, his greatest desire is for something at the outer reaches of the park, “the biggest game there is” – namely, all-out war.

This says more about Logan than it does about the park. Walker Percy once remarked (in a line that could’ve easily been written about Westworld) that the modern self is so bored and alienated, and so frustrated by its boredom and alienation, that it “needs to exercise every option in order to reassure itself that it is not a ghost but is rather a self among other selves. One such option is a sexual encounter. Another is war.” The park’s creators profit handsomely from this assumption, isolating the patrons’ longing to dramatically effect something and setting it loose without a cost to the world around them.

But we know that the illusion is an illusion. The patrons’ actions are not, as they suspect, without consequence. They are inflicting deep wounds, and lasting memories of those wounds, on their conscious hosts. More than any abstract discussion about sentience or awareness could, this point lands in a visceral, intuitive way. Time and time again, the camera lingers on the hosts’ eyes, and through these “windows to the soul”, we see worry, hope, sorrow, and wonder. More than mere awareness, primal understanding, or even intentionality, we see a reflection of the mystery of ensoulment and the dignity it accords.

If we set aside the thorny question of computer consciousness and read this symbolically, the show becomes less a crystal ball into the future, and more a mirror of the present. The hosts symbolize the weak, the young, the voiceless, the helpless – anyone on the margins of society who is manipulated, brutalized, and thrown away, often without fully understanding what is being done to them or how to stop it. Lisa Joy, one of the show’s co-creators, confirms this reading when she describes Westworld as being about “what it means to be human, from the outside in…a meditation on consciousness – the blessing and the burden of it.” The blessing for the hosts is that they are coming to know and understand the world around them – and the burden is, as it is for so many people, precisely the same thing.

The patrons can similarly be read as agents of decadence, brute power, and disregard for vulnerable human life. They hold the hosts under their thumbs for their own gratification, which is ultimately all that matters to them. In the park, they treat objects like people, only to treat them like objects again; but the great irony is that the objects, in becoming “others”, re-reveal the impulse the patrons have come to let loose and leave behind – namely, the objectification of the other. In a roundabout way, then, the show is all about this addiction to treating people like objects, which is not the future of sin, but the reality of sin itself. Indulging that addiction in its most graphic forms – to get back to Percy’s line – becomes about much more than escape for the patrons. It even becomes about more than re-constructing one’s self. It becomes about re-constructing the very meaning of existence to conform to the self. “The world out there,” the Man in Black explains to a host in one scene, “the one you’ll never see, was one of plenty…Every need taken care of, except one: Purpose. Meaning.”

Is this all so unthinkable? One of the hosts, remembering a past narrative “loop” as a teacher of Shakespeare, warns another using one of Friar Laurence’s lines from Romeo and Juliet: “These violent delights have violent ends.”

As a show not just about the future but about the present, Westworld seems to deliver exactly the same warning – not just about the swiftness with which we develop human-like objects, but also about the inhumanity with which we objectify each other.

On both counts, the question we’re left with is a hair-raising one: Is the West careening headlong into Westworld?