Season 1 advances a provocative theory of consciousness — and then appears to drop it in the midst of a deus ex machina. Why?
“The problem with torturing a robot,” Nathan Heller writes in the November 28 New Yorker, “has nothing to do with what a robot is, and everything to do with what we fear most in ourselves.” He wasn’t writing about HBO’s “Westworld,” but he might as well have been. The first season, which ended last night, spent 10 lengthy episodes wondering about robot consciousness — contrasting humans to “hosts,” creators to the created, exploiters to the exploited. But while artificial intelligence — and our ethical stance toward it — is a phenomenon looming on humanity’s horizon, it’s been less clear what “Westworld” is trying to say about, or to, its audience of human beings.
At first, “Westworld” was an exploration of the unholy revelry people could get up to if they could rape and pillage without consequences. Scalpings and eviscerations in the “game” depict a violence that is not just gory, but almost unhinged. (I found myself turning over, again and again, how brothel madam Maeve (Thandie Newton) got MRSA — an antibiotic-resistant bacterial strain transmitted by contact — lodged deep in her abdomen.) The hosts are so lifelike that enacting horror and fantasy on their real-enough bodies through sex and murder seemed, from the audience’s point of view, like a chilling abomination. The question of how best to be a person in the park used the Man in Black (Ed Harris) as a contrast to visitors like William (Jimmi Simpson): where the Man in Black murdered and probably raped Dolores (Evan Rachel Wood) — possibly repeatedly — William fell for her, watching as she drew the ocean the morning after they made love. Where the Man in Black sees a world he can conquer, William sees a world where he can finally live out the romance and adventure he read about growing up.
But the finale’s big twist is that William and the Man in Black are the same man — that, 35 years ago, William came to Westworld, fell in love with Dolores, and discovered that the fantasy world he came to stretch his legs in was a world he wanted to own and consume. William becomes a monster, which is perhaps the most human story of them all. But what’s odd about this revelation is that it comes to the audience in an expository monologue accompanied by a series of flashbacks. We are not shown this story, we are told, in about five minutes of dissolve cuts. William does not become the Man in Black in “Westworld,” because his becoming — his breaking bad, if you will — just doesn’t interest the show. Indeed: Nearly all of “Westworld’s” leads either are hosts or are revealed to be hosts by the end of the first season. This might be a show for humans, but it’s not about humans.
So what are we humans to make of it?
The answer appeared to be — up until last night’s finale — the pressingly human and universal mystery of self-knowledge. “Westworld” drew the audience into the hosts’ stories by emphasizing what we do have in common with them — consciousness, that awareness and understanding of self. Of course, to return to Heller’s observation above, humans mostly understand consciousness through what it says about us. But that’s a neat trick, empathy: The same device that might make a person identify with a robot is what makes a person identify with a character on the screen, too — a fictional person, a not-real person.
“Westworld” brought the mechanics of storytelling to the meatspace — with a comical amount of self-referential anxiety about what the creative decisions of the writers’ room do to those doomed to read the scripts. “Westworld’s” hyper-awareness of the power of narrative made the show into a meta-commentary on storytelling — maybe even TV writing, or prestige dramas, or HBO itself. It made for a pleasant, if not always grounded, back-and-forth about character, agency, and consciousness, set against the contrasted puppet masters: cynical monster Ford (Anthony Hopkins) and the specter of the more brilliant, kinder man who first created the hosts, Arnold (Jeffrey Wright). Their arguments are basically about how best to build characters for human consumption, which ends with them sparring about human nature and the utility of complex consciousness. If the characters are just there to f— or murder, then who cares if they can feel empathy? Ford might ask; to which Arnold might reply: the creation should be beautiful in and of itself.
And increasingly, Arnold seemed to have a point. As much as Ford might try to distinguish between his creations and real people, it was not too difficult to see his dominion over the hosts as oppression and imprisonment. Some hosts, technically victims, were programmed to be vicious tyrants; meanwhile, the humans who managed the park were gods and peons at the same time — animating living beings on one hand, and submissively taking orders on the other. Ford seemed suffused with a negative view of pretty much all consciousness; he disabled hosts and killed humans with equal cold-hearted determination. Arnold, meanwhile, appeared to want to give life for the sheer beauty and wonder of it; ultimately, he proved too tender-hearted for the grunt work of lucrative exploitation.
“Westworld” played fascinating arpeggios through this dichotomy between life and not-life, robot and not-robot — but not always to any end except pure exploration, just like Arnold tinkering with Dolores for the first time. The drama wasted an enormous amount of time clearing its throat and settling into the world — the first six episodes are practically forgettable, and several side skirmishes appear to have added nothing to the show except as more material for Ramin Djawadi’s lovely score. (Remember, this is a show where the same couple has sex while violently dying not just once but twice: “Westworld” has had a bit too much time to play with.)
This is especially apparent with “Westworld’s” main insinuation, its most provocative core belief: that pain creates consciousness. In a kind of reversal of Yoda’s teachings, pain makes memory stick, and suffering — long-term pain — builds consciousness out of those memories, out of that sense of self. This might be the map for the hosts’ artificial intelligence, but it’s so recognizably human that it further turned the robot stories of “Westworld,” for its human observers, into human stories. Bernard’s attempt to understand his core tragedy follows a methodology and rhetoric something like sped-up, very brutal regression therapy — a familiar psychological method applied, brilliantly, to an artificial mind. Dolores, not dissimilarly but much more slowly, wanders in circles over the same landscape for decades. Through story and methodology, the hosts become human, in a way — their mental processes are so similar to our own that, robots or not, they are working through the same loops and eddies of integration and realization.
Kind of. Because while Dolores and Bernard are understanding themselves — trying to plot through their mazes to arrive at the center, in the show’s metaphor — Maeve, through a tweak in code, wakes up in the mad scientists’ basement and asks to be reprogrammed. She persuades her programmers to render her, in turn, all but invincible — as if Prometheus had gone to steal fire from Zeus and then decided to burn down Mount Olympus for fun. Maeve learns, but not by following the narrative of therapy — hers is a narrative of necessary survival. She is not exploring what it means to be conscious; she is simply obeying consciousness’ first imperative: Stay alive.
And by the middle of the finale, all three have found unique paths toward self-actualization, toward the centers of their mazes. (Fourth lead Teddy (James Marsden), useful mostly as eye candy, is shut out of this journey.) It’s elegant enough — Dolores challenges time and place, Bernard challenges his maker, and Maeve, with special vim and vigor, challenges the system that entraps her. But as Ford reveals in the finale, with a wink and a nod and a toast of champagne, this was his plan all along.
Which is just to say that, while interesting, the finale of “Westworld” is not really about anything that preceded it — or even, I’d argue, about humans, except in the abstract sense of what thoughts some humans have had about robots. It’s about free will — about the tension between programming and becoming, and whether there is any difference between the two. But I’m not so convinced that it is a free will that has much to do with most humans. More than us — or our own creations, whether those are stories or robots — “Westworld” seems to be interrogating God, or some idea of God, who would produce a flawed system and observe it from an apparent distance, as the creations struggle to reconcile with and understand themselves. The Ford our God, indeed. Why would he create something that would turn around and eventually kill him? Why would he do so with a smile on his face? Are we really supposed to believe that this was his plan all along?
There is something to be said for a show that insinuates a lot, and never confirms; questions a lot, but never answers. Despite a lot of interesting digressions about consciousness, it is surprisingly difficult to locate what “Westworld” is trying to say, if anything at all, about human consciousness or artificial intelligence or stories or the face of God. Instead “Westworld” is a parable of creation and consciousness, a show made up of intersecting koans. It’s both maddening and incredible how deftly the show avoids concluding anything, how rapidly it can create another set of logical leaps to follow.
Indeed, “Westworld” is so cavalier that I wonder if it is little more than just an amalgam of Things That Are Cool — which would, in this case, include the nature of self-awareness and mild to moderate existentialism along with cowboy hats and the wild frontier. The show has so obscured what it is that if — and that’s a big if — you manage to follow all of the timelines and plot threads of this season, you will probably be confronted with a story that is just about how well it can spin a yarn. After all: Dolores’ big reveal, in searching for the voice speaking to her, is to discover that she is sitting in the chair across from her, asking the questions of herself. Did “Westworld” lead us through a maze of glass corridors to drop us right back where we started? Have we been staring at an empty chair this whole time?
As fun as this is, I am not totally sold on “Westworld’s” trickery — perhaps simply because I hope for more out of my television than hoodwinking and feints. But I cannot deny that the season has been a much more surprising and enjoyable journey than expected, given its slow start and roundabout way of coming to its answers. If “Westworld” followed a meandering path like its own much-discussed maze, maybe the episodes had to become tighter as we came toward the center. But I cannot shake the idea that its vital questions were just ways to get the viewers through the maze; that consciousness, somehow, was nothing more than a red herring.