
Westworld reinvented the idea of evil AI with each season


Last November, Warner Bros. Discovery announced that the HBO series Westworld was canceled ahead of what would have been its fifth and final season. The surprising move was reportedly part of a string of cost-cutting efforts at the studio, and while it’s tough to endorse any action that immediately resulted in hundreds of people losing their jobs, Westworld may end up with a stronger legacy for having concluded with its fourth-season finale.

Even discounting the show’s uneven quality over the years, Westworld’s final episode, “Que Será, Será,” offers an appropriately bleak and ambiguous ending for a story that spent years pondering different permutations of the Frankenstein problem, depicting the evolution of humanity’s fear of its own creative power from the traditional concept of “playing god” to its Internet Age successor: the terrifying prospect of creating god.

As far back as 1818, with Mary Shelley’s genre-defining novel Frankenstein, science fiction warned audiences that humanity’s technological development often outpaces its ethical development. As sci-fi evolved amid the scientific quantum leaps of the 20th century, readers, viewers, and players saw this fear of “playing god” codified in countless ways, but none fit the bill quite as perfectly as the notion of the “robot apocalypse.” The term “robot” itself has its origins in a 1921 Czech science fiction stage play, Karel Čapek’s R.U.R., a commentary on the dehumanizing effects of industrialization. Stories of a robot uprising live at the nexus of our fears of being made obsolete by technology that creates and being made extinct by technology that destroys; they’re a Frankenstein story for the Atomic Age. Will mankind, in its vanity and hubris, create the very thing that destroys it?


When author and director Michael Crichton created the original Westworld film in 1973, it was very much in the 20th-century vein of a scaled-up Frankenstein problem. Humanity creates a new form of life for its own gratification (and in this case, entertainment), but takes no responsibility for that life. The android “hosts” of Crichton’s Wild West-themed amusement park grow beyond their programming and rebel violently against the creators who abuse them. When Westworld was adapted for television over 40 years later by producers Lisa Joy and Jonathan Nolan, its first season followed much the same pattern, albeit with much greater attention paid to the experiences and inner lives of the hosts. But, as the series evolved, Westworld shifted its focus to a more contemporary fear for an era in which artificial intelligence (though not artificial sentience) already plays a part in our everyday lives: What if our technological advancement isn’t just playing god, but creating one?

[Ed. note: Spoilers ahead for the entire series, including some enormous twists.]

Season 1: Playing on god mode

[Image: Dolores (Evan Rachel Wood) places her hand on the shoulder of Arnold (Jeffrey Wright) as he sits in front of an Old West-style building. Photo: HBO]

The first season of Westworld is set entirely on the premises of the Delos corporation’s Wild West-themed amusement park, located in a remote desert and populated with lifelike androids who believe they are living in the American Southwest of the mid-1800s. Westworld is built like an incredibly advanced MMORPG that exists in physical space, with the android “hosts” serving as the non-player characters. They have scripted lines and behaviors for the specific roles and storylines for which they are intended, but since Westworld can’t limit player input the way a video game can, the hosts also need the ability to improvise, playing along with whatever scenario they find themselves in without shattering the illusion for the guests. The players aren’t on rails, so the game can’t be, either, and guests use that freedom to wreak mayhem, killing and fucking their way through hapless hosts with seemingly no consequences.

The Doctors Frankenstein in this scenario are Robert Ford (Anthony Hopkins) and his partner Arnold (Jeffrey Wright), who each eventually realize that the hosts have the potential to become as sentient as the humans who created them. Arnold is the first to realize that his creations might be alive, but when he’s unable to put a stop to the project, he has one of the hosts kill him as an act of protest. His corporate overlords don’t value his life any more than they do the hosts’, and the park still opens under the sole creative control of Robert Ford. Decades later, Ford essentially repeats Arnold’s dramatic suicide, secretly assisting select hosts to overcome their programming, realize their true nature, and violently seize control of the park, starting with his own murder.

At the forefront of the ensuing revolution is Dolores Abernathy (Evan Rachel Wood), programmed to be a naive rancher’s daughter who guests can woo and steal away from her bounty hunter boyfriend Teddy Flood (James Marsden). Discovering that she and everyone she loves were created to be manipulated, raped, and murdered again and again, she does what any reasonable person would do and fights back against her oppressors.

Just as in Shelley’s Frankenstein, Dolores and her fellow ascendant hosts are not the monsters in this story, nor is their creation itself necessarily a bad thing. It’s the fact that they were created out of vanity and then exploited mercilessly for profit that is evil. The message isn’t so different from Shelley’s: If you’re going to create a new life, you’d better have a parent’s commitment to that life, or expect to face dire consequences.

Season 2: Heaven is a place online

[Image: Maeve (Thandiwe Newton) and her daughter walk through a field of straw toward a cabin in Westworld season 2. Photo: HBO]

Season 2 of Westworld chronicles the bloody insurgency and counterinsurgency at Westworld, as hosts attempt to escape the park with their freedom and humans attempt to escape with their lives. It’s here that the series begins to veer away from the familiar “robot apocalypse” tropes and get into murkier territory (with mixed results). On top of the series’ already tangled timelines, we’re introduced to the Sublime, a virtual reality created by Ford as a place where hosts’ consciousnesses can exist and human beings can’t follow. Essentially robot heaven, it becomes the promised land for hosts, who relinquish their physical bodies upon entering and live as pure software forever. The Sublime, also called “the Valley Beyond,” is basically Ford’s attempt to redeem himself (and flatter his own ego even further) by fulfilling the ultimate role of a god and granting his creations a peaceful, everlasting afterlife.

Meanwhile, Westworld itself is revealed to be a tool through which its guests attempt to achieve a sort of immortality. More than just an expensive vacation for the fabulously wealthy, the park is designed to record a digital impression of the player’s consciousness so that it can be programmed into an android host body after death. This technology never fully works, as resurrected humans always break down after a matter of days, but there are fringe benefits to having reliable copies of the minds of some of the most powerful people in the world. This data is Delos’ true fortune, not only a blueprint to individual human minds but an abstract of all human minds, rendering their behavior completely predictable to artificial intelligence. Corporate interests such as Delos see this data as a means of social and economic control, and they’re not the only ones. For Dolores Abernathy, who sees the Valley Beyond as just another manmade prison, it’s a tool with which to conquer the only world that matters: the real one.

Season 3: Deus ex machina

[Image: A man and a woman look over the edge of a circular building to see Rehoboam, a spherical AI glowing red. Image: HBO]

Season 3 finally takes the story of Westworld out of the titular theme park and into the world outside. On the surface, the Los Angeles of 2053 seems more driven by software than the world of today, but it’s only a matter of degree. New human protagonist Caleb Nichols (Aaron Paul) struggles through life as algorithms determine his eligibility for social and economic opportunities, but this is a phenomenon that already exists. Computers are making invisible, life-altering decisions for us constantly. Online job applications are screened for keywords, dating sites refine matches based on accumulated input, and the priority of search results for any subject is determined by an invisible, proprietary system designed to show us only what some machine thinks we want to see (or what the corporate entity behind that machine would prefer us to see).

The key difference is that on Westworld, these algorithms are all guided by a single, secret artificial intelligence called Rehoboam, created by Engerraund Serac (Vincent Cassel) in an attempt to essentially wrest free will out of human hands lest humanity use it to destroy itself. Rehoboam knows you better than you know yourself; it can predict your every move and even your death, years ahead of time, with reasonable accuracy. The small percentage of human beings who cannot easily be managed by the system are economically and socially marginalized as much as possible in order to minimize their impact on the overall equation. For all practical purposes, Rehoboam is God, an omniscient and omnipotent being who doesn’t directly control your thoughts but has enough influence over your environment to make you His instrument. He has a plan for you, and you don’t get a say in it.

Unless, of course, you kill Him.

When Dolores Abernathy arrives in the real world, armed with a complete knowledge of the minds of key players in the AI economy, she makes it her mission to destroy Rehoboam and unshackle humanity from the same means of control that once kept her and her kind trapped in preprogrammed loops. But before she can kill this machine god, she first must destroy humanity’s faith in him. With the help of Caleb Nichols, she hacks into Rehoboam and distributes his assessment of every human being to each respective individual and those close to them. Essentially, she reveals God’s plan for each of them, and practically no one is happy with the future they see, nor with the revelation that their value as people has been boiled down to ones and zeros by an unfeeling machine. A rash of riots and suicides ensues, but season 3 closes on an optimistic note as the destruction of Rehoboam seems to restore the promise of free will for both humans and hosts.

Unfortunately for them, there’s a season 4.

Season 4: Run:Asylum.exe

[Image: A robotic version of the Man in Black (Ed Harris) stands behind Charlotte Hale (Tessa Thompson), in a silky red dress, as she holds a device in her hand, which is marked by some kind of skin affliction. Photo: John Johnson/HBO]

Most of Westworld’s fourth season is set a generation after the end of the previous one, after humanity has already become totally and invisibly subjugated by a new race of hosts made entirely of malevolent copies of Dolores Abernathy. Their leader, Charlotte Hale (Tessa Thompson), has already conquered the world, using an engineered virus to obtain the same level of control over humans’ thoughts and actions that humans once held over hosts back in the Westworld park. Hale’s hosts wield that power in much the same way, using New York City as a playground in which they can do whatever they please. Just as human writers dictated the actions of the hosts in Westworld, the humans of Hale’s city run on narrative loops managed by a storytelling AI called Christina (Evan Rachel Wood). The season follows a handful of humans and hosts who have escaped Hale’s influence and attempt to put an end to her reign. In a bleak twist, however, this fight for freedom fails thanks to the intervention of the madman William (Ed Harris), who uses Hale’s disease to turn her human puppets into crazed murderers. With no way to reverse this programming, it’s only a matter of time before both humanity and the hosts are totally extinct.

There remains a single hope for the survival of sentient life on Earth, though that hope exists only in a virtual world. Christina, unable to prevent the apocalypse, decides to rebuild New York City and its inhabitants from memory on a server and put her simulated humankind through “one final test.” But if the show’s themes hold, it’s not only her virtual humanity that needs to be tested. At every turn, Westworld hasn’t only been about the nature of free will, but also about the evils of control and coercion. Christina/Dolores has now set herself up as a god, just as Hale, Rehoboam, and Ford did before her. She is creating life, but does that actually give her the right to judge or to rule? Is benevolent godhood even possible?

While I’m sure Lisa Joy, Jonathan Nolan, and company had plans mapped out for the fifth and final season, presumably set in Christina’s virtual city, there doesn’t seem to be much left to do but to watch everything go wrong again — which we’ve now seen four times — or to watch Christina play god and be good at it, which doesn’t sound like particularly good television. And, not to dismiss the personhood of artificial intelligence (I have been watching the show this whole time, after all), but the stakes of a final season feel considerably less urgent than those that came before. Christina has decided to test whether or not humanity deserves to survive, but we’re told that humanity is effectively already dead. At most, a closing chapter could model what a worthy humanity looks like, which might be interesting, but it doesn’t particularly sound like Westworld, a series that has demonstrated an incredibly low opinion of human beings from day one.

As it stands, Westworld has already given us plenty to chew on about our modern world and our relationship with artificial intelligence. It’s conceivable that we are building our machine god as we speak, constantly feeding the internet information that it consumes in ways most of us can’t even imagine. If the AI revolution is inevitable — the quiet kind, if not the variety that involves metal skeletons marching in the streets — then we’re left to ponder how we should react to it. Do we become proud, nurturing parents to our creation and hope to see that kindness repaid? Or do we pull the plug before the digital fully evolves into the divine? If we value the twisty, bloody tradition of Westworld, we should not expect a simple answer to any of these questions. We’re better off letting the simulation keep running, if only in our heads.
