Techies and gamers should pay attention to HBO’s Westworld, a major TV show that debuts on Sunday and delves into human-like artificial intelligence. The sci-fi series explores the morality of creating human-like artificially intelligent beings, how we should treat them, and what the difference is between humans and machines.
In a press briefing, I talked with the creators of the show, and during that conversation, video games, virtual reality, and real-world technology came up a lot.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2068499,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"arvr,business,games,media,","session":"A"}']The show is a remake of Michael Crichton’s sci-fi film of 1973, where rich guests can take a vacation in the almost-real theme park of Westworld, which is full of androids who are instructed not to harm the human guests. The human guests can do anything they want, with no consequences, according to the corporation that runs the technological paradise.
The show runs with Crichton’s original idea of the theme park’s inmates turning on their masters. Filmmaker J.J. Abrams broached the idea of a remake 20 years ago, and HBO finally made it happen with executive producers Jonathan Nolan and Lisa Joy. In our group interview, they were joined by actors Jeffrey Wright (Bernard Lowe), Thandie Newton (Maeve Millay), and Evan Rachel Wood (Dolores Abernathy). They were quite passionate about the enduring themes that the show explores and how the technologists of today are heading into the future without thinking about them.
They had thoughts about Google, Facebook, and IBM. The message to Silicon Valley? Be careful. And to gamers? We are what we pretend to be, so we must be careful about who we pretend to be.
Here’s an edited transcript of our conversation.
Jonah Nolan: The project started more than 20 years ago, when J.J. Abrams sat down with Michael to talk about making a film. Nothing came of that, but the idea stuck in J.J.’s head. In 2013 he reached out to us and said he’d been thinking about Westworld again. He thought there was a series there and he wanted to know if we agreed. He also, crucially, suggested that one way to reinvigorate or re-approach the narrative would be to consider the perspective of the robots, or the “hosts” as we call them.
For us, that was an offer we could not refuse, an opportunity to do a show about everything we were interested in, in one go.
Lisa Joy: It was a chance to tell a frontier story on two levels. On one level, it’s on the frontier of science – all the more so now, when what was once pure science fiction is much closer to science without fiction, in terms of the development of A.I. There’s also the Western landscape. The ability to approach that from a new angle was a playground we couldn’t resist.
Question: In the interview you did for Esquire, you mentioned you’d been keeping track of various research projects. You were interested in A.I. Can you talk about some of the projects and people you’ve been watching?
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2068499,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"arvr,business,games,media,","session":"A"}']
Jonah Nolan: We’re very interested in deep learning. The progression of that, even as we were writing the show, with DeepMind demolishing the world Go champion, made for an interesting tech story, but it’s actually more of a — it’s a landmark moment. Much of the coverage from you guys suggested that. People were interested, although I think because Americans don’t play Go, they missed the significance of it. We’re very interested in that company and their research. IBM presents an interesting model with their adoption and exploitation of Watson as an ongoing business model. It’s becoming the core of their business, employing machine intelligence to solve industrial questions.
We had some interesting conversations along the way with some interesting people. There does seem to be a little reluctance among folks to talk about this, because it’s an ongoing — we’re probably tilting too much toward the apocalyptic language people have often used with A.I. But this is an ongoing industrial concern. There’s a lot of money behind what’s happening with machine intelligence, in this town and globally. It was interesting. It was enlightening.
We didn’t want to feel limited by the research we did, though. I’m a believer in doing a bit of research, but not so much you wind up lost in the woods. We read a lot about consciousness, which seems to remain the domain of philosophers rather than computer scientists. A lot of A.I. researchers seem to want to sidestep the consciousness question, for a number of interesting reasons. Some of them because they know it taps into a cultural conversation they don’t want to engage with at this point. Some of them because — one person said, “If you’re asking me a question about consciousness, how it works, does it exist, in some researchers’ theories the simplest answer is that it doesn’t.”
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":2068499,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"arvr,business,games,media,","session":"A"}']
Question: Do you get this feeling that science fiction is inspiring real-world technology, which is inspiring science fiction again?
Lisa Joy: That’s the Platonic idea of the realm of pure forms. If you can imagine it, perhaps somewhere it exists. For a long time, people have been talking about ideas like this, but without the technical wherewithal to re-create it or manifest it. Now science is catching up with the imagination and exceeding it. It’s an iterative relationship.
Question: You were talking about industrial applications. I was wondering if you could explain what exactly that means.
Jonah Nolan: More in commerce. The two biggest players at this point, the ones that have gobbled up all the other companies, are Google and Facebook. Larry Page and Mark Zuckerberg are both on a tear to be the father of A.I., or that’s what it seems like anecdotally. Whether or not they’ll make it, or IBM or whoever else gets there first — it’s a race without a defined finish line.
[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":2068499,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"arvr,business,games,media,","session":"A"}']
One thing we suggest in the show is that even if we could define the finish line, we’d probably be inclined to keep moving it as we grow less and less comfortable with the idea of sentience or consciousness in our creations.
Question: It seems like one of the core questions of the show is, “Who would you be if you didn’t have accountability?” Do you think that lacking those restraints — does that change someone’s personality, or does it merely reveal it?
Jonah Nolan: That’s a very good question, which we’ll endeavor to answer over the season.
Question: It seems like the answer is tending toward sadism. You have to be dramatic, obviously, but that’s what we see in most of the visitors.
[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":2068499,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"arvr,business,games,media,","session":"A"}']
Lisa Joy: Or you can be the hero.
Jonah Nolan: We unfortunately didn’t get as much time, even with 10 hours of storytelling, to cover everything we wanted to cover. But we did design the park as a place that would have a range of different activities.
Lisa Joy: Especially in these early episodes, we’re focusing on the plight of the hosts above ground. Although we introduce characters like the lovely family going for a walk and the cool gunslinger lady who’s there to test her mettle, those are more interludes. In terms of the recurring figure, we’re looking at the man in black, who’s a rather dark figure. I wouldn’t say it’s a judgment of all the guests who go there. But certainly within those episodes there is an emphasis.
As far as the question of what the park can unleash, one thing Jonah and I talked about is how, when you go to the bookstore, the one aisle that’s definitely not empty is the self-help section. The place where people go hoping that there’s the thing they can do to change – to be less shy, to be more charming, to lose weight, to be more aggressive or assertive, to get over their drinking problem. We’re plagued by demons, or things we think are demons but maybe aren’t necessarily, that shouldn’t be pathologized as demons. There is something that seems to be a common denominator. All humans have those things nipping at their heels.
[aditude-amp id="medium5" targeting='{"env":"staging","page_type":"article","post_id":2068499,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"arvr,business,games,media,","session":"A"}']
Jonah Nolan: The park is no more sadistic than spending four hours watching your friends play Grand Theft Auto. I think Crichton, when he was writing the film in ’73, anticipated several things. There’s the sequence in which the head scientist, trying to figure out what’s going wrong with the hosts, casts around and says, “It’s as if the problem is spreading from host to host, like a bacterial infection.” It’s a computer virus, even though the film was written two years before the appearance of the first computer virus.
Here’s a guy, Crichton, who was very smart, a polymath. He spent a lot of time thinking about technology and where things were going to go. In the era of Pong, he anticipated Grand Theft Auto. My wife is the only person I’ve ever seen play Grand Theft Auto and actually obey the traffic signals—Ford comments on this in episode four. He and his partner Arnold built the park with an eye toward open space. They built 100 happy storylines. And everyone went with the more sadistic, or at least the more self-aggrandizing, experiences. Exercising power, as he says.
It seems to us, based on what’s happening with video games – almost all of which feature violence to some degree, almost all of which feature a sort of heroic story in which you conquer things – that the uptake in the park for those narratives would be pretty high.
Question: We’re seeing a controversy around A.I. right now, but also around more basic things like social networks. The question is whether the people creating these are programming implicit biases into the algorithms, the way they sort content. When you look at the world, do you think of the big products that people use, like Google or Facebook, having implicit biases in them that maybe aren’t purposefully doing anything wrong, but potentially doing something harmful?
Jonah Nolan: Well, she and I aren’t on social.
Lisa Joy: Technically I’m an egg on Twitter. I have four followers. I think they’re bots. Or you guys.
Jonah Nolan: I think social is batshit crazy. My last show dealt a little more with this. Totalitarian regimes around the world struggled for decades to build—Raul Castro’s first job was to break down the biggest enemy of a totalitarian state, which at that point was what they called an “informal social network.” It’s vanishingly easy to figure out who your relations are, who your teachers were, who your co-workers are. But it was once very difficult to figure out who your friends were. It’s a very subtle thing. It doesn’t show up on a census form. The way Raul Castro cracked it in Cuba was he recruited a spy on every block to inform as to who was friends with whom. Dissident movements are difficult to figure out. It requires a lot of man-hours.
In America, we don’t remember anything about totalitarian regimes, although one appears to be heading our way in about two months. Hopefully not. But we decided to hand over that information anyway.
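Nolan’s point about the legibility of friendship is easy to make concrete. Here is a minimal sketch in Python, where the interaction log, the names, and the tie-strength threshold are all invented for illustration: repeated contact between a pair of accounts stands in for the friendship that a census form never captures.

```python
# Hypothetical sketch: inferring "who is friends with whom" from an
# interaction log, the kind of data platforms collect by default.
# The log, names, and threshold are illustrative assumptions.
from collections import Counter

interactions = [  # (person, person) pairs: messages, tags, check-ins
    ("ana", "ben"), ("ben", "ana"), ("ana", "ben"),
    ("ana", "carl"), ("ben", "dora"), ("dora", "ben"),
]

# Count each unordered pair; repeated contact suggests a strong tie.
ties = Counter(tuple(sorted(pair)) for pair in interactions)
friends = [pair for pair, count in ties.items() if count >= 2]
print(friends)  # [('ana', 'ben'), ('ben', 'dora')]
```

What once took an informer on every block reduces, in this toy form, to a few lines run over data that users volunteer.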
Lisa Joy: One thing I will say, in terms of implicit bias, everybody has one. That’s one of the things that is helpful to recognize about yourself if you’re trying to overcome bias. No matter how pluralistic you think you are, you come at things with a set perspective. Whether you’re talking about creation as creating a child and informing that child’s views, or creating a work of art, or creating technology, your own implicit bias is a transferable thing.
What’s the answer to that? Socially, part of it is—you counteract implicit bias with conversation and discourse and a plurality of people who create art and technology. In that way you try to get a more expansive idea of the world, a more expansive idea of people’s opinions.
Question: Aren’t there implicit biases in every information source, in all of these tools? Twitter, I find, is a very versatile tool, but it can be used to enhance your own personal biases by surrounding yourself in the doomsday bunker with like minds who don’t challenge your perspective. The tool is shaped by the hand, to some extent.
Thandie Newton: I feel like the greatest control, the greatest power, comes from the fact that people don’t seem to have the awareness that they’ve been programmed, and that others have as well. They’re being steered in hidden ways by marketing and so on. If people tap into the awareness that their instincts aren’t necessarily just coming from their own personal code – that they’ve been programmed by stimulus, the nature/nurture thing—that’s the control.
It’s so incredible, just as a personal Twitterer, looking at some people’s feeds and people arguing and how much people take things personally. This person doesn’t know you. They don’t know anything about you other than these 140 characters. And they get into these fights that go on for hours. That suggests to me that they don’t have any awareness about the fact that they are not their thoughts. They are not their feed. That’s just ego.
That’s one of the things that was so beautiful about working on Westworld. By playing these robots, this artificial intelligence, you realize that we’re no different from people. It’s a metaphor in some ways. We are so still and present and powerful, but that’s just because we have one focus. We’ve been programmed to do one thing really well.
We could all be like that. Arguably, if you’re very successful in your field, it’s because you’ve found your focus. You’ve stripped away any baggage that stops you from improving your position, and you drive forward. But there are so many people who get stuck in the personal, in the ego, because they don’t realize they’ve been programmed to get irritated by Twitter, to hate that person. We’ve been fucking influenced to be that, while other people, who are truly in power, are busy controlling the world as we argue with our lack of awareness and our complicated, defensive egos.
Evan Rachel Wood: One of my favorite ways to use social media is to see the patterns, see the quotes that everyone seems to be using that they found from a news source. “Oh, this is repeating all over the place.” It’s obviously a program. You can find the consistencies and the patterns because you see the chain reaction and how it’s growing. It informs me on what to think twice about sometimes, to question if I’m being programmed or not because I see it in other people.
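Wood’s habit of spotting recycled quotes amounts to a simple text check. A hedged sketch follows, with the posts, the function, and both thresholds invented for illustration: it flags word sequences that recur nearly verbatim across many distinct accounts.

```python
# Hypothetical sketch: flag phrases that appear, nearly verbatim,
# in posts from many different authors. All inputs are invented.
from collections import defaultdict

def repeated_phrases(posts, min_accounts=3, ngram_len=6):
    """Return word n-grams used by at least min_accounts distinct authors."""
    seen = defaultdict(set)  # n-gram -> authors who used it
    for author, text in posts:
        words = text.lower().split()
        for i in range(len(words) - ngram_len + 1):
            gram = " ".join(words[i:i + ngram_len])
            seen[gram].add(author)
    return {gram: authors for gram, authors in seen.items()
            if len(authors) >= min_accounts}

posts = [
    ("alice", "this quote is repeating all over the place today"),
    ("bob", "wow this quote is repeating all over the place"),
    ("carol", "saw it too, this quote is repeating all over the place"),
]
print(repeated_phrases(posts))  # the shared six-word runs, with their authors
```

Seeing the same six-word run surface from three unrelated accounts is exactly the “chain reaction” she describes.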
Newton: It’s amazing. I’m fascinated by human behavior.
Wood: We wouldn’t be doing the show if we weren’t.
Newton: Yeah! But it’s funny, because we’re talking about technology, talking about artificial intelligence. What I’m fascinated by is intelligence in general, which comes from us. We’re obviously trying to mimic that in the way we program. You look at a plane flying, and that’s inspired by things that really do fly, natural things. It’s amazing how people are influenced and how unaware they are of that influence. That, for me, is where the real power is.
Jeffrey Wright: The miscalculation was more that these tools would usher in an age of information, rather than an age of misinformation. That’s the thing that, for me, was most surprising. We’ve just added to the chaos as opposed to any clarity.
Wood: We’re in an era of pull technology, of self-selecting reality now. When you watched the news to see what was going on, you had what you at least thought was a reliable narrator. You only had about five stations and two newspapers. There was a whole industry that pushed the same information out to people, and people received it and the accompanying debates. There was a certain level of moderating going on.
Question: Also uniformity.
Wood: Yeah, there was a uniformity, a kind of fact base that was relatively uncontested. I’m not even making a judgment right now. I’m just explaining what I think is an interesting and very important change that’s happened. Now, if I don’t want to—it becomes very easy to become entrenched in your opinion. You pull your own news stories. You find your favorite websites and people to follow on Twitter.
Newton: And anybody who doesn’t like that threatens you personally. Which is insane.
Wood: It’s really interesting. Even fact-checking is difficult to—
Newton: It feels like there was a time when honesty was a moral code that people didn’t break without feeling bad about it, without a real crisis of conscience. But look at the debate three nights ago: honesty is selective. Honesty is not something we all absolutely adhere to and assume that everyone else adheres to as well. It’s now part of the manipulation, whether you’re truthful or not. You can be dishonest.
What was it, in Tom Sawyer? A lie can get halfway around the world before the truth has got its boots on. It’s a tactic that’s now used.
Question: And knowing that people will lie to you because they lie to themselves.
Newton: Sure. That’s one of the most difficult lies to unravel.
Question: There’s an argument you can make that this is a show that’s very relevant to Silicon Valley, to the creators of technology. What would you say is the message you would send them?
Jonah Nolan: I wouldn’t say there’s a message. I don’t think the show is necessarily teaching anything. You have to be very careful.
Newton: It’s a fantasy, as well.
Jonah Nolan: First and foremost it’s entertainment. I think we’re asking, hopefully, interesting questions about human behavior, about our appetites, and about the moment that seems imminent and urgent, in which our creations begin to ask questions of us. They begin to regard us. For us, that was the key jumping-off point for the series. It’s not about us talking about A.I. It’s about A.I. talking about us.
Newton: It’s a great way to pose those moral questions.
Question: In preparing to play robots and a tech industry employee, what did you consult to get into your characters? Were there individuals who you thought your characters were like?
Wood: The Singularity is Near became my Stanislavsky actor’s handbook. [laughs] It did. I kept it in my trailer. It informed me a lot. You have to ask yourself different questions as an actor when you’re playing an artificial intelligence, a robot. You don’t work the same way, literally. There’s a whole deeper layer to it. The more I did research that way, the more I feel like — whether or not it’s obvious on screen — it informed the performance.
I talked a lot with Jonah and Lisa. They gave such beautiful direction in guiding us to where we needed to be. Having the knowledge and knowing the answers to a lot of the strange questions we had to ask. We called them “bot thoughts” on set. “I got a bot thought. Should I be squinting in the sun?” There are so many things you have to bring up.
Newton: But then you’re fucking told that no, they don’t blink overly. So you’re doing this thing in the sun — I swear to God, I could tell sometimes….
Wood: Oh, God, I’m fighting that sun so hard.
Newton: There was one scene where Evan and I have to stop in the street, and we look at each other, and we try to say something incredibly fucking important. I have to hear it, and it lands. And then we walk away, and we’re both just like—tears streaming down my face, just trying to keep my fucking eyes open, because it was so fucking bright. And we’re robots that don’t blink in the sun. We don’t have a problem with that.
I’m grateful for the opportunity to play these roles. It was so interesting to hear how other actors were playing every one of them, because none of this was real. This was fantasy. We’re creating a world that could go on for a very long time in parallel to our world. I just felt that the robots were innocent, like babies. They were beautiful. They were perfect. It felt like they were better than us, because we as adults are so polluted with information and confusion and whatever, whereas babies just—the clarity of thought and presence. The robots, to me, felt like that.
Really, what affects the robots, what makes them so beautiful and so perfect, is how they’ve been programmed. That’s reflected in human beings, to me, too. How a person has been influenced, what they’ve chosen to take on to them as important or not, how they’ve educated themselves, the experiences they look for in life, those are the measure of who you are. I think I said this one time, that I felt more exquisitely human as a robot. These characters are so clear, and it’s all down to the programming.
Wright: From my perspective, we used Jonah and Lisa as….
Newton: Encyclopedias.
Wright: Encyclopedias for a consistent understanding of the parameters of the place, so we’re all on the same page. Certain rules we had to understand. But again, building it from the ground floor and having this wonderful, surreal, poetic writing to work with — I took a different tack with it. I didn’t go up here and hang out with the guys. What was more interesting to me were the levels of this question of consciousness, in terms of replicating human behavior and human-ness.
What I found the kind of mirror reflection was what we go through as actors. That became an interesting meditation. As we sit and either manipulate or retrieve memories that evoke certain ideas or certain emotions, likewise it’s the same thing that we as creators are trying to embody, if you will, in the host, in the creations. Playing with that reflection was the more interesting thing to me.
Question: So you’re saying you didn’t come up here and hang out with tech employees and do a lot of reading about the themes.
Wright: This isn’t an anthropological study. We don’t specifically say what time period we’re in. But no. Particularly, again, we’re creating this anew. It’s fiction. I was more interested in just the idea of imagining, as opposed to referencing. But still with a level of authenticity.
Newton: Do you think it feels like a real projection into the future, how it would be? Does it feel — not, is it really going to be like this, but do you think we got it right?
Question: I’m almost on the fourth episode, and I’m torn between a bunch of questions. One is, is there someone who’s mysteriously actually not a robot, or is a robot? It makes me think about how real A.I. can become in the future. And then the other thing is, what is everyone’s intention? You’re freaking me out. I’m not totally sure about where you’re coming from. Same with everyone else. You’ll get that gleam in your eye. [laughter] But then I think, maybe A.I., or even people….
Question: It’s like she says in episode two. If you can’t tell the difference, what’s it matter?
Newton: Part of what’s so important are boundaries, incredibly important. We’re all talking about boundaries, about divisions in nations and all sorts of things. That’s what’s crucial to Westworld, that what happens in Westworld stays in Westworld. You literally go through a door, get on a train, and transition into a place. What’s going to be fascinating is when those boundaries become messy, which you see – these breakthroughs of dreams.
Wood: There’s a bigger picture of what you would want to be using A.I. for. It’s mentioned in the pilot. What is the real interest here, and in the hosts? When you think about all the possibilities and how they could be manipulated and what you could use them for, good or evil—there’s endless possibilities.
Question: It seems like one of the questions you’re grappling with is the morality of — to use the Grand Theft Auto example, at what point does it turn over? It seems like the show takes a position that to treat these people in this way, with so much violence and sexual violence, is an immoral act. Those are the black hats.
Newton: To lie to them, to make them feel that they’re human when they’re not. That’s the biggest betrayal. It’s been so incredible in the last few days, talking to people and having genuine conversations, as opposed to just, “How is it playing this role?” The contribution of the viewer is critical to it. I feel like we’re both the same in how we’re appreciating and relating to this material. The conversations that can be had as a result are really a value to us. We’re not just talking about the show. We’re talking about real life, real people, real values, real boundaries.
Question: I’m curious about that. Watching the show, at the point where you wonder why this black-hat guy, the bachelor party guy — why is he so vile? And then you realize that this is what you’d do in a game like Grand Theft Auto.
Wood: This is what people do in real life. It happens every day. It’s happening right now. It needs to be explored, and that’s why it’s amazing to see from an objective standpoint, from something that’s like a human being, but not. What would we look like to other beings? What do we look like? We’re not making this up. This is stuff that’s very much a part of the world. All the good and the beautiful love stories we’ll explore in the show, the people that choose to be heroes, but also the people that choose, whether in Westworld or not, to be sadistic and vile.
Question: But we don’t put that in a theme park.
Newton: Don’t we? Wouldn’t you say sex tourism is like a fucking theme park? If you have enough money — just because it’s not above board, just because people don’t talk about it, then apparently it isn’t there?
Question: In terms of theme parks, though, and especially how the world has moved since the original Westworld, it’s more about — I don’t want to say infantilization, but a sort of Disneyfication of everything. Everything is Disney now. You have Harry Potter and Universal and so on.
Lisa Joy: It’s an interesting thing, to mention Disney in relation to this. We thought about Disney a lot. When we started thinking about Dolores’s character, in some ways she’s the quintessential Disney princess. But with a twist. [laughs] For us, and I know you guys all know this feeling — we just had a daughter. You work so hard to try to find books and movies that won’t traumatize the hell out of her. Something where the exciting element isn’t literally death. “Oh my God, I just saw my parents die in front of me!” And it’s the start of a movie about a deer.
There’s a certain amount of trauma and violence that seems to be coded into iconic fiction, in big stories throughout time. The question becomes, why? There are different answers to it. Some of them are cynical and some of them are idealistic. The cynical answer is that there’s a part of us that’s sick and twisted at heart, that wants to see that. Some kind of wish fulfillment. For some people, perhaps that’s the case.
But there’s another side to it, in which fables and stories are cautionary tales. We’re not robots, but there are ways of running simulations around scenarios that we haven’t personally experienced yet. We want to gauge what the proper reaction would be. In doing so, it helps us model worst-case scenarios, best-case scenarios, everything in between. I don’t think it’s an accident, sometimes, that children’s stories are so violent and so dark. It comes from parents preparing their children with a story that won’t hurt them – they can say it’s just pretend – but some kernel of truth from that becomes coded into their psyche and serves as a platform for learning.
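Joy’s framing of stories as rehearsal loosely parallels how engineers probe situations they haven’t lived through: run many randomized trials and read off the spread of outcomes. A toy sketch, with every quantity invented for illustration:

```python
# Toy Monte Carlo in the spirit of "stories as simulation": rehearse
# an imagined scenario many times and gauge worst, typical, and best
# outcomes. The scenario and numbers are illustrative assumptions.
import random

def rehearse(trials=10_000):
    outcomes = []
    for _ in range(trials):
        danger = random.random()       # severity of the imagined threat
        preparation = random.random()  # how well the lesson was absorbed
        outcomes.append(preparation - danger)
    outcomes.sort()
    return outcomes[0], outcomes[len(outcomes) // 2], outcomes[-1]

worst, typical, best = rehearse()
print(f"worst={worst:.2f} typical={typical:.2f} best={best:.2f}")
```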
Jonah Nolan: Stories are our oldest form of simulation. Storytelling is something that the show is interested in. It’s something we’re interested in. It appears to distinguish human consciousness from other animals that we share this planet with, our ability to tell a good fucking story.
Newton: And code information within that story, so it’s memorable.
Wood: Look at the Bible, all the violence and craziness. That makes Westworld look tame, you know?
Newton: Fear is a great way of remembering something. When your kid gets lost in the supermarket and you find them, you immediately shout, “WHAT HAVE YOU BEEN DOING?” Because you’re terrified. That fear will impact them and make them remember in a way that “Hey, is everything cool? You just got lost for a second, did you?” doesn’t. It’s not necessarily a clever tactic, but it works. It can be misused. Again, the awareness factor is key.
Wright: Another angle, another facet, is this idea of story as consciousness, which we play with. Whether consciousness exists or not, it certainly exists for us through story, as a collective consciousness. The history of literature, and whatever sources you consider our collective consciousness: it’s something I hadn’t considered before we began telling this story.
Jonah Nolan: We’ve figured it out. [laughs]
Wright: As I say, this is why we rely on Jonah and Lisa to handle all the existential questions.
Question: Is Westworld a physical place, would you say, or a virtual place? To me the answer makes a huge difference. If this is a Star Trek holodeck, it’s a virtual place. You can do anything there. You’re encouraged to consider it pretend, and when you walk out it’s not there anymore. Just like in Grand Theft Auto: you can be very bad because there are no consequences. But if it’s a real, physical place and all this stuff has real consequences, then this idea of no consequences is a lie, and I have a better feeling for how I should behave in Westworld.
Lisa Joy: It’s funny, because again, is morality a circumstantial thing or a personal thing? In that way, morality is something that’s based on rendering. How detailed is the image? Because you’re doing the same things; you’re just doing them to something that’s more detailed and more lifelike than another. Your behavior is the same in both cases. If it’s one way in a video game, and then you have the game as a virtual reality, and then you have the virtual reality manifest in automatons playing out the exact same scene, does it suddenly become immoral because they’ve become too lifelike?
Jonah Nolan: Part of the reason why there isn’t great outcry — there are some conversations about it. I’m conflicted about it myself. But there isn’t a great outcry about the morality of people playing video games because the simulations aren’t very good. They’re great compared to what we had when we were kids, but they’re still distinctly un-real.
Lisa and I had an experience when we went to talk with HBO about budget. We wound up getting put into a small room in their headquarters in Santa Monica, a windowless room, and we demo’d the HTC Vive about a year before it became commercially available. We were both kind of stunned by the experience, what we call “reality shock.” Here was the first simulation that felt so immersive and attractive and engaging that coming out of it to a windowless office — it was a bit like, “Oh, shit, I want to go back in.”
We were struck, on the drive home, by how much time our daughter spent in simulated realities versus base level reality. It was a very instructive, very interesting experience. With Westworld we’re looking beyond the immediate—VR is clearly poised now. It works. It doesn’t make you sick. It’s incredibly immersive. It’s going to explode as the evolution of interactive storytelling. We were looking for the next step. You imagine that this story is taking place in a world where, if you have the means to do so—VR is what everyone else does. That’s basic. This is reality. That’s what they’re selling. Forget VR and come here, because it’s real.
That introduces a problem. Morality isn’t a problem in video games because, as a character says in the pilot, the simulation is poor enough that you don’t conflate the experience. In film and television, we struggle to make violence as real as possible. We look at visual effects shots and blood splatter and say, “Oh, it doesn’t quite look real, do it again.” We’re looking for verisimilitude, but that two-dimensional screen is a distancing device. As our simulations get more granular, more perfect, we will be confronted with this problem. Now it feels too real.
The morality of what you do in that world becomes a lot more confusing when two things happen: when the simulation is indistinguishable from reality, and when the intelligence of the non-player characters you interact with eclipses a certain level. Then it’s much more problematic. Driving around in Grand Theft Auto and running over a bunch of pedestrians is happening right now.
Question: Asking the actors: Have any of you tried out VR?
Wright: I have, yeah. Just once. My son is trying to change all that. He’s insisting that his birthday’s coming up. But I had an experience at the U.N. about two years ago, which was mind-blowing. It was a virtual tour of Monrovia with an Ebola survivor, a walk through that experience. Her sickness and progress to recovery, and also the consequences of the epidemic on the city. It was staggering. The narrative I found a little concerning. It was a bit missionary in its tone, which reflected this weird judgment that these people survived because they’d been saved and those who didn’t survive hadn’t. But aside from that, it was very powerful. As a tool for empathy, in this case, it was seriously effective.
Newton: I was once asked if I’d be a voice for a new game, which was as immersive as things are these days. I was asked if I would be the voice of a Congolese woman, and the game was set in Bukavu, in the eastern DRC. The baddies were the different militia groups taking over towns – not raping women, although that’s what happens, but just horrible violence – and you were one of the aid workers, a relief team going in and killing people who were doing bad things, saving the communities.
This was a job offer to me because I work as a human rights activist. I’ve been to Congo fighting for women’s rights, ending sexual violence. Congo is a terrible place for that. When I questioned this, because it was one of the most revolting things I’d ever been asked to participate in — I couldn’t believe they had sent this my way, because they thought I would find it interesting and useful. They said the justification was that this would raise people’s awareness about what’s going on in Congo. I said, “But in an immersive experience where you’re shooting people?” It made no sense to me at all.
It was interesting thinking about how these immersive experiences are — we have to be so careful about the desensitization. If it’s virtually real — like Lisa was saying these are wonderful opportunities to teach, to communicate, to warn. But if they’re being used to desensitize, to isolate, to misinform people about what happens in different parts of the world, it’s just fucking terrifying, frankly.
Lisa Joy: Same thing with the show. We’re worried about the violence, the sexual violence, but you also have an obligation as a storyteller to raise awareness and show the horrors of that, so that people aren’t desensitized to it. I don’t think there’s anything titillating about what we’re doing. It’s all horrific, as it should be, in its context.
Newton: The context, but also the cost. One of the brilliant things about having these 10 episodes is we get to see the consequences and ramifications of this violence, the cost of this violence. We look at it from so many different points of view. The perpetrator. The person who’s affected by it. The people who are complicit by being around it. When do you ever really get a narrative where you get to see it from all those different points of view? That’s incredibly valuable. But the only way we can look at it is by showing it. That’s not literally showing everything, because I think the show is incredibly good at just hinting at things.
Lisa Joy: We don’t directly show sexual violence toward women.
Newton: But you have a very strong sense. You notice — I read a review in the Hollywood Reporter that pointed this out, and I was glad that it did. It’s not about the special effects. It’s not about a lot of modern film technology. It relies on old-fashioned storytelling. The pace of it is beautifully traditional, almost. It’s relying on performance and character. All those things allow you to feel more, as opposed to lots of shocking bang-bang-bang and kinetic movement that just scrambles your fucking brain. This is gentle, but the message is strong.
What it asks you as an audience to contribute is very powerful, too. It’s not like, “We’ll show you this, and then distract you with something else so you forget you’ve just seen something so fucking disgusting,” where you don’t have time to sit with it and process it and challenge it in your own mind. People don’t think about all those details.
But it’s hugely responsible, sensitive filmmaking to, first of all, be brave enough to put this stuff out there, because it’s the opposite of what we want to promote as a team. It’s brave to have that material there, and to allow both us and the audience time to investigate these things. I’m so grateful for that opportunity, as a participant and as a viewer.
Question: We probably won’t get a Westworld video game with a lot of shooting, then?
Jonah Nolan: It would be deeply ironic. [laughter]