The father of virtual reality sounds off on the changing culture of Silicon Valley, the impending #MeToo backlash, and why he left Google for Microsoft

Widely recognized as the father of virtual reality, Jaron Lanier has been hugely influential in shaping the technology of today. Lanier's work is considered foundational to the field of VR; he's spurred developments in immersive avatars, VR headsets and accessories, and was involved in early advancements in medical imaging and surgical simulator techniques. He's also credited with coining the phrase "virtual reality."

In addition to his work as a programmer and inventor, Lanier is a prolific author and celebrated tech critic. His most recent book, 'Dawn of the New Everything,' explores his upbringing in New Mexico, his years pioneering virtual reality in Silicon Valley in the 1980s, and his experiences working with pre-eminent scholars, critics, scientists, and developers.

Lanier sat down with Business Insider's Zoë Bernard and Steven Tweedie to chat about his latest book, the current debate over the impacts of social media, his decision to join Microsoft after working at Google, and whether or not artificial intelligence will eventually wreak havoc on humanity.

This interview has been edited for length and clarity.

Steven Tweedie: In the last year, we've seen an adjustment to expectations when it comes to the consumer market for virtual reality and the hype around VR in general. What would you say to those skeptical of whether or not it will take off?

Jaron Lanier: Let’s break this down just a little bit. First of all, there’s one side of VR which is the industrial side, not the consumer side, that’s been a total success.

I’ll give you a very personal story from my life that’s an example of it. In the book, you’ll read about the surgical simulator, which dates back to the ‘80s. I did that with a few people, Dr. Joe Rosen, for example, who is a Stanford Med guy. In the last couple of years, my wife has been battling cancer and she had a bunch of operations. She’s post-cancer now, but one of her surgeons for the most difficult operation was a student of a student of Joe Rosen’s, and he used a procedure that was designed in a surgical simulator that evolved from the original one, and he trained in one. Since I’ve worked more on that side of things than the consumer end, I don’t have any doubts about whether or not VR is going to happen. For me, it’s been great. I think this is an established technology. I’m really proud of what we’ve done. But I’ve also played around with the consumer side a lot, starting with the Power Glove, which a lot of people still have a bit of fondness for, which charms me.

By the way, I was supposed to be interviewed by Leonard Lopate on WNYC in the morning, and I just got this email that he’s been fired for sexual misconduct, 'so we’re finding another host to interview you.' The same thing happened to my interview with Charlie Rose last week. It’s hard to promote a book right now because all of the people who are supposed to interview me keep getting outed for sexual misconduct.

Tweedie: Yep, it's been non-stop — our Entertainment team has been quite busy for the past month or two. So on the consumer side of the VR market, Sony's PlayStation VR headset is leading the pack when it comes to sales, and there seems to be genuine interest in the gaming side of VR and augmented reality (AR) — what are your thoughts on how those markets will evolve?

Lanier: Sony has found some success with headsets, and there has been some pretty good adoption of phone-and-holder viewers for things like news clips — The New York Times has been a pioneer in that. And Pokémon Go needs to be mentioned. Pokémon Go was super crude, barely over the line of usability, and yet there it was, and it engaged a lot of people and gave us a taste of mixed reality in a wide area. People like it; it makes sense. I feel like we’re doing fine, actually. For me, this is what a new market looks like. I don’t know what people are expecting. Do you know what it is? Everybody is still in this weird post-Steve Jobs period where they want that big thrill of the iPhone intro, and those things just don’t happen a lot.

Tweedie: You've been involved with Microsoft's HoloLens headset, so I have to ask you about one of its competitors, Magic Leap, which one investor compared to the first time he experienced multi-touch technology, a key selling point of the iPhone. What's your opinion on Magic Leap?

Lanier: I want nothing more than for Magic Leap to ship and thrive. I think it would be really good for everybody, and I really hope they do, I think it’d be great. I don’t know if they will, but I hope they do. You can’t just have a single vendor in something. You can have a most innovative vendor, you can have a vendor who's ahead, but you can’t just have a single vendor. That’s not a market.

Tweedie: You've been at Microsoft for around a decade, is that right? How'd that come about?

Lanier: Well, it depends on how you count it. Never in a million years would I have expected that I would work at Microsoft Labs, but it’s been a brilliant, amazing thing. I was a critic of Microsoft in the ‘90s, and I’ve always been a bit of a radical purist, and Microsoft was the punching bag for people like me for a long time.

How I ended up at Microsoft is really simple. Sergey [Brin] told me, “We don’t want people writing all of these controversial essays,” because I’ve been writing tech criticism for a long time. I’ve been worried about tech turning us into evil zombies for a long time, and Sergey said, “Well, Google people can’t be doing that.” And I was like, really? And then I was talking to Bill Gates and he said, “You can’t possibly say anything else bad about us that you haven’t said. We don’t care. Why don’t you come look at our labs? They’re really cool.” And I thought, well that sounds great. So I went and looked, and I was like, yeah, this is actually really great.

Zoë Bernard: I wanted to ask you about Silicon Valley. You’re living very close to there, in Berkeley. What is your perception of how the culture has changed?

Lanier: Well, the tech world has such incredible stories of quick money, quick power, and quick status, that I think it’s made people a little drunk and crazy, and also a little shallow, and that makes me a little sad. The amazing thing about the old days was that you could have some people in a room from early Silicon Valley, and one of them might be a billionaire, one of them might be living out of a car, and what it was all about was how much you could do. We respected technical ability over money, and I think that was a really healthy and interesting culture. And now it’s gone. Sure, broadly speaking, in the whole world, hacker culture still exists, but Silicon Valley and San Francisco have both become so intense. For one thing, you can’t afford to live there unless you’re doing really well, so a lot of people have been priced out. And I’m not down on anybody, I mean, I live there. But if you’re asking me how it’s changed, that’s how. There’s this thing that happened which is that there’s more diversity of ethnicity and background perhaps, but less diversity of cognitive style. If you have a certain kind of nerdy, quantitative problem-solving oriented cognitive style, that will get you more friends, and that will get you along better than if you have a more contemplative, aesthetic center.

Bernard: You mentioned the lack of cognitive diversity in Silicon Valley. Do you think it influences the technologies being created there?

Lanier: Sometimes I do. A lot of the tools we have tend to be more usable by people who are similar to the engineers who made the tools. It’s not always true, but in general it’s a principle that seems to take hold. Engineers are designing things that work better for people who are similar to the engineers, and that turns into a social effect that favors and disfavors certain classes of people.

Tweedie: It seems like that would just lead to more isolated communities and some people thinking they're smarter than others.

Lanier: This is an ongoing conversation and argument that goes back for years. If I’m in an environment with a bunch of technical men, and I say, you know, we’re doing this thing that excludes people, they’ll say, “What are you complaining about? At least you’re on the good side of it.” And my response is, “Actually, from a purely selfish point of view, it does hurt me because I’m in this weird echo chamber where I’m being told ‘you're a hacker, you’re a technical man, you’re a white man’” and it becomes this ongoing reinforcement where you’re that thing — but the thing is this total artificial bullshit classification that just happens to rise from the resonance of this stupid tool. So while I’m on the beneficial side of it, in some ways, it forces me into this box. I think this kind of thinking hurts everyone, even the people who appear to be the beneficiaries of it. They’re forced into a place where they can’t reach their full potential.

Bernard: In your first book, 'You Are Not a Gadget,' you wrote about how technology is doing us a disservice, and that computers are not yet worthy to represent people. You wrote that almost ten years ago — have your views changed at all?

Lanier: I like to think that my views are always changing. I’m always interested in re-examining my stuff and seeing if I can find some way to make it better. But that general principle — that we’re not treating people well enough with digital systems — still bothers me. I do still think that is very true.

Bernard: What do you think about programmers using consciously addictive techniques to keep people hooked on their products?

Lanier: This was an open secret for a long time. Maureen Dowd published an interview with me in The New York Times that talked a little bit about it, and then the next day, Sean Parker, who I used to know, admitted to it and said, “Yeah, we did that.”

There’s a long and interesting history that goes back to the 19th century, with the science of behaviorism, which arose to study living things as though they were machines. Behaviorists had this feeling that I think might be a little like the godlike feeling that overcomes some hackers these days, as though they have the keys to everything and can control people.

So if you zoom ahead to the 1950s or so, Norbert Wiener, one of the founders of computer science after Alan Turing and John von Neumann, wrote a book called 'The Human Use of Human Beings,' and in that book he points out that a computer (which at that time was a very new and exotic device that only existed in a few laboratories) could take the role of the human researcher in one of these experiments. So, if you had a computer that was reading information about what a person did and then providing stimulus, you could condition that person and change their behavior in a predictable way. He was saying that computers could turn out to have incredible social consequences. There’s an astonishing passage at the end of 'The Human Use of Human Beings' in which he says, “The thing about this book is that this hypothetical might seem scary, but in order for it to happen, there’d have to be some sort of global computing capacity with wireless links to every single person on earth who keeps some kind of device on their person all the time and obviously this is impossible.”

The behaviorists got pretty far in understanding the kinds of algorithms that can change people. They found that noisy feedback works better than consistent feedback. That means that if you’re pressing the button to get your treat, and once in a while it doesn’t work, it actually engages your mind even more — it makes you more obsessive, whether you’re a rat, or a dog, or a person. And the reason why is that the brain wants to understand the world and if there’s this thing that isn't quite working, your brain just keeps on trying to get it and wants to figure out how to build a better model. So you can really grab the brain that way.
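
(A rough aside for technically minded readers: the "noisy feedback" Lanier describes is what psychologists call a variable-ratio reinforcement schedule. The short Python sketch below is our illustration, not Lanier's; the learning rate and reward probabilities are arbitrary assumptions. It uses a simple prediction-error learner to show why intermittent rewards never stop being surprising, while consistent rewards quickly become boring.)

```python
# Toy sketch (assumptions ours, not from the interview): a learner tracks its
# expected reward per button press and we measure how "surprised" it is on
# average. Under a consistent schedule the surprise decays toward zero; under
# an intermittent (variable-ratio) schedule it never does, which is the
# mechanism behind "noisy feedback works better than consistent feedback."
import random

def average_surprise(reward_probability, presses=10_000, learning_rate=0.1, seed=0):
    """Average absolute prediction error over `presses` button presses."""
    rng = random.Random(seed)
    expected_reward = 0.0
    total_surprise = 0.0
    for _ in range(presses):
        reward = 1.0 if rng.random() < reward_probability else 0.0
        prediction_error = reward - expected_reward  # how wrong the learner was
        total_surprise += abs(prediction_error)
        expected_reward += learning_rate * prediction_error  # refine the model
    return total_surprise / presses

if __name__ == "__main__":
    print("consistent reward  :", round(average_surprise(1.0), 3))  # settles near zero
    print("intermittent reward:", round(average_surprise(0.5), 3))  # stays high
```

Running it, the consistent schedule's average surprise collapses toward zero while the fifty-percent schedule's stays high, which is the structural reason slot machines, and feeds built like them, lean on unpredictability.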

The results from the behaviorists’ research transformed the gambling industry and made it what it is today — an algorithmic, person-manipulation industry. People are driven by emotions, and some emotions are cheaper, more efficient ways to engage us. Negative emotions get you first. Fear, anger, resentment, jealousy, and insecurity grab you, and it’s easier to renew them and keep you grabbed than positive things like nurturing, adoration, and appreciation of beauty. Those emotions are softer. They’re easier to kill and harder to nurture in an audience. There’s an unfortunate imbalance. So, according to Sean Parker, this kind of programming was put in intentionally [in Facebook’s design]. I wasn’t in the middle of Facebook, but my memory of those days — how people were talking and what was going on — is a little different. I don’t think that it’s so much that people were evil geniuses saying, “Let’s take the worst of behaviorism and manipulate the entire world.” I think what they were doing was: let’s maximize the efficiency of our algorithms for a purpose.

Tweedie: That purpose being engagement?

Lanier: Well, this is maybe the greatest tragedy in the history of computing, and it goes like this: there was a well-intentioned, sweet movement in the ‘80s to try to make everything online free. And it started with free software and then it was free music, free news, and other free services. But, at the same time, it's not like people were clamoring for the government to do it or some sort of socialist solution. If you say, well, we want to have entrepreneurship and capitalism, but we also want it to be free, those two things are somewhat in conflict, and there’s only one way to bridge that gap, and it’s through the advertising model. And advertising became the model of online information, which is kind of crazy. But here’s the problem: if you start out with advertising, if you start out by saying what I’m going to do is place an ad for a car or whatever, gradually, not because of any evil plan, but just because they’re trying to make their algorithms work as well as possible and maximize their shareholder value, and because computers are getting faster and algorithms more effective, what starts out as advertising morphs into behavior modification. It morphs into the very thing Wiener was warning about.

A second issue is that for people who participate in a system of this kind, since everything is free because it’s all being monetized through advertising, what reward can you get? Ultimately, this system creates assholes, because if being an asshole gets you attention, that’s exactly what you’re going to do. Because there’s a bias for negative emotions to work better in engagement, and because the attention economy brings out the asshole in a lot of other people, the people who want to disrupt and destroy get a lot more efficiency for their spend than the people who might be trying to build up and preserve and improve.

There used to be this sense of an arc in history in which, if there was something that seemed like an injustice in society and people worked to improve it, there might be some backlash, but gradually it would improve. Now, what happens is that the backlash is greater than the original thing, and in some ways worse. For instance, the Arab Spring, driven by social media, turned into networks of terrorists. A few women trying to improve their place in the gaming world turned into Gamergate, which, in turn, became a prototype for the alt-right. Black Lives Matter was followed by a rise of white supremacy and neo-fascism which would have been inconceivable until recently.

Now, I’m just waiting to see what happens with the #MeToo movement, because the same thing always happens with these moments that are social media-centric. That good energy becomes fuel for a system that routes it to annoy another group of people, who are introduced to each other and then get riled up, and that becomes even more powerful, because the system inherently supports the negative people more than the positive people.

My prediction, which I hate and which I’m sorry for, is that the #MeToo backlash will be far more powerful than the #MeToo movement. And that’s because the backlash from all these other movements was more powerful than the original. And I’d say that social media driven by the so-called advertising model is fundamentally incapable of doing anything positive for society as it stands.

Bernard: What do you think that #MeToo backlash would look like?

Lanier: It’s unpredictable. It will be algorithmic. As long as it’s really annoying and mean-spirited, that’s the thing that will count, because that would be the most engaging thing. We can’t predict what it will be, but it will be mean, and it might take on a surprising character, but it will happen. People don’t understand that #MeToo will inevitably lead to a negative outcome because of the way that things are configured structurally right now. I find that it takes about a year for it to cycle through the system, for the good stuff to turn into the bad stuff.

I try to draw a certain line, and it’s a difficult line to draw. I don’t want to become a judgmental, middle-aged person. If we can identify a particular process that’s doing damage and draw a circle around it and say, “This is it,” then I think we have to talk about it. I don’t think it’s possible for us to do better unless we change the incentive structure. Right now, of the big five tech companies, three of them don’t rely on that [advertising] model. Whatever you think of Apple, Amazon, and Microsoft, they’re selling goods and services primarily. In terms of big companies, it’s really Google and Facebook. It’s not even the whole tech industry; it’s really kind of narrow. I’m totally convinced that if companies like Google and Facebook can shift to a more monetized economy, then things will get better, simply because the people participating will have some incentive in addition to the attention economy, where they at least have something else to do, rather than just be assholes.

Bernard: So the model you’re presenting is that you would like to see users get paid for the data they contribute rather than have Facebook and Google give that money to advertisers?

Lanier: Yeah. The way I imagine it is that you’d pay a small fee to use Facebook. We pay for all kinds of things we like, so don’t freak out. Netflix proves that this can work. Look at what happens when people pay their Netflix bills: we suddenly have peak TV. People say, “I’ll pay for this,” and suddenly better stuff is there. I really reject this zero-sum idea where we should volunteer because there’s no way we can be better anyway. So Facebook would charge a fee. I’m sympathetic to a lot of people who say that young people or people in poverty couldn't afford it. And sure, make some accommodation for that. But in general, people would pay a small fee, and then they’d also have a chance to earn money. If someone is a super-contributor to a social network, if they’re really adding a lot of content, they should get paid for it. Like, what Google is doing now is communist central control. They’re saying that certain YouTube personalities should be paid because they like them, but not others. That’s ridiculous. It should be a market. It should be a gradual curve; it shouldn’t be some arbitrary rule where everything is free except for this designated group. It should be universal. I think it will make things better because it will give people a different game to play in addition to seeking attention.

Sometimes people come to me and say, “You don’t make any sense,” because on the one hand I’m a tech critic and I say that tech is turning us into zombies and destroying the world. But, on the other hand, I love virtual reality and I'm promoting it. But there’s no contradiction — it’s all true at once. There’s zero contradiction. We can afford to be honest. If we’re going to look at the good side of tech, it's good enough that it’s not going to kill us to also look at the bad side and be fearful of it. I don’t think there is any inconsistency in looking at the whole spectrum.

Bernard: You have an eleven-year-old daughter. Do you monitor her interactions with technology?

Lanier: I’ve had extraordinary good fortune in that I was the one that made my daughter get a smartphone. I’m in this wonderful position where the problem took care of itself. I don’t have a problem with her being too into technology. Sometimes you get lucky. There does seem to be a correlation, though. The more a parent is involved in the technology industry, the more cautious they seem to be about their kids’ interactions with it. A lot of parents in Silicon Valley purposefully seek out anti-tech environments for their kids, like Waldorf Schools. I hope we won’t have to go there.

Bernard: I’m interested in what you think the future of technology looks like. From reading your new book, I got the sense that you’re slightly anxious, but that you also have a sense of optimism about the future. What do you think is in store?

Lanier: I’m optimistic for many reasons. One reason is that it’s dysfunctional not to be. If you look at history, people have been through horrible things in the past, including very confusing things. The world has seen horrifying mass phenomena. Somehow, we seem to be able to find our way through, and I do believe in an arc of history. I believe that as technology improves, it gives us more opportunities to learn to be decent. I think in the big picture, I am optimistic.

Bernard: Do you think that there’s a problem with people becoming progressively addicted to technology or growing too reliant on it?

Lanier: It’s all in the details. Using a technology a lot is not necessarily a bad thing; people use books a lot, too. The mere use of it is not bad. When we talk about addiction, we should make it specific, and in the case of behavioral addiction, it’s really a noisy feedback loop. I do believe that these noisy feedback loops are dysfunctional, and they should not exist.

Bernard: There’s also been so many differing perspectives regarding artificial intelligence (AI). Some people, like Elon Musk, think that we should be more skeptical because it could end up controlling us, while others, like Mark Zuckerberg, seem to think it’s less insidious. Where do you fall in the spectrum of that debate?

Lanier: I have a position that is both unusual and yet entirely correct. From my perspective, there isn’t any AI. AI is just computer engineering that we do. If you take any number of different algorithms and say, “Oh, this isn’t just some program that I’m engineering to do something, this is a person, it’s a separate entity,” it’s a story you’re telling. That fantasy really attracts a lot of people. And then you call it AI. As soon as you do that, it changes the story, it’s like you’re creating life. It’s like you’re God or something. I think it makes you a worse engineer, because if you’re saying that you’re creating this being, you have to defer to that being. You have to respect it, instead of treating it as a tool that you want to make as good as possible on your terms. The actual work of AI, the math and the actuators and sensors in robots, that stuff fascinates me, and I’ve contributed to it. I’m really interested in that stuff. There’s nothing wrong with that. It’s the mythology that’s creepy.

Tweedie: In your book, you describe AI as wrapping paper that we apply to the things we build.

Lanier: Yeah, you could say that. AI is a fantasy that you apply to things. The issue with AI is that we’re giving these artifacts we build so much respect that we’re not taking responsibility for them and designing them as well as possible.

The origin of this idea is with Alan Turing, and understanding Turing’s life is important to understanding that idea about AI because he came up with this notion of AI and the Turing test in the final weeks of his life, just before he killed himself while he was undergoing torture for his sexual identity. I don’t want to presume to know what was going on in Turing’s head, but it seems to me that if there’s this person who is being forced by the state to take these hormones that are essentially a form of torture, he’s probably already contemplating suicide or knows that he’ll commit suicide. And then he publishes this thing about how maybe computers and people are the same and puts it in the form of this Victorian parlor game. You look at it, and it's a psycho-sexual drama, it's a statement, a plea for help, a form of escape or a dream of a world where sexuality doesn’t matter so much, where you can just be.

There are many ways to interpret it, but it’s clearly not just a straightforward, technical statement. For Turing, my sense is that his theory was a form of anguish. For other people, maybe it’s more like religion. If you change the words, you have the Catholic church again. The singularity is the rapture, you’re supposed to be a true believer, and if you’re not, you’re going to miss the boat and so on.

I think our responsibility as engineers is to engineer as well as possible, and to engineer as well as possible, you have to treat the thing you’re engineering as a product. You can’t respect it in a deified way. It goes in the reverse. We’ve been talking about the behaviorist approach to people, and manipulating people with addictive loops as we currently do with online systems. In this case, you’re treating people as objects. It’s the flipside of treating machines as people, as AI does. They go together. Both of them are mistakes.

Jaron Lanier's latest book, 'Dawn of the New Everything,' is on sale now.
