Vedant Misra



Distractions

Nick Bostrom is a philosopher at Oxford. He wrote Superintelligence, a book about how things could go wrong on the path to, and beyond, human-level AI.

One core idea in the book, and in AI safety research more broadly, is that it will be difficult to teach human preferences to an AI. We underestimate the complexity of our preferences because we only ever communicate about them with other humans, all of whom share basic human preferences with us.

When we need to teach a superintelligent system what humans want, we'll find that it's hard.

Imagine we've developed superintelligence, and we need to give it a goal. It seems safe enough to have it maximize happiness for all humans.

But there are many ways that can go wrong. One example is a Matrix-style outcome: the AI could render us all unconscious and embed our brains in a bliss simulator. Billions of humans fed through tubes, trapped in a simulated reality, capable of experiencing literally anything they want. Objectively, this achieves the stated goal, but it very much isn't what we wanted. This is referred to as the problem of perverse instantiation.

The thing we find so repulsive about the brains-in-a-vat scenario is that we know none of it is real. But why is it so repulsive when it isn't all that different from the world we're voluntarily creating for ourselves?

First, recognize that there seems to be no limit to the human appetite for novel distractions. Take a trip to GameStop, or pull up YouTube. Millions of humans spend thousands of hours, day after day, generating new ways for people to spend their attention. Every month, enough new games, films, and shows are released to keep you engrossed for the rest of your life. Overheard at GameStop recently:

"Have you seen this new PlayStation exclusive? This game looks great... I really want a PS4 now. I wish I were a millionaire, I'd buy all of these games."

Not only is there limitless content for us to consume, there's also limitless content about other content---gameplay videos, movie reviews, people talking about and parodying TV shows. Not to mention books, articles, articles about books, books about articles... it never ends.

Of course, to a degree, this is wholesome. The internet makes it possible for each of us to project our thoughts to billions of people. But when it comes to content consumption, for many people, it's hard to draw a line.

If you can regulate yourself well enough, you'll be all right. Maybe you restrict yourself to one TV binge per week. The problem is, it's already extremely hard to regulate yourself, and it will be much, much harder in the future. Not only is content getting better---there is more and more of exactly what you want to watch/read/listen to---but we're also getting more free time.

Our insatiable desire for novel methods of escaping reality is colliding with our ability to generate more content that better matches what people want.

For now, the proliferation of hyperspecific content is driven by the sheer number of people who can now easily generate and distribute their thoughts, thanks to the growth in bandwidth, the falling cost of technology, and the explosion in internet access.

But this gets a bit scarier when you consider that we're starting to build machines that can generate quality content. Check out Google's Project Magenta, for example. This is just the beginning.

Machines will grant us a massive amount of free time, as well as ways to spend that time. We're rapidly approaching a world in which we can have algorithms generate almost any creative content, so we can fritter away our lives doing both (a) exactly what we want and (b) nothing at all of consequence.

Did you enjoy the Harry Potter series? Have the AI procedurally generate a version of it set in Rio de Janeiro. And, hmm, let's have Voldemort win. And throw in some Game of Thrones plot elements.

Do you enjoy listening to EDM? Have the AI generate a hundred different two-hour mixes, each better than anything the best trance DJ could produce. That'll keep you going for a while at the gym.

All of the content we consume is just stuff that humans made to occupy the attention of other humans. Attention is the very currency of life, and yet it's remarkably normal that we allocate much of it to seeking out distractions.

Somehow we're okay with binge-watching television. Maybe that's because when you binge-watch a show, the fact that it was written by people, directed by people, and acted in by other people imbues the show with meaning. It took a large group of people a lot of time and energy to produce that show.

But does content remain meaningful if we can get machines to generate all of it? I'd venture to say the answer is no.

Blissfully and distractedly, we're inching toward a slippery slope. The space of possible distractions is infinite, and it's becoming easier and easier to fritter away our lives chasing distraction after distraction. Content is opium, and we're all junkies.

Remember the guy I overheard at GameStop?

"I wish I were a millionaire, I'd get all of these games."

I imagined turning to him and saying, "You know, the irony is, the way to become a millionaire is to stop playing games." Instead, we both paused to look around at the other junkies in the store, trading cash for distraction. He laughed, I laughed, we paid for our video games, and went on our way.