Bill Gates: Why Wouldn’t You Be Concerned About AI?

Nick Bostrom

The Swedish philosopher is the director of the Future of Humanity Institute at the University of Oxford, where he has spent much of his career thinking about the potential outcomes of the singularity. In his new book Superintelligence, Bostrom argues that once machines surpass human intellect, they could mobilize and decide to eradicate humans extremely quickly using any number of strategies (deploying unseen pathogens, recruiting humans to their side, or using simple brute force). The world of the future would become ever more technologically advanced and complex, but we wouldn’t be around to see it. “A society of economic miracles and technological awesomeness, with nobody there to benefit,” he writes. “A Disneyland without children.”
