About Artificial Intelligence…

Sept. 8, 2015

I spent part of my Labor Day weekend watching the movie Terminator: Salvation. It’s the fourth Terminator movie, and in this one, the war with the machines has already been raging for years. If you don’t already know, the films are based on the premise that a man-made artificial intelligence becomes self-aware and determines that mankind is not fit to live.

As telematics, machine control, and other automated technologies continue to evolve, they do so alongside the growing complexity of artificial intelligence (AI) and big data. I wrote about this in part in my Editor’s Comments for the July/August 2015 issue of Grading and Excavation Contractor magazine.

I wrote about my concerns over AI, and the concerns of others:

Now we need to ask, how far will the innovation go? What will be the end product?

If the answer is artificial intelligence, a few of the smartest people on the planet are worried about AI. To name three: Bill Gates, Elon Musk, and Stephen Hawking. Their collective concern is simply that humans, in a worst-case scenario, will not be able to control intelligent machines, and that this will lead to the enslavement or extermination of the human race. They fear that these machines will master AI research and development and then reprogram themselves to become exponentially smarter than humans.

But for a moment, let’s set aside the fact that the same AI techniques used to make battlefield drones are being used to develop autonomous cars and autonomous dirt-moving equipment. In the next decade, trillions of dollars will be spent developing these intelligent products. The problem is that very little, it seems, is being spent on the safety and ethics of autonomous machines. The machines will want to protect themselves and will seek out the resources needed to do so. They won’t want to be turned off, and they’ll fight to survive.


Today I would like to shine a light of hope on this dark artificial intelligence scenario.

Evernote is an app-building company whose products reach more than 100 million users across the globe. Its founder, Phil Libin, disagrees with the doomsday AI outcome.

In a podcast interview, Libin said this about what makes humans so afraid of intelligent machines:

I’m not afraid of AI. I really think the AI debate is kind of overdramatized. To be honest with you, I kind of find it weird. And I find it weird for several reasons, including this one: there’s this hypothesis that we are going to build super-intelligent machines, and then they are going to get exponentially smarter and smarter, and so they will be much smarter than us, and these super-smart machines are going to make the logical decision that the best thing to do is to kill us.

I feel like there are a couple of steps missing in that chain of events. I don’t understand why the obviously smart thing to do would be to kill all the humans. The smarter I get, the less I want to kill all the humans! Why wouldn’t these really smart machines want to be helpful? What is it about our guilt as a species that makes us think the smart thing to do would be to kill all the humans? I think that actually says more about what we feel guilty about than about what’s actually going to happen.

If we really think a smart decision would be to wipe out humanity, then maybe, instead of trying to prevent AI, it would be more useful to think about what we’re so guilty about and fix that. Can we maybe get to a point where we feel proud of our species, and where the smart thing to do wouldn’t be to wipe it out?

I think there are a lot of important issues that are being sublimated into the AI/kill-all-humans discussion that are probably worth pulling apart and tackling independently…I think AI is going to be one of the greatest forces for good the universe has ever seen and it’s pretty exciting we’re making progress towards it.

Maybe the movies’ tagline is right: “The future is not set. There is no fate but what we make for ourselves.”