Other nations - and unquestionably non-state actors - will not have such qualms.
I don't think I have it quite right. I don't think I have been either cynical or dark enough in my thoughts.
You cannot classify open source technology. With 3-D printing and the universal ability to write code, once a capability is out there - especially one that does not require hard-to-get physical material - you cannot pull it back.
Civilized people cannot force others to be civilized.
The danger is not flights of AI-driven fighter-bombers, submarines, or surface ships fighting manned versions of the same - though we will see that. No, I believe the threat will be much more personal. Much smaller. Much deadlier. Much closer to the homefront.
No, I think I have it wrong. I think the odds are 50/50 that, in my lifetime, the scenarios outlined below will be close to a fact of life.
This is nothing. In a few years, that bot will move so fast you’ll need a strobe light to see it. Sweet dreams… https://t.co/0MYNixQXMw
— Elon Musk (@elonmusk) November 26, 2017
UC Berkeley professor Stuart Russell and the Future of Life Institute have created an eerie viral video titled "Slaughterbots" that depicts a future in which humans develop small, hand-sized drones that are programmed to identify and eliminate designated targets.

Watch this in full while wearing your red hat ... and then think a bit deeper.
...
Russell, an expert on artificial intelligence, appears at the end of the video and warns against humanity's development of autonomous weapons.
"This short film is just more than speculation," Russell says. "It shows the results of integrating and militarizing technologies that we already have."
As I started this post with Elon Musk, I might as well end with this interview: we're summoning the demon.