AI is unstoppable. And this is why.
- Patrick Trancu
- Jun 10, 2024
- 2 min read
This is why AI is unstoppable. Whether for better or worse, history will tell. But as Carl Jung stated:
"The only real danger that exists is man himself. He is the great danger and we are pitifully unaware of it. (...) We are the origin of all coming evil".
This is not a good start to the story.
«Imagine a drone equipped with enough sensors and intelligence to identify a person by the sound of their voice. And then imagine that drone attacks»
writes Mark Bergen in today's Bloomberg News Tech Diary.
You might think the tech journalist is writing about “Slaughterbots”, a 7-minute video by Stewart Sugg released in November 2017 (just over six years ago, by the way). The video opens with a Silicon Valley CEO type delivering a product presentation to a live audience à la Steve Jobs. Swap Steve for Jensen Huang. The presenter seems to be unveiling some new drone technology, but the pitch takes a dark turn when he demonstrates how these autonomous drones can slaughter humans like cattle by delivering “a shaped explosive” to the skull. All these drone bots need is a profile: age, sex, fitness, uniform, and ethnicity. Nuclear weapons are obsolete. Take out your entire enemy virtually risk-free. Just characterize him, release the swarm, and rest easy.
But he is not. He is writing about the present. This scenario was laid out by Alex Bornyakov, Ukraine’s deputy tech minister, last week at a NATO event in Poland, «detailing how a military drone could take out a Russian “war criminal” with a targeted assassination.
It was an unsettling advancement to weaponized drones», writes the tech journalist, «which the deputy minister added was only in the “prototyping” phase inside Ukraine. But much of the artificial intelligence needed for it exists now. “Computer vision works,” he said. “It’s already proven.”
The idea would take advantage of one of Ukraine's warfare innovations. The country has installed thousands of mobile phones, on cell towers and gas stations, to act as its digital ears. Data from these sensors are paired with a neural network to create artificial intelligence tools that Bornyakov said can track enemy drones or hear when Russia fires off rockets.
However, giving computers potential control over lethal decisions, like the system Bornyakov described, is controversial among Ukraine's allies. He was speaking at a North Atlantic Treaty Organization forum in Krakow, organized to announce a new partnership with Ukraine and to showcase the nation’s rapid deployment of wartime drones, software, and other equipment.»
And in case you missed it, the Chinese military last week unveiled robot dogs armed with rifles (see the first comment below).
These developments, along with many others in the military domain, clearly illustrate why it is unrealistic to expect effective AI regulation.
As someone recently put it, "the genie is out of the bottle", and into weapons, we could add. And Carl Jung's words return to haunt us.