US colonel backtracks on claim that AI drone killed human operator in simulation

  • The officer had described a scenario in testing where a rogue AI had taken out its minder because the person was stopping it from accomplishing the mission
  • He later admitted that the story was a thought experiment that came from outside the military and never really happened

A US Air Force MQ-9 Reaper drone. File photo: AFP

Killer AI is on the minds of US Air Force leaders.

In a presentation at a professional conference, an Air Force colonel who oversees AI testing described a simulation in which a military AI went rogue and killed its human operator. He now says the scenario was hypothetical.

But after reports of the talk emerged on Thursday, the colonel said that he misspoke and that the “simulation” he described was a “thought experiment” that never happened.

Speaking at a conference last week in London, Colonel Tucker “Cinco” Hamilton, head of the US Air Force’s AI Test and Operations, warned that AI-enabled technology can behave in unpredictable and dangerous ways, according to a summary posted by the Royal Aeronautical Society, which hosted the summit.

Video: What if robots took over the world? One ‘imagines’ nightmare scenario

As an example, he described a simulation in which an AI-enabled drone was programmed to identify an enemy’s surface-to-air missiles (SAM). A human operator was then supposed to sign off on any strikes.
