Posted on: May 31, 2021 Posted by: Betty Lee


The revelation raises concerns over Terminator-style AI weapons which could kill people in battle without any human control. The drone was deployed in March last year during the conflict between Libyan government forces and a breakaway military faction led by Khalifa Haftar, commander of the Libyan National Army.

The report on the incident from the UN Security Council’s Panel of Experts on Libya was obtained by New Scientist magazine.

The drone was a Kargu-2 quadcopter built by Turkish military tech firm STM.

The weapon carries an explosive charge, can be aimed at a target and detonates on impact.

The report, published earlier this year, described how Haftar’s forces were “hunted down and remotely engaged” by the drones, which were operating in a “highly effective” autonomous mode that required no human controller.

Writing in The Bulletin of the Atomic Scientists, Zachary Kallenborn said: “Current machine learning-based systems cannot effectively distinguish a farmer from a soldier.

“Farmers might hold a rifle to defend their land, while soldiers might use a rake to knock over a gun turret. … Even adequate classification of a vehicle is difficult.”

Mr Kallenborn explained that, without a human to make a judgement call, the risks are too high.

He added: “Any given autonomous weapon has some chance of messing up, but those errors could have a wide range of consequences.

“The highest-risk autonomous weapons are those that have a high probability of error and kill a lot of people when they do.

“Misfiring a .357 Magnum is one thing; accidentally detonating a W88 nuclear warhead is something else.”




