Autonomous Drones Operating Independent Of Human Control Cause “Significant Casualties” In Libya

Live Science

At least one autonomous drone operated by artificial intelligence (AI) may have killed people for the first time last year in Libya, without any humans consulted prior to the attack, according to a U.N. report.

According to a March report from the U.N. Panel of Experts on Libya, lethal autonomous aircraft may have “hunted down and remotely engaged” soldiers and convoys fighting for Libyan general Khalifa Haftar. It’s not clear who exactly deployed these killer robots, though remnants of one such machine found in Libya came from the Kargu-2 drone, which is made by Turkish military contractor STM.

18 Comments on Autonomous Drones Operating Independent Of Human Control Cause “Significant Casualties” In Libya

  1. FFS… Who is the stupid asshole psychopath who green-lighted this insanity? Never trust human life 100% to a goddamn piece of software. That is flat out fucking nuts.

    12
  2. MJA – That’s bad. If you’re thinking like me, I’d say you should see someone about that, except the shrinks are crazier than everyone ELSE at this point.

    …Also, as I’ve said many times, I work extensively with robotics, robot programming, I/O devices that allow the robots to “see” the world, and have a fleet of autonomous cargo handling vehicles that I manage in addition to several robot arms, gantries, and delta pickers.

    So I can say categorically and with extensive experience that the people who thought this was a GOOD idea are absolutely batshit crazy.

    I can’t say this enough…ROBOTS DO NOT THINK. Robots are PROGRAMMED. Even so-called “AI” robots are working FROM A PROGRAM THAT CAME FROM A HUMAN MIND. And programming is a one-man birthday party: you don’t get any presents you don’t bring.

    Think of your average programmer. Is THAT guy likely to know anything about battlefields and troops and recognition and movements and such, to the point where he can anticipate EVERY variable, EVERY changed condition, EVERY human unpredictability, EVERY weather condition, and EVERY choice that has to be made, AND transfer that knowledge fully and completely to a robot without possibility of error?

    HELL TO THE NO.

    …and that assumes the robot will PROCESS it without error, which – trust me – will NOT be the case. Inputs can be fooled, outputs can stick, camera systems can be thrown off by lighting issues, and laser systems can get wigged out by rain, among a MILLION other things, and that’s on a NEW system. Put a few miles and indifferent maintenance on it, and it is NOT going to work. I’ve had photo eyes wigged out by applesauce splashes; it’s not hard for it to happen, and that’s in a CONTROLLED environment.

    A battlefield is NOT a very controlled environment.

    …robots are good for repetitive tasks, for heavy lifting, for things where they can be given a set of conditions that are more or less predictable and detectable; THAT they can do. They DO NOT see the world as we do, no matter HOW elaborate the camera, heat, or laser height-detection system, and they are NOT good at making decisions AT ALL; they only run the logic they are given when conflicted, and the conflict resolution ALSO must be programmed.
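    To make that last point concrete, here is a minimal sketch (all names and rules invented for illustration) of what “the conflict resolution must be programmed” means: when two sensors disagree, the robot does not weigh anything; it executes whatever tie-break branch a human wrote ahead of time, and any combination the programmer never anticipated falls through to a default.

    ```python
    # Hypothetical sketch: a robot doesn't "decide" -- it runs whatever
    # tie-break rule a human typed in before deployment.

    def resolve_detection(camera_sees_person: bool, lidar_sees_obstacle: bool) -> str:
        """Return an action when two sensor inputs may disagree.

        None of the branches below are judgment; each is a rule someone
        had to anticipate and hard-code. A case nobody thought of falls
        through to the final default.
        """
        if camera_sees_person and lidar_sees_obstacle:
            return "stop"      # sensors agree: obey the cautious rule
        if camera_sees_person and not lidar_sees_obstacle:
            # Conflict: maybe glare fooled the camera, maybe rain blinded
            # the lidar. The robot cannot know which -- it just runs the
            # branch the programmer chose for this exact combination.
            return "stop"      # hard-coded "when in doubt, stop"
        if lidar_sees_obstacle:
            return "slow"      # obstacle detected, no person classified
        return "proceed"       # nothing detected (or nothing sensed at all)

    print(resolve_detection(True, False))  # conflicting inputs -> "stop"
    ```

    Swap the hard-coded `"stop"` in the conflict branch for `"engage"` and you have the whole argument in four lines: the machine’s behavior in the ambiguous case is exactly as good as the programmer’s foresight, no better.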

    And in THIS particular case, it was likely programmed by a Muslim for jihad purposes. I’m guessing that “ethics” and “safety” wasn’t a BIG part of the decision making here.

    …this could take hours to discuss though, and no one read THIS far, but on the off chance any of our Star Trek fans did, just remember that Captain Kirk covered this too, with NOMAD and with The Doomsday Machine, and he didn’t have an easy time of it, either, ESPECIALLY when the M-5 computer took over the Enterprise and attacked the rest of Starfleet in “The Ultimate Computer”…

    https://www.youtube.com/watch?v=cXRsdNQEkAo

    https://www.youtube.com/watch?v=lS7cK-mQ0fQ

    https://www.youtube.com/watch?v=p3aIkX_fPUQ

    …and things like “War Games” point out that computers do not know the DIFFERENCE between simulation and reality, but this is WAYYY too long, so I’ll stop…

    3
  3. For Biden it’s “DI” = Diminishing Intelligence.
    He doesn’t have the ability to see the world as it is, particularly if there is a 9 year old girl in the room.

    2
  4. 30 years ago, in another life, I worked on and with remotely operated weapon systems. They were capable of independent operation back then. I can only imagine the capability of currently fielded weapon systems today…

    1
  5. It’s all in one’s perspective; put yourself where most have never been.
    I’m sitting in a foxhole, we have claymore mines set up, antipersonnel mines buried, radio in my hand to call in artillery and two platoons ready to fire before the regiment sized enemy element breaches our concertina razor wire perimeter.
    Do I want autonomous drones to indiscriminately shoot anyone outside my perimeter? Well, hell yeah.
    I wish we had the technology 51 years ago, I would have used them without a second thought.
    I’m damn sure those who served and survived in Korea with thousands of Chinese blowing bugles and charging their thinly held line would have used the drones and saved thousands of US and NATO warriors. Same for the WWII vets who fought in Europe and the Pacific, Iraq, Afghanistan and every other conflict.
    It’s all about where you are, whether you would use drones or not.
    Realistically, does it matter how you kill your enemy who is hell bent on killing you? Never mattered to me.

    1
  6. People recognition and a GPS defined kill box. Doesn’t seem like a tall order for the designers.

    Make sure you tell your friends the boundaries of the box.
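    For what it’s worth, the “GPS defined kill box” half of that really is a trivial geometry test; here is a hypothetical sketch (coordinates and names invented) of a rectangular box in latitude/longitude. The hard part the comment glosses over is everything else: recognition, fail-safes, and keeping friendlies out of the box.

    ```python
    # Hypothetical sketch of a "GPS kill box": an axis-aligned rectangle
    # in latitude/longitude. A position fix is "inside" if it falls
    # between both pairs of bounds.

    from dataclasses import dataclass

    @dataclass
    class KillBox:
        lat_min: float
        lat_max: float
        lon_min: float
        lon_max: float

        def contains(self, lat: float, lon: float) -> bool:
            """True if a GPS fix falls inside the box boundaries."""
            return (self.lat_min <= lat <= self.lat_max
                    and self.lon_min <= lon <= self.lon_max)

    # Invented coordinates, for illustration only.
    box = KillBox(lat_min=32.00, lat_max=32.05, lon_min=13.10, lon_max=13.20)
    print(box.contains(32.02, 13.15))  # inside  -> True
    print(box.contains(32.10, 13.15))  # outside -> False
    ```

    Note that this sketch trusts the GPS fix completely; a spoofed or degraded fix moves the box in the real world, which is exactly the failure mode the earlier comments worry about.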

    1

Comments are closed.