Soon, Drones May Be Able to Make Lethal Decisions on Their Own (nationaljournal.com)
33 points by 1337biz on Oct 8, 2013 | 46 comments



This isn't surprising. I think the future of warfare will be massive armies of drones controlled by algorithms. If we can remove the pilot from the equation, we can drastically increase the size of our military. That's not to say that algorithmic warfare will be good. I'm merely saying it is inevitable.


Can't we just do that in a computer and pretend that things were destroyed and killed? Maybe calculate the transfer of wealth that would have occurred as a consequence of inferior algorithms leading to the destruction of your virtual societies?


Then you'd need a real army to enforce the victory.


I'm sure the financial sector is ready to take on such a challenge.


This only works when both sides are sufficiently advanced and willing to take part in the system. You would still need an army for when enemyCountryX decides that they can just pummel you because you're busy playing video games.


“We can not solve our problems with the same level of thinking that created them”

― Albert Einstein

i.e., we are peaceful, they are not, so we must not rest lest they destroy us while we sleep, and we must destroy them while they do.


Wasn't this already answered in an episode of Star Trek from the 1960s?


That was the basis of these thoughts, except that in that episode they would calculate the casualties and vaporize them; my idea is based on the pure transfer of 'wealth' that would have occurred through the destruction and re-allocation of 'resources' from the 'loser' to the 'winner'. No actual death need occur directly from violence, only through purely economic and financial means (job loss, starvation), similar to what has already happened in developing countries after the last 'Financial Crisis'.



Why would there be anything past the first "battle"? Terrorists fill drones with dirty radioactive dust, launch them from the sea into the US, and slowly release the payload.

Game over: half the population on the east or west coast becomes extremely ill or develops cancer, and food and water sources are permanently tainted.

Instead of making drones we should be making drone detection and capture systems.


"radioactive dust" does not work that way. You should run some numbers and see if your movie-plot threat is plausible before proposing it as a risk. (hint: it's not)


As a psychological weapon it's extremely effective; sure the actual dose of radiation would be less than you'd be exposed to flying cross-country.

But the fear?

That would be very effective. A terror weapon in multiple senses of the word.


US soldiers and others near battlefields overseas are constantly getting ill from dust from the (badly named) depleted uranium ordinance that the US uses (and white phosphorus weapons).

That's what leads me to believe this is plausible.


> US soldiers and others near battlefields overseas are constantly getting ill from dust from the (badly named) depleted uranium ordinance that the US uses (and white phosphorus weapons).

No, they aren't. What's your source claiming this? Why do you say depleted uranium is "badly named"? And if you took some of this dust and spread it on "the east coast" or "the west coast", wouldn't that dilute the concentration by more than a factor of a million and turn an already speculative and unproven risk into a truly truly insignificant risk?

https://en.wikipedia.org/wiki/Depleted_uranium#Studies_indic...


Did you purposely link to heritage or was that an accidental result of google?

Because that is pretty much a showstopper as they are wild right-wing nuts.

Okay, now you are using Wikipedia. Read the rest of the page and realize that only the US, UK, and Israel say DU is okay; the rest of the UN says it is not. I wonder who has DU ammunition.

https://en.wikipedia.org/wiki/Depleted_uranium#Chemical_toxi...


Australia, too, and the IAEA. I think you are in too much of a hurry to make your point.


*ordnance

It's a shame US tank designers also use DU tank armor to protect soldiers. They're actually killing them with the radiation!


You seem to be missing my point.

Should those drone detection and capture systems be controlled by humans or algorithms? Just because you remove the word "drone" doesn't make algorithmic warfare any less inevitable.



You have stumbled across an idea older than nuclear weapons. Heinlein's "Solution Unsatisfactory" was written with a very similar solution in mind ( http://en.wikipedia.org/wiki/Solution_Unsatisfactory ).

That said, that is perhaps the silliest way of using drones if one meant business.


Landmines already make lethal decisions on their own. They just use far simpler heuristics.

There's not going to be some big red switch that is flipped and suddenly the machines decide what to shoot at. Instead there will just be a gradual extension of fire control mechanisms.


> Landmines already make lethal decisions on their own. They just use far simpler heuristics.

And they have an appalling track record for making the wrong decision, because those simple heuristics also tend to produce answers like "Kill this five-year-old child because they're playing in a field where a war took place twenty years ago".

If you're trying to make an argument for why automated killing machines are a bad idea, landmines are just about the perfect example.


"On Their Own" isn't correct; there is still legal liability, even if it's distant. Someone has to OK the algo, someone has to fund it, people have to start the process off... Heat-seeking missiles aren't new, and they are also automated yet lethal...


Lethal Autonomous Robots are inevitable. They think faster than humans, are much more tactically coordinated and all of them can be updated with new instructions and bug fixes. Just ask a marine captain how difficult it is to bug fix and update his men and you can see the value.

The best way to use drones is to think of them as the appendages or hands of one robotic brain instead of individual units. The beauty of networked robots is that they are ultimately coordinated, one knows what all the others know at all times. This is the absolute dream of military commanders. They would adopt the idea much more readily if it didn't put them out of a job. The only reason we can hang on to the nostalgia of men at war is that the US hasn't faced a serious military threat in a very long time. Eventually this will no longer be the case, and drone-on-drone combat will be the only reasonable option.

One objection that people have is that drones will eliminate the human cost of war and make destructive combat more common. I say that frequent combat is just fine as long as humans aren't involved. In the future opposing sides will resolve conflicts by destroying a few hundred million dollars of equipment instead of destroying human lives. I'm hoping that total war will be phased out as machines become powerful enough to dominate the battlefield.


> Lethal Autonomous Robots are inevitable. They think faster than humans

On the contrary. They follow orders faster than humans. They don't think at all, in the sense that a human can with their adaptability and ethical awareness and ability to make a judgement call that a situation is not something the predetermined rules were designed to handle.

This is the main reason I don't see LARs becoming widely used. They are superior to human warriors in the kind of war that is decided by split-second reactions, when you know exactly who the enemy is and you're otherwise roughly evenly matched. Very few modern wars are actually like that.

> The beauty of networked robots is that they are ultimately coordinated, one knows what all the others know at all times. This is the absolute dream of military commanders.

But that dream becomes a nightmare if the system is ever compromised. Of course, no battlefield-wide, 100% computer-driven, 100% remote-controlled system would ever be deployed with a security vulnerability, for the same reasons that no military equipment ever fails under combat conditions, no military unit in the field ever finds itself unable to communicate with its commanders, no computer virus has ever found its way into a secure installation, no software bug has ever caused military equipment to fail, and no human involved in the military or intelligence communities or their suppliers has ever been disloyal.

> One objection that people have is that drones will eliminate the human cost of war and make destructive combat more common.

What is the point of having these machines, if they will not ultimately result in a human cost? Weapons of war are made for one purpose, and one purpose only: to be able to kill people. Any more desirable outcomes, like taking control of a hostile country without actually killing everyone, tend to be predicated on the threat that you can do so if you need to.


What if a drone could drop something like a small cluster-bomb munition, but instead of bomblets, the payload was a bunch of smaller quadcopter drones equipped with TASERs and programmed to immobilize anyone in the area? This wouldn't be free of fatalities, but it could involve far fewer of them.


And then what? You'd have to have a commando team in these tribal areas close enough to catch the incapacitated victims before they can stand up again.


I was thinking in the context of domestic use along the border or for use with drug interdiction.

Also, I can think of uses for temporarily incapacitating people. The US government sends in an airstrike along a corridor covered by airbases. However, when the opposing air forces try to scramble their jets, their pilots are incapacitated and harassed by quadcopter drones deployed from a container dropped earlier by a stealthed drone. I think there would be a lot of uses for such capability for special forces units.


A cluster bomb delivered in that same container would be cheaper, more effective, and less likely to fail.

In military applications, it is almost always easier to permanently incapacitate people. Tech for temporary incapacitation is needed only in those very rare cases where we really, really need someone alive; where the capture is so vital that it's worth risking, and often spending, our troops' lives to achieve it.


> A cluster bomb delivered in that same container would be cheaper, more effective and with less risk to fail.

Yes. This scheme is bending over backwards to try not to kill people. Just killing them would be much simpler.


Or you actually blow up their jets during the airstrike, so that they can't use them ever.

Seriously, "harass" the pilots? A gentler, more playful war I suppose.


You wouldn't feel playful about being harassed by Tasers. But yes, this is bending over backwards to avoid killing people. It would be easier to just bomb them.


It'd be more reliable just to blow the pilots' aircraft up before they got off the ground (or the runway, for that matter). You can disable a fighter with a couple of bullets from a handgun easily enough; they're fragile in that way.


Tiny drones with tasers wouldn't be much of a match for a bunch of guys with automatic rifles - a rifle has a much greater range and a single burst of fire would destroy the drone. Also, it's hard to taser someone who's wearing body armor or is in a vehicle. This kind of weapon would only work on unarmed civilians.


You'd have to be one hell of a shot to hit a small, fast moving quadcopter. A shotgun would be a much better bet than a rifle.

Not that I think the GP's idea is remotely plausible...


I always thought about packing these up to be air-dropped by cargo aircraft into live fire areas:

http://en.wikipedia.org/wiki/Phalanx_CIWS

I believe it has the accuracy necessary against drones, people, damn near anything that moves actually.


Quadcopters have terrible speed and maneuverability against free-falling bomblets, and would require a lot of expensive GNC and precision tracking in each (disposable) unit. Basically you have a lighter version of the "hit a bullet with a bullet" problem, as in ABM.

Something with lasers might be feasible though.


Good TED talk for why that shouldn't happen:

http://www.youtube.com/watch?v=pMYYx_im5QI


I for one look forward to Judgement Day...


So, what you're saying is "Eagle Eye" might come true soon then?


whadya think of the redesigned site?


I think if you want to maintain the image of a respected journal you should probably disable commenting on the site.


It would be nice to see it but I have a full page advertisement in my way of it. /snark


I just got an ad that blocked the whole site.

Shit like this is the reason your site will almost never get bookmarked. I definitely have no desire to come back to it.


Sorry, but machines do not make decisions, and never will. Humans that create machines make decisions.


    #include <stdio.h>
    #include <stdlib.h>
    
    void push_the_button(void) { puts("launch"); }  /* stub */
    
    /* "decides" by spinning until a one-in-a-hundred-million draw */
    void decide(void) {
        while (((float)rand() / RAND_MAX) > 0.00000001) ;
        push_the_button();
    }
    
    int main(void) { decide(); }
Relevant: http://www.vagrearg.org/content/edm



