39 thoughts on “Autonomous Vehicles And Drones”

  1. As we look to the future of long guns and handguns, one of the most existential questions is how to prevent them from being used for harm. What is to stop a pistol from being loaded with bullets and pointed at its target, or a rifle from being loaded with bullets and used to target an individual with pinpoint precision? Terrorist groups are already making increasing use of modified semi-automatic hunting rifles as weapons, while militaries are increasingly eyeing “sport” AR-15s as inexpensive weapons systems ideally suited for close-quarters navigation of urban environments. What can manufacturers do to curb such repurposing of their products?

    1. You miss the point, though. No one disputes that guns are weapons. This discussion is about how to prevent, or at least minimize, repurposing transportation vehicles as weapons. We know that they already are, but doing so generally requires a driver who can be arrested or killed. What do we do when that is no longer necessary?

      1. Yeah. The point isn’t the existence of blasters, it’s the army of untrackable Cylons armed with the blasters.

    2. No modern military is buying AR-15s, Bob-1.

      You can safely disabuse yourself of that piece of nonsense. Militaries buy M-4 variants. If you need me to explain the difference, I easily can.

  2. We can’t prevent it, any more than we can prevent some idiot from mowing down people on a sidewalk with a car. We’ll deal with this advance in technology the same way we deal with weapons use now. Any weapon with beyond-line-of-sight range will probably leave some kind of electronic footprint, or the criminal will need to brag about their brilliance. Good police work will find most of them. There are really not that many criminals.

        1. My point is that measures to minimize such usage with guns are shot down, if you’ll pardon the expression, due to concerns about freedom, and measures to prevent such usage with drones seem to be equally anti-freedom.

          1. Guns generally don’t shoot people on their own.

            Drones can.

            Now, it’s pretty obvious to me that the only way to prevent people from abusing drones is to ban them. So if we’re not going to call an end to technological progress, we’d better get used to people abusing them.

            This is just another reason why the future is not big cities, but small communities of people who know and trust each other.

          2. Rand Simberg said “There is no constitutional amendment prohibiting individual rights against… ”

            Indeed.

          3. They are also routinely shot down because the suggestions are stupid anent the problem allegedly being addressed. Pretty much all the usual leftist nostrums have already been demonstrated to be worthless including background checks, “assault weapons” bans, magazine size restrictions and on and on. We had a federal “assault weapons ban” in place for an entire decade. It had zero effect on gun crime in general and mass shootings in particular.

            When the Left keeps coming back, time after time, with the same old crapola, I think we’re all allowed to entertain the notion that maybe safety isn’t what the Left is about so much as power and depriving political opponents of an historically effective last-ditch means of resisting its illegitimate exercise.

            The only repeatedly demonstrated way of limiting mass shootings is liberalized gun carry – open or concealed. On occasions when an armed citizen is present, body counts are much lower. It’s no accident that virtually all mass shootings take place in loci that are explicitly posted as gun-free zones. Remove illegitimate restrictions on individual self-defense and mass shootings will end when massive body counts are no longer routinely possible.

          4. Bob-1 makes some excellent points, and his link to the ACLU site regarding bans on photographing “critical infrastructure” with drones adds another layer to it.

            As for drones, any novelty associated with using them to photograph anything has long been overtaken by events. Check out the warning signs on roads leading to, say, the Verrazzano-Narrows Bridge, prohibiting photography of that marvel. It is an attempt to deter terrorists (i.e. white supremacists, and, uh, no one else who comes readily to mind…). Adding drone photography restrictions is a nit. In other words, we’ve already internalized Soviet-style suppression of “spying,” so what’s the big deal? (Hint: The big deal is that no one needs permission to photograph a bridge, get it?)

            On the other hand, the point about drones and the Second Amendment is well made. I, for one, would want to see no restrictions put on civilian weaponizing of drones. I’m not kidding about this, either. The Second Amendment is there primarily to allow American citizens to protect themselves from their government. The National Firearms Act of 1934 and the Gun Control Act of 1968 effectively ended the right of citizens to keep and bear automatic weapons, thus putting us at a decided disadvantage against our government. That was the sole intent, if anyone has any illusions to the contrary.

            Weaponizing “drones” (i.e. UAVs) is a very, very cheap way of evening the odds. And it’s nothing new. RC airplanes have had this ability for 50 years or more. The most recent developments have been in the areas of propulsion (electric, which makes them stealthier), remote control (radio and TV via digital links), and fully autonomous operation. In the latter category, the genie has been out of the bottle for some time. NASA was able to put the entire world geodetic database into a smartphone and use it to control a small autonomous drone out in the Mojave desert. It was able to navigate into box canyons and change altitude to avoid collisions. With the phone’s GPS and readily available MEMS INS capability, anyone can build a super-accurate autonomous cruise… well, I hesitate to call it a “missile,” because it can do so much more.

            Anyone who thinks that the FAA’s completely unconstitutional and un-American ban on commercial drone operations without FAA permission is its way of protecting “privacy” and air traffic is seriously misled. Neither is in jeopardy, and the former is none of the FAA’s business in the first place. It was the only way the USG could keep the citizens of this country from developing something that could become the most effective line of defense against it that we’ve ever had.

          5. Bob-1, in 1792, privateers owned cannon-armed warships. Letters of marque and reprisal, anyone?

            Seems the Second covers much more than it excludes.

        2. For starters, we put an end to the aggressive and defiant ignorance about, and irrational fear of, guns that media, politicians, and public schools like to promote.

          As for how any of that applies to drones, we all await your articulate and detailed explanation.

      1. Make the punishment fit the crime. Most criminals are not idiots. One who can reprogram a drone without leaving a trace will be pretty rare. A fingerprint inside the box of circuitry, or DNA on the structure, would get most of them.

  3. “And not just weapons of war, but of domestic assassinations?”

    Well, obviously the solution is to allow everyone to have their own heavily armed drone with an automatic fire-suppression system – like that Israeli system that, on hearing anything that sounds like a rifle shot, instantly returns fire.
    🙂

  4. I think a driverless car would be more useful for drug deliveries. Anonymizers and VPNs should make it virtually impossible for the police to figure out who dispatched it. Transfers from the supplier’s personal vehicle to the driverless car can occur on a random street or parking lot that isn’t monitored by cameras, so the car’s routing data wouldn’t reveal anything about the supplier except the location of the parking lot he used. An undercover cop, posing as a buyer, would find himself arresting a driverless car with no way to trace it to “Mr. Big.”

    1. The car itself would have cameras.

      What you describe also requires a high-trust relationship between buyer and seller. So, it would work great in places where drugs are legal, and it would keep impaired delivery drivers off the roads. You don’t have to be an addict to work at a lot of these companies, but they do prefer it.

  5. Rand, in a free society it will be a very tough challenge. Technology that can do a great deal of damage at a distance is getting easier to implement. So much so that it is already within the skillset of a dedicated hobbyist to make a drone that could potentially be a very effective weapon.

    This is where the totalitarian regimes have it easy: simply outlaw tinkering with such technology and make public examples, with severe punishment, of anyone who dares break the rules.

    For those of us who would abhor such restrictions on the freedom to experiment and tinker (and the benefits society gets from such individuals innovating), the real answer lies where it ultimately always does: work on creating a truly tolerant society (and not the leftist version of faux tolerance) in which everyone has a stake in preserving the peace. A society strong enough that even when attacks do take place, we do not reflexively grasp at humanity’s darker tendency to seek totalitarian solutions to complex problems. This seemed to work pretty well for much of this country’s history, but I think we have somehow dropped the ball.

  6. The same thing that keeps cars and trucks from being used now. You hardly need a self-driving vehicle; simply set a timer and walk away.

    A more interesting question is why the last big truck bombs were 93 and 95. Did all the would-be bombers since buy their bombs from the FBI?

    I think part of the answer is that it’s actually pretty hard to build a big bomb without access to military or industrial explosives. The people wanting to do it aren’t the sharpest knives in the drawer, thankfully.

    1. Bombs are such a quaint technology. By the time we have fully-autonomous drones capable of assassinations, there’ll be much more dangerous tech that’s much easier to make. Drones flying around dispersing weaponized, aerosol Ebola, for example.

      Or, even today, there’s this:

      https://www.youtube.com/watch?v=07rtBip9ixk

      A swarm of flamethrower drones with basic targeting software would make a heck of a mess.

    2. “A more interesting question is why the last big truck bombs were 93 and 95. Did all the would-be bombers since buy their bombs from the FBI?”

      I think it is due to the stringent controls that have been placed on the sale of red LED time displays, necessary for any good bomb to warn would-be-disarmers that they have only 5 seconds left…

    3. Commercial fertilizers were used in the 95 bomb; even without deliberate weaponization, they can make a heck of an explosion if handled or stored improperly, viz. the Texas City disaster of 1947 or the West Fertilizer Plant explosion of 2013.

      After the 95 bomb, the Federal government instituted an awareness program among fertilizer sellers, generally small agricultural product companies in small communities where the usual customers are all known personally to the sellers. Anomalous purchases would tend to stick out: someone who is not local to the community, not known to have a farm, and often doesn’t look or talk like a typical farmer. So easy access to large amounts of fertilizer for potential bomb-makers was cut off, and the only fertilizer explosion in the US since then has been at the plant of a careless company that forgot actions have consequences.

      1. An awful lot of dry fertilizer has been replaced by liquid ammonium sulfate. It’s produced at the fertilizer warehouse by reacting anhydrous ammonia with sulfuric acid by the tank-car load, in a skid-mounted plant that moves from place to place. Pumping and spraying is easier than shoveling and spreading.

        The ammonium nitrate that is still sold for fertilizer is adulterated with something I can’t remember that makes it useless for bombs. ANFO is a low-velocity explosive and works best if it is confined, hence its main use in rock blasting. McVeigh, I remember, used nitromethane racing fuel instead of diesel fuel.

        As I said, making a large bomb explode on demand without some sort of confinement or real explosives is non-trivial.

  7. I had an interesting chat with a good friend who has experience writing software for autonomous vehicles (AVs). When I asked him how the software handles the horizon effect when it comes to ethical driving behavior, he stated that preserving human life and preventing injury was always the priority, and that the software should be coded appropriately. I then proposed the following dilemma of situational ethics.

    Suppose I’ve developed driving software for autonomous semi-tractor trailers. That software uses its superior sensors to detect human beings in its path and makes every attempt to avoid contact. Now, after a few years of these being on the road, bored teenagers (as bored teenagers will) invent a new game of chicken called “Wreck The Truck”: they wait for an oncoming autonomous semi on a fast highway and jump in front of it, knowing there is no risk of harm to themselves, then watch the truck go completely out of control to avoid hitting them. Normally “good fun,” causing only property damage that may or may not be prosecuted once the teenagers flee the scene and there are no witnesses. But there is always the exception. Let’s say in the process the truck jackknifes into an oncoming lane and kills a family in another vehicle.

    So which would be more ethical? Keep the software as is and risk encouraging other games of chicken, or reprogram it so that the vehicle remains under control at all times, even if that means a teen gets plowed over when he can’t move his ass out of harm’s way in time, thus discouraging such future games? (The red-smear-and-no-tire-marks argument for avoiding injury to innocent third parties.)

    I even tried to give my friend an ethical out by suggesting that AV sensors be de-rated so that they cannot be “better than human,” giving the AI at least the ethical consideration that it could only do as well as a human driver would have been able to under the same circumstances. My friend rejected this out of hand, saying no one would purposely de-rate AV sensors in this fashion because it would prevent the software from improving. It’s a really interesting topic that is fast becoming relevant in today’s world.
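    To make that trade-off concrete, here is a minimal, purely hypothetical sketch of how such a policy choice could surface in planner code. Every name, weight, and probability below is invented for illustration; no real AV stack reduces the decision to a three-term cost function.

    ```python
    # Hypothetical illustration only: a toy cost function for choosing between
    # maneuvers in the "Wreck The Truck" scenario described above.
    from dataclasses import dataclass

    @dataclass
    class ManeuverOutcome:
        pedestrian_collision_prob: float  # chance of striking the person in the lane
        loss_of_control_prob: float       # chance of jackknifing / leaving the lane
        third_party_harm_prob: float      # chance of harming occupants of other vehicles

    def maneuver_cost(o: ManeuverOutcome,
                      w_pedestrian: float = 1.0,
                      w_third_party: float = 1.0,
                      w_control: float = 0.2) -> float:
        """Lower is better. Raising w_control relative to w_pedestrian is the
        'keep the truck under control at all costs' policy; lowering it is the
        'avoid the pedestrian at all costs' policy."""
        return (w_pedestrian * o.pedestrian_collision_prob
                + w_third_party * o.third_party_harm_prob
                + w_control * o.loss_of_control_prob)

    # Two candidate responses when someone jumps in front of the truck:
    hard_swerve = ManeuverOutcome(0.05, 0.60, 0.20)    # misses the teen, may jackknife
    brake_in_lane = ManeuverOutcome(0.40, 0.05, 0.02)  # stays controlled, may hit the teen

    best = min((hard_swerve, brake_in_lane), key=maneuver_cost)
    ```

    With these invented numbers the planner swerves; give loss of control (and the third-party harm it implies) more weight and it brakes in its lane instead, which is exactly the policy choice being debated here.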

    1. Actually, your scenario is played out many times a day now. The most common form is the car that races to cut in ahead of a truck in the right lane and then hits the brakes to make the exit. If you’re behind or in the oncoming lane, better hope the truck driver saw him coming and started to ease off. Any time brakes are applied suddenly, there is a chance of jackknifing.

      I remember many times driving past jackknifed semis during perfectly clear, dry weather on good surfaces, and I will bet that something like this is the cause of most of them.

  8. Your hypothetical is much more plausible than those contrived situations where the robot truck has to select between swerving into a wheelchair-bound disabled veteran in the crosswalk or a van occupied by nuns in an adjoining lane.

    Your situation was the central plot point of Asimov’s “The Naked Sun”, where a robot subject to Asimov’s Laws against harming humans is tricked by a human into acting as a murder weapon. Any kind of automation can be “hacked”, and your hypothetical teen prank is as much a version of hacking as uploading a computer virus.

    There are a couple more considerations about your scenario. One, do you not think that the robot truck would at least have a really good version of anti-lock brakes and stability control, so that it could apply full braking force without jackknifing? On the other hand, this truck with super-human driving abilities stands to get rear-ended by human drivers in this situation.
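    For what it’s worth, here is a minimal sketch of what such stability-aware braking logic might look like; the threshold, floor value, and sensor name are all invented for illustration, and real tractor-trailer stability control is far more involved.

    ```python
    # Hypothetical sketch only: back off the brake command when the trailer is
    # starting to swing out relative to the tractor (an incipient jackknife).

    def brake_command(requested_decel_mps2: float,
                      articulation_rate_deg_s: float,
                      max_safe_articulation_rate: float = 8.0) -> float:
        """Return a deceleration command, reduced when the trailer starts to swing."""
        if abs(articulation_rate_deg_s) <= max_safe_articulation_rate:
            return requested_decel_mps2  # trailer is tracking; brake as hard as requested
        overshoot = abs(articulation_rate_deg_s) / max_safe_articulation_rate
        # Scale braking down in proportion to the overshoot, but keep slowing the rig.
        return max(requested_decel_mps2 / overshoot, 1.0)
    ```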

    On the other, other hand, teens wanting to do mischief can more easily drop rocks onto cars from an overpass, as many of them have done, and what do you do about that apart from security cameras and the criminal justice system? Wouldn’t this robot truck have really hi-res video of the persons engaged in the Chicken Game, with a real-time link to the police? The teens would end up in jail for trying their stunt even once, long before it resulted in the harm of which you speak.

    1. A friend of mine with a Tesla is convinced the car can read speed limit signs to know the speed limit on a particular section of highway. However, I’m not convinced. If this were the case, what would prevent someone from posting a realistic-looking 5 MPH sign on a section of highway and laughing as the Teslas screech to a crawl, causing a traffic jam? Humans would simply ignore such an absurd sign, but how would an autonomous car know?

      1. More likely that information is in the database it uses, along with things like where highway construction has the road torn up.
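        A plausibility check along those lines would also defeat the prank sign; here is a minimal sketch, with the function name, thresholds, and map-lookup idea all hypothetical rather than how any particular car actually works.

        ```python
        # Hypothetical sketch: trust a camera-detected speed-limit sign only when it
        # roughly agrees with the map database for that stretch of road.

        def accept_detected_limit(detected_mph: int,
                                  map_limit_mph: int,
                                  max_drop_mph: int = 20) -> int:
            """Return the speed limit to obey, ignoring implausible sign readings."""
            if detected_mph < 5 or detected_mph > 85:
                return map_limit_mph   # outside any plausible posted limit
            if map_limit_mph - detected_mph > max_drop_mph:
                return map_limit_mph   # a fake 5 MPH sign on a 65 MPH road is ignored
            return detected_mph        # believable; honor the sign

        assert accept_detected_limit(5, 65) == 65    # the prank sign is rejected
        assert accept_detected_limit(55, 65) == 55   # a real reduction is honored
        ```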

Comments are closed.