Who is responsible for a self driving car getting pulled over

I say, blame the car. ;) BUT from the stats I've seen relative to Google's cars I think going too slowly is going to be the biggest infraction ;);)
 
Holding the owner responsible would kind of be like blaming someone for eating something that gave them food poisoning.

The entire idea of a self-driving car is - SHOCK! - you don't have to drive it. So the manufacturer has to be responsible. That will probably slow adoption because of cost, but there's no way an average person could justify a self-driving car if they face jail time and/or bankruptcy (from lawsuits) if it malfunctions.

On the other hand, assuming these cars can be overridden to drive manually (certainly in early versions), speeding and moving violations have to be assumed to be the owner's fault while driving in manual, because the programming shouldn't get it wrong. And, theoretically, if it was a programming malfunction, you could prove it with a log showing the car was self-driving at the time of the infraction.
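
Hypothetically, that log could be as simple as a record of mode changes. Here's a minimal sketch in Python, with all names and timestamps invented purely for illustration, of checking which mode the car was in when a citation was issued:

from datetime import datetime

# Hypothetical drive-mode event log recorded by the car: (timestamp, mode) pairs.
# "AUTO" = self-driving, "MANUAL" = human has taken over.
drive_log = [
    (datetime(2016, 1, 12, 8, 0), "AUTO"),
    (datetime(2016, 1, 12, 8, 17), "MANUAL"),  # driver grabbed the wheel
    (datetime(2016, 1, 12, 8, 25), "AUTO"),
]

def mode_at(log, when):
    """Return the drive mode in effect at a given moment (latest entry not after it)."""
    mode = None
    for timestamp, m in log:
        if timestamp <= when:
            mode = m
        else:
            break
    return mode

# Was the car driving itself when the infraction was recorded?
print(mode_at(drive_log, datetime(2016, 1, 12, 8, 20)))  # -> MANUAL, so it's on the driver

If that lookup came back AUTO instead, the same log would be the evidence that the programming, not the owner, made the mistake.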
 
I have to disagree with kodiak to a point. I think you hold the owner/passenger responsible. I would assume these things have some sort of override. Though you're not driving, there should still be some accountability there for you to pay attention to what the car's doing, your surroundings, etc.

Once these things get to the point of driving at highway speeds, I would fully place the blame on the person "responsible" for the car at that time for not, let's say, slamming on the brake override to avoid a collision when some animal or something darts into their path and the car's sensors fail to register it.

If we were to reach a point where ALL cars were autonomous and communicated with each other in real-time, then I could see blaming the manufacturer when the vehicle fails.

At this point in the game, Google has required what I think I heard one article call "human overseers." If you're the overseer, I would argue that you're responsible for ensuring that the car is following the rules and laws of the road.
 
You may be right, initially. But everything you described goes against the entire concept of a self-driving vehicle. The cars are DESIGNED to react to avoid everything you're concerned with, and faster and more effectively than human drivers. If that weren't the case, they'd have no business being on public roads.

I think the ability to override the controls, at least quickly enough to react to a potential accident, would ultimately be less safe. And I can guarantee the law, potential victims and the manufacturer will blame driver intervention just as often and easily for contributing to or causing the accident.

The owner's liability will begin and end with proper maintenance. If I'm responsible for driving this vehicle while not actually driving it, why in the world would I ever buy it? That literally takes away the entire purpose of self-driving cars.
 
First, we all know how this is going to go. Someone is going to set the autopilot... self-driving mode, crawl over to either the passenger side or the back seat and take a nap, the car is going to crash, people will be severely injured and/or killed, there will be lawsuits, the government will set regulations, and manufacturers, not wanting to lose their heads over taking the blame, will put a sticker on it that says "drivers are responsible for cars in self-driving mode."

Just like with autopilot on an aircraft, setting it should not remove the responsibility from the human driver. Tech is a good tool for providing relief, but tech is known to fail. Hey, I am sympathetic to the guy (or gal) pulling the graveyard shift, and I have personally been in that situation where I fell asleep while passing on a highway (thankfully I woke up and no one was hurt or killed, and no damage was done to the car). I would love to have tech in place that takes the decision out of the hands of a person who is impaired, whether through a long night of work or a long night of partying. But at the end of the day, we are not willing to get on a plane where no one is in the cockpit, so I doubt people are going to accept that the car is the culprit. But then again, @kodiak799 makes a good point: what is the point of having a self-driving car?

You can bet car manufacturers, law makers, and engineers are putting this into the equation.
 
If the owner is liable, these cars simply won't sell. And that would be a failure to move forward if these cars are ultimately safer, or even as safe, as the average driver. I bet you wouldn't even want autodriving safety features that could malfunction or adversely affect your ability to control the vehicle.

Taking over to react is a huge problem - even with your hands on the wheel and foot on the brake, by the time you are reacting, the auto-drive is already reacting. Thus even an experienced driver is going to be completely inexperienced at operating a vehicle that is already self-maneuvering. This would actually make the vehicles less safe. For legal, liability, and safety reasons you can't have on-the-fly overrides. It's different for these prototypes because the drivers are extensively experienced and trained.

The only middle ground is if these cars are safer and thus insurance companies are willing to cover the risk. But you'd still have to protect the owner from criminal liability.

The problem with your airplane example is that we as passengers can afford and WANT to pay a pilot, just like we can choose to hire a car driver or taxi. What this suggests is that people want the ability to override the controls (which we can't do with airplanes), but that doesn't mean they want to be responsible for the autopilot on their car any more than they would on an airplane or bus.
 
You're responsible now if you wreck your car and are found to be at fault. It's the same thing to me. Having the car drive itself is an aid or a tool to me, not something to completely take over for common sense.

Let's look at this from another angle that's been debated recently. If the car works as it's supposed to and it can read situations... Let's say you're in your driverless car and it's taking you to work, and you get into a situation where the car can either hit another car or a group of people in a crosswalk, or it can crash you into a guardrail. In this scenario, someone has to die. It's either you, or multiple other people. It's been tossed back and forth whether the car should choose to kill you (1 life) or multiple other people if there is absolutely no other option.

So, let's say these cars are programmed to kill the least amount of people in a scenario like that. Let's also say that we all know this when buying one of these cars. We also all know that there's an ability to override.

Now, let's say you find yourself in a situation like I've described above, but instead of letting the car make the decision, you decide and act. You override the car's programming and kill 4 people to save your own life.

In that situation, would you blame the driver/passenger of the car for overriding the car's programming and killing those people?

Keep in mind, for this scenario, there was no other option. Whatever happened, you found yourself suddenly in this situation where it was your life or these other people's lives and there wasn't anything else that could happen. Maybe you came around the corner and the car was going the proper speed, but there's this crosswalk and these people decided to go even though the sign said not to, or they're jaywalking. Either way, are you to blame for killing those people, knowing that everyone knows the car should've killed you based on its programming?

Because I don't know how you couldn't hold that driver responsible for that scenario. And I can't see the difference between choosing to act and override the car's programming, resulting in death for other people and choosing NOT to act, letting the car have complete control, and getting in some other accident where your car kills other people.

Maybe it's just me.
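
For what it's worth, the policy being debated here ("fewest casualties unless a human overrides") comes down to a few lines of decision logic. A toy sketch, with entirely made-up option names and numbers, just to make the dilemma concrete:

# Toy model of the debated policy: pick the outcome with the fewest expected
# casualties, unless a human override replaces the car's choice.
# Everything here is invented for illustration.
options = {
    "swerve into guardrail": 1,   # the occupant is at risk
    "hit crosswalk group": 4,     # the pedestrians are at risk
}

def choose(options, human_override=None):
    if human_override is not None:
        # The liability question: the human's choice replaces the car's.
        return human_override
    # The car's programming: minimize casualties.
    return min(options, key=options.get)

print(choose(options))                                        # -> swerve into guardrail
print(choose(options, human_override="hit crosswalk group"))  # -> the override wins

The code is trivial; the whole argument above is about who answers for that second case.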
 
WOW, you chose a 1-in-100,000,000 scenario there. Good point, but just as with anything in life, we are not going to have an answer for everything. In your scenario you would be liable because you tampered with the car, and the lawyers of that billion-dollar car maker will point out that because you tampered with it you took away its safety protocol; whether that is correct or not, you are dead and the lawyers are not.

At the end of the day, I have a feeling that before we see self-driving cars become a mainstay, there will be legislation governing which party will be responsible for what. If the car manufacturer is taking liability, you can bet you will be driving at or below the speed limit and will not be passing anyone, especially on a 2-lane highway.

@kodiak799 makes a good point: why would I buy it if I am liable? Here is the answer to that: you may not buy it personally, but they will get people to buy it. There is a reason they spend billions upon billions on marketing. People in marketing study human behavior and know how to put a certain product in a certain movie driven by a certain actor.

Heck the floss factor alone is what is going to make people buy.

 
It's not that slim of a chance for a scenario like that to happen. I may have gotten too detailed, but I read an article a while ago that posed the question, should your car kill you instead of multiple other people if those are the only 2 options. Someone pulls out or runs out in front of the car without enough time for it to stop, there's a dropoff on one side & a car in the lane next to you. Does it smash into them or take you off the cliff?

I agree that you'd be responsible for choosing to override the car's safety features, but my question still remains. How can you be responsible for choosing to override those features & causing an accident (whether anyone dies or not) & not be responsible for the car crashing into someone when you could've overridden the system & avoided it?

 
So, let's say these cars are programmed to kill the least amount of people in a scenario like that. Let's also say that we all know this when buying one of these cars. We also all know that there's an ability to override.

I don't really see that as a realistic scenario. In theory, the car could make those calculations and choose the best probable course of action. In reality, the car is going to react to avoid an accident - just as almost every person would naturally do in almost every scenario - and rely on safety features to protect the passenger.

I'll say it again - if these cars aren't better drivers than us, then they don't belong on the road. And if they are better drivers... then why are we debating whether people should be liable for letting the better driver handle the situation?

And you guys aren't considering redundant systems, early warning systems and fail safes that MUST be part of the equation. If you're talking a driver needing to recover from a catastrophic failure, then I think we may disagree about what a catastrophic failure is.
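
Redundancy of the kind being described usually looks something like independent sensors voting, with a fail-safe fallback when they can't agree. A rough sketch, not any manufacturer's actual design, just to show the idea:

# Rough sketch of sensor redundancy with a fail-safe: three independent
# obstacle detectors "vote", and if they cannot agree the car falls back
# to a safe action instead of guessing. Purely illustrative.
def obstacle_ahead(readings, quorum=2):
    """Majority vote across redundant sensors; None means no agreement."""
    votes = sum(1 for r in readings if r)
    if votes >= quorum:
        return True
    if len(readings) - votes >= quorum:
        return False
    return None  # sensors disagree or too few healthy sensors

def decide(readings):
    result = obstacle_ahead(readings)
    if result is True:
        return "brake"
    if result is None:
        return "slow down and alert the occupant"  # fail-safe, not a guess
    return "continue"

print(decide([True, True, False]))  # -> brake (one faulty sensor is outvoted)
print(decide([True]))               # -> slow down and alert the occupant

The point is the same one I'm making above: a single sensor missing an animal shouldn't leave the human scrambling for the brake; the system is supposed to degrade gracefully before it ever gets there.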
 
@kodiak799 makes a good point, why would I buy it if I am liable. Here is the answer to that; you may not buy it personally, but they will get people to buy it. There is a reason they pay billions upon billions on marketing. People in marketing study human behavior and know how to put a certain product in a certain movie driven by a certain actor.

I didn't mean literally no one would buy it, just not enough for production to be profitable. And if your insurance doesn't cover it, you can't legally drive it in public no matter how much money you have to throw away.

Otherwise, this is limited to self parking cars. And so far I haven't heard of anyone going to jail for sleeping while their car parallel parks and hits someone.

A self-driving car that malfunctions might as well have an exploding fuel tank. They aren't going to be able to pass the buck and still get regulatory approval to put these on the roads.
 
If the car manufacturer is taking liability you can bet you will be driving at or below the speed limit and will not be passing anyone, especially on a 2 lane highway.

Yeah, but the Pushy-rom will lay on the horn and flash the lights until they speed up or pull off.
 