Random Thread - Anything Goes
-
@dashrender said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
I do have a problem with this, considering that the owner of the car - let's say it's Scott's car - is not the programmer of the system, nor can he mod it. So it seems kinda unjust to claim that Scott is to blame if the car is in an accident that is the car's fault.
Which is why it doesn't work that way. The driver is at fault in an accident, not the owner. The owner, however, is liable for a stationary car.
so how do you solve the driver problem? Who is the driver in a driverless car? is it the manufacturer?
You still haven't answered this question, @scottalanmiller
I did. You asked if it was the manufacturer and obviously the answer is yes. They alone make the driving decisions.
-
@dashrender said in Random Thread - Anything Goes:
@scottalanmiller said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
@nerdydad said in Random Thread - Anything Goes:
@dashrender said in Random Thread - Anything Goes:
A big presumption with self driving cars is that they simply wouldn't violate the law. So it's unlikely that you would have a law breaking issue. It's more likely that you would have a mechanical or software failure that causes an accident.
So today, if you're driving down the street and your tire comes off and causes an accident, then you are of course at fault.
But what if the software is what's at fault? Let's take the recent Tesla that drove under a semi-truck. The software couldn't tell the difference between the truck and the sky, so it thought nothing was there.
Now - discounting that it's probably not legal that the driver was sleeping (again, let's just completely remove this and assume it's a time when this IS legal) - whose fault is this? Is it the owner's or Tesla's?
Good question. IF the owner was awake and aware of his surroundings, would he be able to override Autopilot and take over control if he saw that the Tesla was about to make a bad decision?
Today? Yes, he would.
Because it is not a self driving car, though.
It is self driving, just not fully autonomous. That's more when the steering wheel is removed.
That's not self driving. It's driving assist. Not the same thing. Any car will keep going if you take your hands off of the wheel, but they don't fully drive themselves.
-
@scottalanmiller said in Random Thread - Anything Goes:
That's not self driving. It's driving assist. Not the same thing. Any car will keep going if you take your hands off of the wheel, but they don't fully drive themselves.
Driving assist, in the same category as cruise control. But we've had cruise control for the last 20 years with no problems (except for a few Darwin-award recipients). The driver is still aware and controlling the car, but has just told the car what speed to keep driving at. Same goes for the standard versus the automatic transmission: we don't have to tell the car what gear to drive in; it does it automatically for us based on throttle and engine RPM.
At what point are we going from driver-assisted to fully autonomous? When the driver seat is the same as the passenger seat?
-
@nerdydad said in Random Thread - Anything Goes:
Driving assist, in the same category as cruise control. But we've had cruise control for the last 20 years with no problems (except for a few Darwin-award recipients). The driver is still aware and controlling the car, but has just told the car what speed to keep driving at. Same goes for the standard versus the automatic transmission: we don't have to tell the car what gear to drive in; it does it automatically for us based on throttle and engine RPM.
At what point are we going from driver-assisted to fully autonomous? When the driver seat is the same as the passenger seat?
Cruise control can't stop a car, can't change lanes - driver assist as it's called can and does.
If the programming had been able to tell that the semi-truck was a truck and not part of the sky, it likely would have stopped the car. These types of things make it very different from cruise control.
-
@dashrender said in Random Thread - Anything Goes:
Cruise control can't stop a car, can't change lanes - driver assist as it's called can and does.
If the programming had been able to tell that the semi-truck was a truck and not part of the sky, it likely would have stopped the car. These types of things make it very different from cruise control.
But you still see the evolution of the car. It's more common for cars to parallel park, to monitor lanes and blind spots for you, etc.
-
Wife just ordered the kids' and her breakfast from this thing.
Said it took all of 5 minutes and was very easy to do. Automating jobs already.
-
@nerdydad said in Random Thread - Anything Goes:
But you still see the evolution of the car. It's more common for cars to parallel park, to monitor lanes and blind spots for you, etc.
OK, parallel parking is autonomous driving - can you still interfere? Probably: grab the wheel and the car will likely stop. But blind spot sensors in most cars today do nothing but sound an audible tone to alert the driver.
In the case of the Tesla, the sensors (not really the blind spot sensors, just sensors in general) will change lanes when in driving assist mode (so I understand, and I will accept a correction if my understanding is false).
-
@dashrender said in Random Thread - Anything Goes:
In the case of the Tesla, the sensors (not really the blind spot sensors, just sensors in general) will change lanes when in driving assist mode (so I understand, and I will accept a correction if my understanding is false).
Not saying that your understanding is false. I thought I had a point, but it has more likely just dwindled into oblivion, as I don't remember what it was.
-
Scott is doing too many vids to keep up with, lol. Need an app to auto-download new stuff to my phone for offline viewing... Just don't have time to look at the moment... job for next week.
-
@hobbit666 said in Random Thread - Anything Goes:
Scott is doing too many vids to keep up with, lol. Need an app to auto-download new stuff to my phone for offline viewing... Just don't have time to look at the moment... job for next week.
I've not seen one of those. Let us know what you find. I've seen them for desktop, but not for phone.
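For anyone who'd rather script it than wait for an app, here's a minimal sketch using yt-dlp's Python API. yt-dlp is a real tool, but the channel URL and file paths below are placeholders, and you'd want to check YouTube's terms of service before relying on this:

```python
# Minimal sketch: fetch any new videos from a channel for offline viewing.
# Assumes yt-dlp is installed (pip install yt-dlp); the URL is a placeholder.
from yt_dlp import YoutubeDL

CHANNEL_URL = "https://www.youtube.com/@example/videos"  # hypothetical channel

options = {
    "format": "best[ext=mp4]",              # single mp4 file for easy phone playback
    "outtmpl": "videos/%(title)s.%(ext)s",  # save under a local videos/ folder
    "download_archive": "downloaded.txt",   # record IDs so reruns skip old videos
    "ignoreerrors": True,                   # skip unavailable videos instead of aborting
}

with YoutubeDL(options) as ydl:
    ydl.download([CHANNEL_URL])
```

Run it on a schedule and the archive file means only new uploads get pulled each time; then copy the videos/ folder onto the phone for offline viewing.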
-
Does YouTube now allow offline viewing?
-
@nerdydad said in Random Thread - Anything Goes:
Driving assist, in the same category as cruise control. But we've had cruise control for the last 20 years with no problems (except for a few Darwin-award recipients). The driver is still aware and controlling the car, but has just told the car what speed to keep driving at. Same goes for the standard versus the automatic transmission: we don't have to tell the car what gear to drive in; it does it automatically for us based on throttle and engine RPM.
At what point are we going from driver-assisted to fully autonomous? When the driver seat is the same as the passenger seat?
But there HAVE been problems with people putting their car on cruise control and going to sleep. I remember a case in the 1980s, out west in the desert: the car went straight on its own for so long that they thought it was like "auto pilot" and that the car was driving itself. It wasn't. It eventually drove off into the desert. They lived; it was funny.
But it's the same as the assist today. It's not in charge. It's just cruise control plus. The driver is still the driver and in charge of all decisions.
-
@dashrender said in Random Thread - Anything Goes:
Does YouTube now allow offline viewing?
With a paid subscription.
-
@nerdydad said in Random Thread - Anything Goes:
At what point are we going from driver-assisted to fully autonomous? When the driver seat is the same as the passenger seat?
That will certainly happen. But really it's when no one needs to sit in the driver's seat. As long as a driver is required, it's not self driving. Once a driver is not required, or not even possible, it's self driving. The difference is whether the car does the driving or the human does. Right now, the human is always in charge - always. The car can recommend things and do some actions on its own, but the human is always responsible for them being the right ones (and can always override them).
-
@scottalanmiller said in Random Thread - Anything Goes:
@hobbit666 said in Random Thread - Anything Goes:
Scott is doing too many vids to keep up with, lol. Need an app to auto-download new stuff to my phone for offline viewing... Just don't have time to look at the moment... job for next week.
I've not seen one of those. Let us know what you find. I've seen them for desktop, but not for phone.
What desktop app?
-
@dashrender said in Random Thread - Anything Goes:
Cruise control can't stop a car, can't change lanes - driver assist as it's called can and does.
If the programming had been able to tell that the semi-truck was a truck and not part of the sky, it likely would have stopped the car. These types of things make it very different from cruise control.
No, but cruise control can speed up and slow down the car. Both are a form of driver assist; one just has more features. It's a slow addition of more and more features over time. But none of that matters till we switch over to driverless. There is no gradual there - you actively have to switch who is in charge of driving the car.
-
@scottalanmiller said in Random Thread - Anything Goes:
But there HAVE been problems with people putting their car on cruise control and going to sleep. I remember a case in the 1980s, out west in the desert: the car went straight on its own for so long that they thought it was like "auto pilot" and that the car was driving itself. It wasn't. It eventually drove off into the desert. They lived; it was funny.
But it's the same as the assist today. It's not in charge. It's just cruise control plus. The driver is still the driver and in charge of all decisions.
Crazy stories like this exist everywhere. One crazy person (one person lacking understanding) does not a reality make.
-
@nerdydad said in Random Thread - Anything Goes:
But you still see the evolution of the car. It's more common for cars to parallel park, to monitor lanes and blind spots for you, etc.
Right - we have actually had cars assisting a little more each year over the last decade. But it is still all assisting.
-
@scottalanmiller said in Random Thread - Anything Goes:
But there HAVE been problems with people putting their car on cruise control and going to sleep. I remember a case in the 1980s, out west in the desert: the car went straight on its own for so long that they thought it was like "auto pilot" and that the car was driving itself. It wasn't. It eventually drove off into the desert. They lived; it was funny.
But it's the same as the assist today. It's not in charge. It's just cruise control plus. The driver is still the driver and in charge of all decisions.
I heard a similar story, with a Winnebago I believe. A guy had just bought a Winnebago with cruise control, drove it off the lot, put it on cruise control, and went to the back to sleep. It drove straight for a couple of miles until the road veered off and the Winnebago drove off of the road. As far as I know nobody was injured, but the guy attempted to sue and lost the lawsuit.
-
@nerdydad said in Random Thread - Anything Goes:
Wife just ordered the kids' and her breakfast from this thing.
Said it took all of 5 minutes and was very easy to do. Automating jobs already.
We've been using those for a long time. McDonald's is SO much better with them.