I've been in this situation, and it's something that seems to get tweaked every week.
They definitely pushed the "aggressive" lever up a little over the last month.
I think it's one of those cases where it's sort of obvious it can calculate trajectories and speeds much better than a human, so it's a safe manoeuvre in theory, but it "feels" bad as a passenger.
Same thing for where it can see that it can squeeze through a tight "lane" but a normal human driver would probably wait until the oncoming traffic had passed.
It's a lose-lose situation. I've been reading conversations about self-driving cars for years, and back in the day what you always heard was that they were way too conservative on these kinds of turns and blocked traffic. Hell, even in this very thread people are saying that about Cruise. Yet when they make it more aggressive, people feel unsafe.
As you say, this is something they are constantly tweaking to find the perfect middle ground. That being said, maybe people are hyper-aware of the turns when it's a self-driving car, and wouldn't bat an eye or pay attention if it was a normal taxi driver?
People have an interesting “trust curve” regarding automation. At first they’ll be very suspicious and critical of it and any issues they have will be blamed on it. Then some point later when they’re used to the system, this attitude will suddenly flip around and they’ll start regarding the system as infallible. Sounds like this might be happening here.
Fair point. When I take a passenger in a Waymo and it is their first time, I'm hyper-aware of the driving. But for myself personally, I'm mostly tuned out; for me it started to feel "normal" as early as about the 4th ride. So they could probably subtly adjust the "middle ground" based on the passenger's experience and feedback (as long as it doesn't create inconsistencies for surrounding vehicles).
The thing is, the capabilities are not likely to be a strict superset of a human's.
So, if you make it have the same aggressiveness on "average" by having it take advantage of its superior capabilities, those situations are going to feel sketchy to a person.
You should be able to set the "aggressive" level. I believe Tesla has this option internally on their FSD Beta. Hopefully you will be able to set this level for robot taxis and personal vehicles like you can with (some) human drivers.
Call me cynical, but this feels like a way for tech companies to absolve themselves of responsibility.
“Your Honor, the pedestrian would have never been struck if the driver had set the system in its most conservative mode as we suggest on page 1,047 of Terms and Conditions.” /s
When a Tesla can accelerate to 60 mph in less than 2 seconds, knowing the instantaneous velocity and acceleration is not very meaningful. You really need to be able to predict what the acceleration profile will look like over the next N seconds of your maneuver. Holding the currently perceived velocity and acceleration constant over the next N seconds is one naive way to do that. But the actual set of possible trajectories of the other actor is much larger, and you need to drive more conservatively to account for that.
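To make the "naive" prediction concrete, here's a minimal sketch (my own illustration, not anyone's actual planner code) of holding the currently perceived velocity and acceleration constant and extrapolating the other vehicle's position, and why that underestimates what a high-performance car can actually do:

```python
def extrapolate_position(x0, v0, a0, t):
    """Naive constant-acceleration prediction: x(t) = x0 + v0*t + 0.5*a0*t^2."""
    return x0 + v0 * t + 0.5 * a0 * t ** 2

# A car that does 0-60 mph in ~2 s is pulling on the order of 1g (~9.8 m/s^2).
# If it's perceived as stationary (v0 = 0, a0 = 0), the naive model predicts
# it stays put; at full throttle it could actually cover ~44 m in 3 seconds.
# That gap between predicted and possible is what forces conservative driving.
predicted  = extrapolate_position(0.0, 0.0, 0.0, 3.0)  # 0.0 m
worst_case = extrapolate_position(0.0, 0.0, 9.8, 3.0)  # 44.1 m
```

The numbers (1g, 3 seconds) are illustrative assumptions, but they show how large the set of possible trajectories is compared to the single naive extrapolation.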
If your sensors can distinguish a Vespa from a sports bike reliably, compared to all the other things that an autonomous vehicle has to cope with programming it to treat those two as different categories of vehicle shouldn’t be particularly hard.
> at least to the precision that it would differ in the next 10 seconds.
10 seconds? 10 seconds is an eternity.
Some vehicles of similar size might be more than 1/8th of a mile apart in straight line performance in 10 seconds-- let alone the difference once we've got multidimensional vectors.
My point is-- vehicle dynamics only make a difference in the very short term, because after like a couple of seconds, vehicles can be almost anywhere relative to you even with low performance.
(But, they can be quite different on the timescale of a second).
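A quick sanity check on that 1/8th-of-a-mile figure (my numbers, not the commenter's): two vehicles starting side by side, where one simply holds 4 m/s^2 more acceleration than the other for the full 10 seconds, end up roughly an eighth of a mile apart.

```python
def distance(a, t):
    """Distance covered from rest at constant acceleration a over time t."""
    return 0.5 * a * t ** 2

# Hypothetical: one vehicle out-accelerates the other by 4 m/s^2 for 10 s.
gap_m = distance(4.0, 10.0) - distance(0.0, 10.0)  # 200 m
eighth_mile_m = 1609.34 / 8                        # ~201 m
```

Real vehicles hit traction and top-speed limits well before that, but the back-of-the-envelope math supports the point: over 10 seconds, performance differences swamp any precise dynamics model.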
it only takes 0.2 seconds to turn a motorcyclist into ground beef
If you can't tell apart a bicycle from a 4-cylinder racing bike, let alone a Vespa, that's what happens. And Tesla can't. It also can't read hand signals given by cyclists. It can't tell apart a donkey and a horse.
I've been saying that vehicle dynamics is useful information in the short term. So if you're trying to argue with me, I don't think you've understood my point.
If your goal was to interject an anti-Tesla offtopic comment to the general discussion of vehicle dynamics, it was unwelcome.
Humans are terrible at high-velocity estimates. That's one of the conditions described in the accident investigation report for that Irish plane which smashed a bunch of runway lighting due to insufficient take-off power.
A brand-new, top-of-the-line passenger jet would say e.g. "Caution: Acceleration", because it can work out the velocity of the plane, from there the acceleration, and with the remaining runway length (it uses GPS to identify the runway it's on) and the take-off speed conclude that we won't make it. Humans only decide that after it's far too late. Because the annunciation comes much earlier, it lets pilots abort the takeoff and investigate from the safety of the ground. In the Irish case they'd typed the wrong air temperature in, so the engine performance was much worse than expected; with the right air temp it would have flown just fine.
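The check described above boils down to simple kinematics. Here's a hedged sketch of the kind of comparison such a takeoff performance monitor could make (the real avionics logic and numbers are proprietary; everything below is illustrative): under constant acceleration a, getting from current speed v to rotation speed v_r takes distance (v_r² - v²) / (2a), which can be compared against the runway remaining.

```python
def takeoff_distance_needed(v, a, v_r):
    """Distance needed to accelerate from v to v_r at constant acceleration a (m)."""
    return (v_r ** 2 - v ** 2) / (2 * a)

def acceleration_caution(v, a, v_r, runway_left):
    """True if the plane will run out of runway before reaching rotation speed."""
    return takeoff_distance_needed(v, a, v_r) > runway_left

# Illustrative numbers: at 50 m/s, needing 75 m/s to rotate, 1500 m remaining.
# Healthy thrust (2.0 m/s^2) needs ~781 m: fine. Thrust set for the wrong air
# temperature might give only 1.0 m/s^2, needing ~1562 m: caution, abort early.
ok  = acceleration_caution(50.0, 2.0, 75.0, 1500.0)  # False
bad = acceleration_caution(50.0, 1.0, 75.0, 1500.0)  # True
```

The point of the comment stands: the system can run this comparison continuously from early in the roll, long before a human's seat-of-the-pants sense of "we're slow" kicks in.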
LOL. Tell race car drivers and truckers that they're inferior. Their senses are probably tuned just as finely as any vision system. You really discount non-quantitative measurements - as most tech people here do. You are wrong, though, that the best of the meat brains are so inferior.
It is also very important to consider the effect on other drivers.
A self-driving car might calculate that it can squeeze through a gap in oncoming traffic, but doing so will probably cause the human drivers of those cars to slam on the brakes and create a large crash.
So unless they’re driving on a road where only self driving is allowed they’ll probably need to be much more conservative than they can be.
Also thinking about it, the first roads to become robot-only driven will probably be inner city streets where pedestrians abound, so no aggressive driving there either.