"Cameras only" is a cost-cutting, profit-driven feature that is vulnerable to Wile E. Coyote attacks.
It is a shameful engineering design to leave out LIDAR and it has cost human lives.
Let's hope Musk does not leave out something important for the moon landing. His proposal for it is absolutely ridiculous, it looks like a children's book fantasy and many smaller top-heavy craft have already toppled on the moon!
What are wile e coyote attacks? Painting a tunnel entrance on a wall?
If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human? Especially because a human has a window to override FSD, but FSD doesn't really get a chance to override a human, except in limited scenarios like automatic emergency braking. And it gets more people using it by providing FSD at a lower cost?
>> What are wile e coyote attacks? Painting a tunnel entrance on a wall?
Yes!!! Thank you! Hopefully I will get credit for inventing this attack :)
>> If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human?
I don't think so, because it is fooled by simple things that could easily be prevented, and counting on a human to override is very risky: the human is simply not alert in passive mode.
I think cameras are great, but there is no excuse not to also use LIDAR.
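To make the "Wile E. Coyote" point concrete, here is a minimal toy sketch (hypothetical numbers, not any real vehicle's stack): a flat wall painted to look like a tunnel can fool image-based depth perception, but LIDAR measures geometry directly, so every beam returns from roughly the same distance.

```python
def lidar_sees_obstacle(ranges_m, stop_threshold_m=15.0):
    """Return True if the closest LIDAR return is within braking distance."""
    return min(ranges_m) < stop_threshold_m

# A real tunnel: beams travel far down the bore before returning.
open_tunnel = [80.0, 95.0, 120.0, 150.0]
# A painted wall: every beam bounces off the same flat surface ~12 m away,
# no matter what image is painted on it.
painted_wall = [12.1, 12.0, 11.9, 12.0]

print(lidar_sees_obstacle(open_tunnel))   # False: path is clear
print(lidar_sees_obstacle(painted_wall))  # True: brake
```

The point of the sketch is that the painted image only attacks the camera's interpretation; the range measurements are unaffected, which is why fusing the two sensors defeats this class of attack.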
> If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human?
That was the analysis when the industry was in its infancy. Now that the driverless-car industry has been operating for a decade-plus, a lot more work has to go into that argument before people will accept it, and it's not really clear that it pans out.
For example, today you can look at a car and predict how it's going to behave because you have a good model for how people drive. But let's say in the future driverless cars are much "safer" on paper than human drivers, but they behave very differently from them such that it's hard for people to predict their behaviors.
Now you've created a highly dynamic system in which you don't have a good model for all the actors, because some of them behave one way and others behave completely differently. Does this increase the overall safety of the system or decrease it, despite the new actors being statistically safer than the current ones?
I don't think you can say with great confidence what's going to happen just by looking at crash rates and comparing them to the current system. You're going to change the whole system by introducing large numbers of actors who "crash in different scenarios than a human".
What if I am a better-than-average driver? That's a very low bar for success anyway, and it's not going to fly with the general population, the law, and so on.
Tesla cars killing people would be all over the news every time, and nobody would care that a similar or even marginally smaller number of people would have died anyway. People simply expect far more in exchange for giving up control.
Yes but we now have very detailed topographical maps of the moon, and the GNC systems are way better. I expect they will carefully choose the exact landing spot and hit it within 10m or so. Certainly it lacks the heroism of eyeballing a good place to set down with 60 seconds of gas in the tank.
Yes, this was something that the industry figured out in 2007. But because Musk has a lot of money, people do whatever he says, no matter how ignorant and deadly and dangerous. He never even had a cogent rationale, just absurd amounts of money. The shame is so profound and widespread it's hard to fathom really.
The renderings look like the cover of a young-adult sci-fi novel by Robert A. Heinlein. Have Space Suit, Will Travel comes to mind. Probably the first true science fiction I ever read!
> It is a shameful engineering design to leave out LIDAR and it has cost human lives
I'm still waiting for a jurisdiction to either create a liability safe harbor for self-driving systems with lidar or outright ban cameras-only systems on public roads.
The introduction of self-driving technology at scale will inevitably result in a few accidents no matter how many sensors are used. It's the same with every new technology deployed in high-risk situations, including motor vehicles themselves.
Even malfunctioning airbags have caused fatalities. The important thing is to identify the issue early so the company can address it before more people get hurt, which the ODI in this case is thankfully doing.
His moon lander will deploy a parachute that keeps the lander suspended for as long as it takes for the AI to grok the fact that there is no air on the moon, and then it finally falls according to cartoon physics with a whistling sound you can hear through the vacuum.