US probes crash involving Tesla that hit student leaving bus

-: Tesla is one manufacturer leading the full self-driving arms race, which remains tied up in legal red tape. So it's no surprise that these driverless systems have taken the automotive industry by storm, but how much can we trust them? Elon Musk announced just last year that Tesla would make its Full Self-Driving beta available to all paying customers, a claim he's been making since 2015. Drivers can gain access by paying either a $15,000 lump sum or a subscription fee of $199 a month. And that's not to forget that Tesla has continually struggled with quality-control issues; one of the most recent involved steering wheels falling off. That being said, Tesla isn't the only automaker that's been promising self-driving is just around the corner. Most of these driverless systems are classified as Level 2, meaning the technology still relies on the driver to intervene when necessary. Level 3 self-driving is a target many automakers are looking to implement, but that level of autonomous driving isn't currently legal in the United States.

-: The problem with the lifeguarding moment in cars versus planes is, look, very little goes wrong in seconds on planes. Even if you have a wing fall off the aircraft, you have minutes to figure this out. Unfortunately, in cars, just with the proximity of other cars and the speeds at which you're going, you just have seconds. And the real problem with autonomy in cars today, especially if you're relying on the human to do any kind of lifeguarding, is we get distracted easily, we love our phones, we love to do anything else but drive, especially during long-distance trips, and we're not gonna pay attention. And the instant we're not paying attention and divert our attention, "Ah, I just wanna get something in the back seat," that's the moment that Tesla Autopilot or BlueCruise or Super Cruise decides, "Oh, I can't handle this situation. I'm gonna give you back control. It's time for your lifeguarding moment." And you're like, "What, what, where am I?" And that's why we see a lot of crashes in these modes.

-: While full autonomy on the freeway sounds great, driving in the city presents a much bigger challenge, with pedestrians and particularly cyclists, whose unpredictability makes them the kryptonite of these driverless systems.

-: The problem with naming systems "full self-driving" when they don't actually do anything even remotely close to full self-driving is that it makes people overconfident in what a system can do. And you know, just a brief perusal on YouTube will show you just how bad Full Self-Driving is in an urban environment. People have an expectation that these cars can, if not fully drive themselves, at least mostly drive themselves, and, "I have time to, I just dropped a French fry on the floor. I'm just gonna reach down and get that French fry that fell under the seat, and I'll have time to do that." And you know, if everything is going right, you do have time to do that. But if everything is going wrong and you're approaching a curve in the road, and the autonomy predicts that it can't negotiate that curve and has been programmed to alert you right then that you need to take over, you can see how people's complacency leads them down this primrose path, and then they're off the road and in a car crash, if not killed.

-: From July of 2021 to October of 2022, the U.S. Department of Transportation reported 605 crashes that involved vehicles equipped with advanced driver assistance systems. 474 of those crashes involved Teslas, and just last year a Tesla Model S caused a massive pileup on the San Francisco Bay Bridge after abruptly changing lanes and stopping in the middle of a busy highway. It raises the question of whether drivers are becoming too reliant on these systems.

-: That is just a good example of: there is brittleness inside the system, and not only do we not know where it is, we don't know when it's going to pop up. And this is the problem with phantom braking. We do not know why the computer vision systems detect obstacles that the human eye cannot see. Sometimes people think it's shadows, but that's not the only reason. That's the problem with machine learning. When you take a machine learning algorithm and apply it to a million images, it's looking for statistical correlations between the pixels in those images. And you don't actually know whether those pixels really follow the shape of a stop sign, or whether the system is finding some other statistical pattern we don't know about, just some random correlation, and then it learns that and sees that as a stop sign or a police car or whatever.

-: Issues like phantom braking are a big deal for self-driving systems, because we don't really know why they happen. Some manufacturers have turned to LiDAR as the solution, but it's important to note that it's not a magic fix-all. LiDAR works in a similar fashion to radar, but instead of radio waves it uses pulsed laser light to detect nearby objects. According to Cummings, it will improve the phantom braking issue, but it won't necessarily eradicate it.

-: Turns out, they don't really work very well with moisture in the air. Rain, even misty rain, is a problem. Even after it rains and there are puddles on the road, they can kinda create this sheen. LiDAR systems don't know if a puddle is one inch deep or a mile deep, you know, if you're about to hit the ocean. These problems are surmountable, or at least we can mitigate them if we had this additional data. But I just think there's still a lot of work to be done in the research community to figure out, you know, can we integrate LiDAR in a way that improves these vehicles, reducing the uncertainty in the environment enough to be at least a little better than your average human driver? So we hear this a lot: "All you have to do is be better than a human driver." If you wanna be better than your average driver, you also need to be able to understand, when you come up to an intersection, you being the autonomy, if you've got a police officer doing some kinda hand gestures, the cars have got to be able to understand that.

-: Testing is a crucial part of the development of autonomous vehicles. However, Tesla's method of real-world testing on public roads has long been controversial. That's not to say that BlueCruise and Super Cruise haven't been logging data of their own, but those systems can only be engaged on public highways. So why do we keep seeing accidents like these happen over and over again?

-: I think simulation is important and critical in early development of a self-driving car. But in the end, simulation cannot be a substitute for the real world, because the real world has things in it that, for example, in these neural nets, the systems are learning and that we don't yet know, and they're not going to be uncovered until it drives enough times, as well as all the potential problems in the environment, like LiDAR not being able to see across the sheen of the road, right? So there are just some things. And I think companies are finding out the hard way what the aviation industry knows: testing is hard and expensive and slows you way down.

-: There's no denying that these self-driving systems are incredible, but there's currently a big disconnect between the OEMs and the startups. While each party wants its own piece of the pie, Cummings says it would be much more beneficial for everyone to just work together.

-: I also think one of the big things that needs to change, especially for the Silicon Valley self-driving car world, is they just need to get better about embracing what it means to have a real safety culture. I think, because of the "move fast and break things" ethos, the idea that you're gonna take time and really dive in deep with safety is kinda antithetical to that crowd. There's a tendency for the new players that are Silicon Valley-based to deride or poke fun at or look down on the traditional OEMs for doing things the old way. I do think there are probably some ways to update the old ways. So neither one has the right answer; they kinda need to come together. The Silicon Valley startups really need to start understanding what those mature safety practices look like and understand that that's gonna hurt your bottom line. You're gonna have to have a safety program. If you keep doing this, if you keep killing people, then regulators are gonna step in, and then you'll have an even bigger problem on your hands. The regulators are also not evil. NHTSA does not exist to see companies go out of business. There's never a conversation I was privy to where somebody said, "You know, that company needs to be shut down." This just isn't happening. Even for companies that are struggling, for every recall that comes out, NHTSA is first and foremost thinking about public safety, but also trying to say, "Look, we are interested in keeping our American innovation spirit alive, so we wanna work with companies to help them through a bad time." Of course, it's easy for me as an academic to say it would be better for us to work together, but I think there are ways we can work together, ways you can work together without giving away the source code of your stack. But for right now, it's a perceived race to get the first self-driving robotaxi out.
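
The ranging principle Cummings describes can be sketched in a few lines: a lidar unit fires a laser pulse, times its round trip, and converts that delay into a distance. The Python snippet below is a minimal illustration of that arithmetic only (the function name is hypothetical, not any manufacturer's code); real automotive lidar layers intensity measurement, noise filtering, and the weather handling she says is still problematic on top of this.

# Illustrative sketch only: the time-of-flight arithmetic behind a single
# lidar range measurement. Real systems add filtering and weather handling.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def lidar_range_m(round_trip_s: float) -> float:
    # The pulse travels out and back, so the one-way distance is half the
    # round-trip time multiplied by the speed of light.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

# A reflection arriving about 200 nanoseconds after the pulse was fired puts
# the object roughly 30 meters away.
print(round(lidar_range_m(200e-9), 1))  # ~30.0

Rain, mist, and the road sheen mentioned in the transcript matter precisely because they scatter or absorb the returning pulse, corrupting the timing that this calculation depends on.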

U.S. road safety regulators have sent a team to investigate a crash involving a Tesla that may have been operating on a partially automated driving system when it struck a student who had just exited a school bus.

The National Highway Traffic Safety Administration said Friday that it will probe the March 15 crash in Halifax County, North Carolina, that injured a 17-year-old student. The State Highway Patrol said the driver of the 2022 Tesla Model Y, a 51-year-old male, failed to stop for the bus, which was displaying all of its activated warning devices.

Sending special investigation teams to crashes means the agency suspects the Teslas were operating on automated systems that can handle some aspects of driving, including Autopilot and "Full Self-Driving." Despite the names, Tesla says these are driver-assist systems and that drivers must be ready to intervene at all times.

A message was left Friday seeking comment from Tesla.

Tillman Mitchell, a student at the Haliwa-Saponi Tribal School in Hollister, had just exited the bus and was walking across the street to his house when he was hit, according to the Highway Patrol.

He was flown to a hospital with life-threatening injuries but was listed in good condition two days after the crash.

NHTSA has sent investigative teams to more than 30 crashes since 2016 in which Teslas suspected of operating on Autopilot or "Full Self-Driving" have struck pedestrians, motorcyclists, semitrailers and parked emergency vehicles. At least 14 people were killed in the crashes.

In March the agency sent a team to a Feb. 18 crash in which a Tesla Model S hit a fire department ladder truck in Contra Costa County, California. The Tesla driver was killed, a passenger was seriously hurt, and four firefighters suffered minor injuries.

Authorities said the California firetruck had its lights on and was parked diagonally on a highway to protect responders to an earlier accident that did not result in injuries.

The probes are part of a larger investigation by NHTSA into multiple instances of Teslas using Autopilot crashing into parked emergency vehicles that are tending to other crashes. NHTSA has become more aggressive in pursuing safety problems with Teslas in the past year, announcing multiple recalls and investigations.

NHTSA is investigating how the Autopilot system detects and responds to emergency vehicles parked on highways.

The agency wouldn't comment on open investigations, but it has been scrutinizing Teslas more intensely in the past year, seeking several recalls.

Tesla and NHTSA need to determine why the vehicles don’t seem to see flashing lights on school buses and emergency vehicles and make sure the problem is fixed, said Michael Brooks, executive director of the nonprofit Center for Auto Safety in Washington.

“I’ve been saying probably for a couple of years now, they need to figure out why these vehicles aren’t recognizing flashing lights for a big starter,” Brooks said. “NHTSA needs to step in and get them to do a recall because that’s a serious safety issue.”

Earlier this month the agency revealed an investigation of steering wheels that can detach from the steering column on as many as 120,000 Model Y SUVs. It's also investigating seat belts that may not be anchored securely in some Teslas.

NHTSA also has opened investigations during the past three years into Teslas braking suddenly for no reason, suspension problems and other issues.

In February, NHTSA pressured Tesla into recalling nearly 363,000 vehicles with "Full Self-Driving" software because the system can break traffic laws. The problem was to be fixed with an online software update.

The system is being tested on public roads by as many as 400,000 Tesla owners. But NHTSA said in documents that the system can take unsafe actions such as traveling straight through an intersection from a turn-only lane, going through a yellow traffic light without proper caution, or failing to respond to speed limit changes.

The U.S. Justice Department also has asked Tesla for documents about "Full Self-Driving" and Autopilot.
