Safety has long been a staple of automaker design and innovation. After all, most drivers will opt for a vehicle with added safety features, and US buyers have shown they are more than willing to fork over extra cash for them. Whether the manufacturer's ultimate goal is a safer automobile or just a grab for expanded market share, the safety of modern cars continues to improve. So what happens when we are no longer the ones operating the vehicle and are just along for the ride?
That’s the question many have been asking, whether they are general consumers, automakers themselves, or government regulatory agencies. Manufacturers are now tasked with ensuring the safety of drivers when they are operating (notice we didn’t say driving) an autonomous or self-driving vehicle.
How Can We Make Self-Driving Cars Safer?
Automakers are using small cameras and sensors to track nodding heads, along with steering-wheel displays that can sound audible warnings, to keep drivers focused when using self-driving features. These measures are a response to concerns that have come to light regarding Tesla’s Autopilot, which currently allows drivers to take their hands off the wheel.
So what exactly prompted this? In May 2016, a Tesla Model S crashed while Autopilot was engaged, resulting in the driver’s death, the first fatality of its kind. Afterward, the National Transportation Safety Board found that drivers could, for the most part, keep their hands off the wheel for extended periods despite repeated warnings from the vehicle.
In any case, the crash also underscored a particularly vexing issue for automakers. Unless a car is capable of driving itself safely in every situation, drivers will, at least for now, need to stay alert. This means drivers operating a self-driving vehicle must be prepared to take control regardless of the autopilot system or autonomous program in use.
The National Transportation Safety Board, or NTSB, the federal agency tasked with investigating major transportation accidents, said that during a 37-minute segment of the 41-minute Tesla trip, the driver kept his hands on the wheel for only 25 seconds, apparently in increments of roughly one to three seconds, all in spite of receiving visual and audio warnings.
Automaker Response to Autonomous Vehicle Safety
General Motors developed a new driver-assist system called Super Cruise that was initially planned for release at the end of last year; the technology is now slated for this fall. Barry Walkup, the chief engineer of Super Cruise, said they included “a driver assist system that demanded driver supervision.”
The Super Cruise system uses a small camera aimed at the driver, working with infrared lights, to track where the driver is looking. If the system’s facial recognition software detects that the driver is not paying attention, it prompts the driver to refocus. Should the driver fail to react, the warnings escalate: visual indicators and audio alarms activate, along with a light bar on the steering wheel. If the driver still does not respond, the vehicle is brought to a controlled stop.
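The escalation ladder described above can be sketched as a simple mapping from continuous inattention time to a response stage. This is a hypothetical illustration only; the stage names and time thresholds are assumptions for clarity, not GM’s actual Super Cruise logic.

```python
# Hypothetical driver-attention escalation ladder, loosely modeled on the
# behavior described above. All thresholds are illustrative assumptions.

# Stages in order of increasing severity.
STAGES = ["attentive", "prompt", "alerts_and_light_bar", "controlled_stop"]

def escalate(seconds_inattentive):
    """Map continuous inattention time (seconds) to an escalation stage."""
    if seconds_inattentive < 4:       # brief glances away are tolerated
        return "attentive"
    elif seconds_inattentive < 8:     # first nudge: prompt driver to refocus
        return "prompt"
    elif seconds_inattentive < 15:    # visual markers, audio alarms, light bar
        return "alerts_and_light_bar"
    else:                             # driver unresponsive: stop the car
        return "controlled_stop"

print(escalate(2))   # attentive
print(escalate(10))  # alerts_and_light_bar
print(escalate(20))  # controlled_stop
```

The key design point is that each stage only fires after the previous one has failed to regain the driver’s attention, with the controlled stop as the last resort.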
Volkswagen AG’s Audi unit also has a system that handles steering and braking at speeds of up to 40 mph. It requires the driver to check in with the steering wheel at regular intervals. Audi said the system will beep warnings at the driver, and if the driver does not react, the car will slowly come to a stop.
How are Government Agencies and Regulators Responding?
The National Highway Traffic Safety Administration, the lead agency for regulating cars, does not test or pre-approve self-driving systems before automakers introduce them. Instead, the agency responds to complaints or crashes, investigating whether a potential defect created a hazard. The May 2016 Tesla accident has raised concerns about the regulation of self-driving cars.
The NTSB, in turn, issues reports on its findings and may make recommendations to the NHTSA.