According to the U.S. Department of Transportation, approximately 10 percent of all car crashes are caused by distracted driving, and roughly one in six crashes is caused by either distracted or drowsy driving. These driver errors are responsible for thousands of deaths annually, including hundreds on North Carolina and South Carolina roads.
Unfortunately, until every vehicle on the road is fully autonomous, we probably aren't going to escape this problem entirely. Competition for driver attention is everywhere, from smartphones to kids in the back seat to increasingly interactive dashboards. There is much talk of strengthening anti-texting laws or ramping up enforcement, but the reality is that such laws are difficult to enforce widely and consistently. In a National Safety Council survey, 55 percent of Americans admitted to "occasionally" making a phone call while driving, and 32 percent said that if there were no law against it, they would probably text and drive. (In reality, many of them do regardless of the law.)
Still, there is hope that technology might help curb this serious problem after all, and perhaps sooner than anticipated. A new system has been introduced that uses embedded computer vision to determine when a driver is drowsy or distracted. An infrared camera follows the driver's eyes while computer-vision software analyzes the driver's state in real time.
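To illustrate one common eye-tracking approach (not necessarily the one this particular system uses), drowsiness detectors often compute an "eye aspect ratio" from facial landmark points: the ratio drops toward zero when the eye closes, and a sustained drop across video frames signals drowsiness. The landmark ordering, threshold, and frame count below are assumptions chosen for the sketch:

```python
import math

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmark points around one eye, ordered so that
    # points 1/5 and 2/4 are vertical pairs and 0/3 are the corners.
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

EAR_THRESHOLD = 0.2           # below this, the eye is likely closed (tunable)
CLOSED_FRAMES_FOR_ALERT = 48  # roughly 2 seconds at an assumed 24 fps

def drowsy(ear_history):
    """True if the eye has stayed 'closed' long enough to flag drowsiness."""
    recent = ear_history[-CLOSED_FRAMES_FOR_ALERT:]
    return (len(recent) == CLOSED_FRAMES_FOR_ALERT
            and all(e < EAR_THRESHOLD for e in recent))
```

A momentary blink produces only a few low-ratio frames, so requiring a sustained run of them is what separates ordinary blinking from nodding off.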
Makers of this technology say the camera and the software together detect lapses, such as distraction, that could cause a car accident and alert the driver to a possible danger. The system can also activate some of the vehicle's other safety systems. For instance, an alert could trigger:
- Adjustment of adaptive cruise control to increase distance from the vehicle ahead, reducing the chances of a rear-end collision;
- A warning to the driver to keep his or her eyes on the road (via an audible beep or a vibration of the steering wheel or seat);
- Automatic braking in some scenarios.
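The escalation described in the list above can be sketched as a simple dispatcher that takes stronger action the longer inattention persists. The thresholds, field names, and `VehicleActions` interface here are hypothetical; a production system would integrate with the automaker's proprietary driver-assistance stack:

```python
from dataclasses import dataclass

@dataclass
class VehicleActions:
    # Hypothetical vehicle interface; real systems use proprietary ADAS APIs.
    cruise_gap_increased: bool = False
    driver_warned: bool = False
    braking_requested: bool = False

def respond_to_inattention(actions, seconds_inattentive, obstacle_ahead):
    """Escalate countermeasures as inattention persists (illustrative thresholds)."""
    if seconds_inattentive >= 1.0:
        actions.cruise_gap_increased = True   # widen following distance
    if seconds_inattentive >= 2.0:
        actions.driver_warned = True          # beep, or seat/wheel vibration
    if seconds_inattentive >= 3.0 and obstacle_ahead:
        actions.braking_requested = True      # automatic braking in some scenarios
    return actions
```

Note that the mildest countermeasure (a wider following gap) fires first and the most intrusive (braking) fires last and only when a hazard is actually detected, which mirrors how these systems try to avoid nuisance interventions.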
The feature uses facial recognition technology to track where the eyes are, and where they should be. Of course, not all distraction results from having one's eyes off the road, but this could help. This kind of technology has several other potentially helpful applications as well. For example, it could recognize the driver's preferred settings and automatically adjust mirror, seat, and steering wheel positions for optimal control of the vehicle.
Additionally, the system could in the future be used to identify people who have certain driving restrictions. For example, the government could employ the technology to identify those with previous DUI convictions and trigger an ignition lock-out, or families could program the system to enforce limits on a newly licensed driver (e.g., speed caps or staying on an approved route). The feature may even be able to detect emotions and issue alerts in cases of apparent road rage.
However, hats, glasses, lighting, or other issues could be problematic for the sensors. There is also the possibility of false alarms, which drivers could come to view as an annoyance, prompting them to disable the system.
Manufacturers say it could also enable "gesture interfaces," allowing users to control mobile devices, lighting systems, and other features with a simple hand gesture. Still, that feature does seem a bit counterintuitive, given the system's core purpose.
Contact the Carolina injury lawyers at the Lee Law Offices by calling 800-887-1965.
Smart In-Vehicle Cameras Increase Driver and Passenger Safety, Embedded-Vision.com
More Blog Entries:
Auto Maker Cleared in Deadly Crash, Feb. 20, 2017, Charlotte Car Accident Lawyer Blog