Uber's self-driving car crash settlement doesn't make us safer

Brian Beckcom

04/05/2018

Uber has agreed to an undisclosed settlement with the family of the woman it killed with an autonomous car. The facts of the case (as we know them) are relatively simple. A 49-year-old woman was walking her bike across a street around 10 PM. The street she had been on dead-ended into the street on which the self-driving Uber was traveling. Video shows that, from the Uber’s perspective, she was not visible to the camera until the last second.

No company is above the law

This secret settlement sets a dangerous precedent that self-driving car companies are above the law. They aren't. Victims and families need to hold these companies accountable in a court of law. Self-driving car crashes aren't like regular car crashes, and these companies aren't insurance companies. Insurance companies set the value of a person's claim based on a century of evaluating the causes of car wrecks, and they factor lawsuit settlements and verdicts into that decision-making. Uber, Waymo, and other companies are counting on the public to assume they are putting in the same work as an insurance company when they offer self-driving car crash victims a settlement.

The bottom line is that these companies are most likely lowballing these victims. They are trying to avoid having a jury examine every technical and safety failure that caused the crash. They want to avoid having their failures made public.

96% of car crashes are caused by human error, and self-driving cars are safer than cars driven by people. But self-driving cars aren't perfect yet. There are safety issues that must be addressed. And the public, the government, and victims of self-driving car crashes cannot hold these companies accountable if they keep their safety failures secret.

Self-driving car industry must address safety issues

Uber and other self-driving car companies acknowledge that their cars aren't fully autonomous and that their systems have glaring safety oversights. This fatality shines a spotlight on those oversights.

“If its sensors don’t detect anything, the vehicle won’t react at all,” says Bart Selman, a Cornell University computer science professor and director of the Intelligent Information Systems Institute, in an article for Scientific American. The article discusses how these cars have major blind spots when it comes to spotting pedestrians, cyclists, or stopped objects.

There are many different types of sensors that an automated car can use (ultrasonic, radar, image, LIDAR, and cloud), but the image sensor is the most important. The cameras record and interpret information about the objects around the car, including the color of traffic lights and the lane markings on the road. However, one article reports that only 95% of pedestrians are recognized as pedestrians. The 5% the system fails to recognize are still people who have the right to not have their lives endangered by an unsafe vehicle. This information is exactly what a jury needs to be examining. When companies make safety decisions, they make them based on their profit margin. When juries make safety decisions, they make them based on keeping people safe.

Uber’s secret settlement endangers America

Since the government has not yet issued regulations for self-driving cars, Waymo, Uber, and other companies test these cars on public roads despite multiple collisions due to technological failures. The government and the public deserve to know why these companies haven’t fixed the issues that have caused multiple collisions and injuries. That is exactly the type of information that would be revealed in a lawsuit. But because of this secret settlement, this information will remain hidden from the public.

By settling the issue out of court and not allowing the justice system to evaluate where the autonomous driving industry fails to prioritize safety, we are all left in the dark. Cutting-edge technology may promise a better tomorrow, but if it has fatal flaws today, the public needs to know. The public also needs to know how these companies are going to make their cars safer. There's an adage that safety regulations are written in blood. However, the government can't regulate what it doesn't know. The American people have a right to know the truth about self-driving car technology.

Self-driving car crash victims deserve to have their rights protected

In normal car crashes, blame can usually be placed on a person. On human error. And we file injury claims with the responsible party's insurance company. But filing a claim for a self-driving car wreck can be more difficult. You will need experts to determine whether a technological failure caused the wreck and, if so, which company is responsible for that failure; experts to reconstruct the wreck and determine fault; and a lawyer to help you determine which laws apply to your case. Regulation of self-driving cars is currently left to local and state governments; there are no federal safety standards yet. The sooner you consult with an experienced trial attorney, the sooner you can get the compensation you are owed. An experienced attorney will also help you expose the responsible companies' technological or mechanical failures so that the self-driving car industry is forced to make its cars safer.

If you have been injured in a self-driving car wreck, or a loved one has been hurt or killed, consult with VB Attorneys. At our firm, the truth matters. Our Board Certified personal injury trial law attorneys will review your case, explain which laws apply to it, and give you their expert opinion so you can make an informed decision about your future. The longer you wait, the more time the companies have to hide or destroy crucial evidence in your case. Call us today at 877-724-7800 or fill out a contact form on our website.