
Cybersecurity and Autonomous Cars

All is relatively well here at Woesnotgone Meadow, where everyone has above average bandwidth.

The people in the Meadow certainly love their cars. They drive them to work, to church and temple, to the grocery store, to the movies, and most everywhere else. Part of what draws people to their vehicles is simply the enjoyment of driving down the road. A newer, forward-looking trend is the autonomous vehicle. Eventually, these will drive the owner or user to the salon, to the grocery store, or to wherever else the user chooses, all without a care. The passengers won't have to drive or even watch the road, as the vehicle will manage the whole process.

We are not there yet. The companies developing these new vehicles and systems are experiencing issues. These growing pains surface as the companies work through the learning curve, which is full of risk for the business, the passengers, and the people near the vehicle. One particular player in this market has been Uber. Earlier this year, an Uber test vehicle was involved in a deadly crash in Arizona. The event was, obviously, terrible.

Unfortunately for the victim’s family, this may have been avoidable. Endeavors of this scale have large teams in place to manage the different aspects of the development process, with the goal of completing the project and getting these vehicles on the road. Each member of the team has a moral and ethical responsibility to report the truth regarding the work, the workflow, and the real risks to the company, the shareholders, and the public. One position on the development team is the operations manager. At Uber, an operations manager at the time, who no longer works there, emailed a company executive detailing the autonomous car program's issues. The warning apparently was not heeded: five days later, a woman crossing the street in Tempe, AZ was hit and killed. This was not the first incident with an Uber vehicle, as earlier one of the cars was in an accident and tipped onto its side.

The email was not sent to a random executive, but to the group's top executive, warning of the risks and dangers. The operations manager also warned that the human backup drivers had not received proper training and that, when they made errors, they were not terminated. Sadly, this was ignored.

Thanks for visiting Woesnotgone Meadow, where the encryption is strong, and the O/Ss are always using the latest version.

Resources

Efrati, A. (2018, December 10). How an Uber whistleblower tried to stop a self-driving car disaster. Retrieved from https://www.theinformation.com/articles/how-an-uber-whistleblower-tried-to-stop-self-driving-car-disaster

Stones, J. (2018, December 13). Uber aware cars were "routinely in accidents" just days before fatal accident. Retrieved from https://www.alphr.com/business/1010340/uber-aware-cards-in-accidents-days-before-fatal-accident

About the Author - Charles Parker, II has been working in the info sec field for over a decade, performing pen tests and vulnerability assessments, consulting with small- to medium-sized businesses to mitigate and remediate their issues, and preparing IT and info sec policies and procedures. Mr. Parker’s background includes work in the banking, medical, automotive, and staffing industries.
