Self-driving vehicles are already on the nation's roads, and many more are coming. Uber, Google, Tesla, and the major automobile manufacturers plan to develop and deploy tens of thousands of such vehicles over the next decade.
A key item on the government's just-released 15-point checklist to guide the development of autonomous vehicles is cybersecurity: guarding against the risk of vehicles being hacked. But underlying all that functionality are the people who write the code that makes these vehicles work. Autonomous vehicles require exquisite software. To make it secure, industry and government should consider educational standards and licensure requirements for the people who design the smarts that go into these vehicles.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2062366,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,security,","session":"A"}']As we’ve seen over and over, less-smart, human-driven cars can and will be hacked. In 2010, for example, a 20-year-old disgruntled employee in Austin, Texas, tampered with data on a Web-based system used to remotely disable repossessed vehicles, leaving more than 100 drivers unable to start their cars or stop the horns from blaring. That same year, researchers at the University of Washington exposed numerous flaws in onboard networks when they succeeded in embedding malicious code that disabled a vehicle’s braking system. And two former DARPA-funded researchers – now legends at the famous Blackhat hackers conference – have publically demonstrated how they could remotely control a modern vehicle’s acceleration, steering, brakes, windshield wipers, horn and even the tightness of the seat belts.
If today's human-driven cars are vulnerable to cyber threats, the coming fleet of autonomous vehicles could be even more so. One reason is that the code controlling them is so complex. An autonomous vehicle's software can run to 100 million lines of code, controlling dozens of onboard computers and networks. All of it is needed to brake, steer, accelerate, heat, cool, and connect the vehicle and protect its occupants. These computers and their software are the brains that enable these cars to go down the road.
And that critical software is a human creation, written by imperfect human beings. The government's guidelines recognize this by including consideration of remote software updates. To be proactive and achieve the cybersecurity needed, the focus should be on the people who write the code that makes these vehicles work.
If there is one maxim for designing secure software, it is this: Security must be "baked in." That means developers have to anticipate and understand potential vulnerabilities and threats from the start. Writing software that is both functional and secure is a skill that requires education, training, and experience. And therein lies the rub. Today, you don't need a license to write software code for cars. You don't even need a college degree.
A century ago, anyone could call themselves an engineer without proving they were competent. There were many notable catastrophes, including dams and bridges that collapsed. As a result, the concept of a professional engineering license for those who designed bridges and roadways was conceived and implemented. States began to require that certain engineers be licensed, and a measure of competency was born. Today, the person with the authority to take legal responsibility for an engineering project, like designing a bridge or a roadway, has to be licensed.
Similarly, certain medical professionals like nurses and physicians must have a license to treat people. Of course, operators of motor vehicles must be licensed before they get behind the wheel. But right now, you don’t need a license to write software code that operates vehicles.
A professional engineering exam for software engineering was created in 2013, but so far the idea hasn't gained traction with federal or state policymakers. The Department of Defense and the National Institute of Standards and Technology have processes that require lengthy certification and accreditation of software that is to reside on government networks. These aren't mere guidelines but detailed rules.
Creating similar processes to promote minimum standards on secure code and cybersecurity could help secure the development of autonomous vehicle software. We may have some of the most brilliant experts in artificial intelligence and control systems creating the software code for autonomous vehicles. A certification process could prove it.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2062366,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,security,","session":"A"}']
Isaac Porche is a senior engineer at the nonprofit, nonpartisan RAND Corporation and associate director of the RAND Arroyo Center’s Forces and Logistics Program.