Automated Vehicles, Moral Hazards & the "AV Problem"
William H. Widen*
Introduction
The automated vehicle (“AV”) industry faces the following ethical question: How do we know when AV technology is safe enough to deploy at scale?
I call the search for an answer to this question the “AV Problem.” It demands an answer now, more urgently than the other ethical issues for AV design raised by the famous “Trolley Problem” in ethics or by the results of MIT’s experimental philosophy poll about “Moral Machines.” We already face issues similar to the AV Problem on a smaller scale in the current testing of automated driving technology on our public highways, where high-profile fatalities involving automation technology have occurred. While stories about failures of vehicle automation technology make headlines, AV companies aim to deploy the more complex SAE Level 3, 4, and 5 automated driving systems as soon as late 2023. Indeed, Philip Koopman and I have argued elsewhere that Tesla already has deployed SAE Level 4 motor vehicles in violation of law by selling its Full Self-Driving (FSD) “beta” software.
This essay considers the AV Problem through the lens of two registration statements filed with the Securities and Exchange Commission (“SEC”): a registration statement on Form S-1 filed on November 5, 2021, by Aurora Innovation, Inc. (“Aurora”), a company that hopes to be a leader in systems for AVs, and a registration statement on Form S-4 filed on August 27, 2021, by Reinvent Technology Partners Y, Aurora’s predecessor (together, the “Registration Statements”). The Registration Statements contain a potentially significant material omission: they fail to disclose Aurora’s own standard for deciding when to deploy AVs at scale. Developing technology that satisfies a more stringent safety standard takes longer than developing technology that meets a lesser standard, yet Aurora must deploy AVs quickly to succeed financially. For this reason, Aurora’s deployment standard is material, and its omission is a potential violation of the securities laws.
References
* Professor, University of Miami School of Law, Coral Gables, Florida. The author is grateful for demonstrations of object and event detection and response (OEDR) technology at Georgia Institute of Technology’s computer science laboratory, and conversations with Philip Koopman, Peter Lederer, James Nickel, Philip Nickel, Deep Samal, and Marilyn Wolf. This essay is a gently updated version of a SSRN posting originally made in 2021.