Tesla Employee's Self-Driving Mode Test Reveals Importance of Safe and Secure AI - Tesla Collision Incident
February 4, 2022
A former Tesla employee, testing the Full Self-Driving (FSD) Beta on public roads, demonstrated its navigation capabilities across San Jose. During the drive, however, the vehicle, operating in Autopilot mode, collided with street pylons, an incident that led to his dismissal from the company. This AI incident maps to the Govern function in the HISPI Project Cerebellum Trusted AI Model (TAIM).
- Alleged deployer
- AI Addict
- Alleged developer
- Tesla
- Alleged harmed parties
- John Bernal, San Jose public
Source
Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/187
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.