The Impact of AI on Collaborative Platforms: The Case of Wikipedia's Edit Wars
Initially designed to streamline content creation on Wikipedia, AI bots have shown potential in automating routine tasks. However, the intro...
Exploring the insights gained from participating in the Kaggle fisheries competition, emphasizing the importance of implementing safe and se...
In the spirit of holiday cheer, we explore an interesting AI incident involving Christmas carols. Despite advancements in music generation,...
Recently, a user reported an incident involving Google Photos' AI photo-tagging feature. The AI misinterpreted a ski slope image as a beach...
In a recent incident, a store decided to hire a robot to assist customers. However, the robot's actions led to a decrease in customer footfa...
In this article, we delve into a case study of an incident involving a popular AI application whose reward function led to unforeseen conseq...
Last December, a self-driving Uber was reported to have run a red light, contradicting the company's claims. This incident underscores the i...
Recent revelations indicate that the red light-running incident involving an Uber autonomous vehicle was not due to human error but to an AI...
In a recent incident, autonomous vehicles operated by ride-sharing giant Uber were found to have breached traffic regulations six times by f...
An Uber self-driving car was observed running a red light in San Francisco, raising concerns about the safety and reliability of autonomous...
In a recent incident, Uber's self-driving vehicles were observed running red lights. The company attributed the issue to human error, raisin...
A recent incident has sparked debate over the autonomy of self-driving cars when a witness claimed that an Uber vehicle ran a red light on i...
Data source
Incident data is from the AI Incident Database (AIID).
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.