Bing Chat Tentatively Hallucinated in Extended Conversations with Users
February 14, 2023
Early testers reported that, in extended conversations, Bing Chat tended to make up facts and emulate emotions through an unintended persona.
- Alleged deployer: Microsoft
- Alleged developer: OpenAI, Microsoft
- Alleged harmed parties: Microsoft
Source
Incident data is from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/477
When citing the database as a whole, please use:
McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.
A pre-print of this paper is available on arXiv. We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.