Microsoft Reportedly Blocks 1.6 Million Bot Signup Attempts Per Hour Amid Global AI-Driven Fraud Surge

April 16, 2025

From April 2024 to April 2025, Microsoft blocked an estimated 1.6 million bot signup attempts per hour and disrupted approximately $4 billion in attempted fraud tied to AI-amplified scams. Microsoft's Cyber Signals report details the expanding use of generative AI to create convincing fake e-commerce sites, job offers, automated customer service bots, and phishing decoys. Fraudsters are leveraging AI to orchestrate large-scale deceptive campaigns featuring fake reviews, deepfakes, and counterfeit brand domains at an unprecedented pace.

Learn more about the role of Project Cerebellum in promoting responsible AI governance and harm prevention by joining us. This incident highlights the importance of implementing guardrails for safe and secure AI practices within the HISPI Project Cerebellum TAIM framework.

Matched TAIM controls

Suggested mapping from embedding similarity (not a formal assessment).
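
As an illustration only, the short Python sketch below shows one way such an embedding-similarity suggestion could be computed: the incident summary and each control description are embedded with a sentence encoder and ranked by cosine similarity. The model name, control IDs, and control wording are assumptions made for the example; they do not reflect the actual Project Cerebellum mapping pipeline or TAIM control text.

# Illustrative sketch (assumption, not the actual Project Cerebellum pipeline):
# score each TAIM control description against the incident summary by cosine
# similarity of sentence embeddings and list the highest-scoring controls.
from sentence_transformers import SentenceTransformer

incident_summary = (
    "Microsoft blocked an estimated 1.6 million bot signup attempts per hour and "
    "disrupted roughly $4 billion in AI-amplified fraud, including fake e-commerce "
    "sites, job scams, deepfakes, and phishing decoys."
)

# Placeholder control IDs and wording; the real TAIM control text would be
# loaded from the framework itself.
taim_controls = {
    "TAIM-EX-1": "Detect and block automated or synthetic account creation at scale.",
    "TAIM-EX-2": "Monitor for AI-generated impersonation of brands, domains, and personnel.",
    "TAIM-EX-3": "Maintain incident response procedures for AI-enabled fraud campaigns.",
}

model = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence-embedding model works here
incident_vec = model.encode(incident_summary, normalize_embeddings=True)
control_vecs = model.encode(list(taim_controls.values()), normalize_embeddings=True)

# With normalized vectors, cosine similarity reduces to a dot product.
scores = control_vecs @ incident_vec
ranked = sorted(zip(taim_controls, scores), key=lambda pair: pair[1], reverse=True)
for control_id, score in ranked:
    print(f"{control_id}: similarity={score:.2f}")

In practice, a similarity threshold or a human review step would decide which suggested controls are surfaced, which is why the mapping above is flagged as not a formal assessment.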

Alleged deployer
Cyber criminal networks, unknown scammers
Alleged developer
Various generative AI developers, various deepfake technology developers, various voice cloning technology developers
Alleged harmed parties
General public, consumers, enterprises, Microsoft, financial institutions

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/1037

When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.

A pre-print is available on arXiv, and the AIID provides database snapshots and a citation guide.

We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.