Instagram users have raised the alarm over a wave of account bans and suspensions, with many claiming they’ve been unfairly targeted despite not violating any of Meta’s policies. While Meta has yet to issue an official statement, users on Reddit, X, and other platforms are voicing their frustration, suggesting AI-powered moderation could be behind the issue.
Dozens of users say they’ve submitted appeals and ID verification, only to be met with silence. Many say they feel helpless with no direct way to contact Meta support. “I’ve tried everything—appeals, ID uploads—but no one’s responding,” wrote Reddit user u/Dyrovicious. Meta’s paid Verified subscription does offer priority support, but it’s out of reach for many of those affected.
The Instagram subreddit has been dominated by posts about these bans for weeks, while Instagram’s official X account is flooded with user complaints. A Change.org petition demanding action has already gathered over 4,000 signatures. Some are even considering filing a class-action lawsuit against Meta.
Though false positives are common with automated moderation systems, users argue that the scale of the current bans is unprecedented. Pinterest experienced a similar problem earlier this year and eventually attributed it to an internal error, though it denied AI was the cause.
For many, this isn’t just a social inconvenience but a financial blow. Small business owners and creators rely on Instagram for income, and sudden bans threaten their livelihoods. “This is my full-time job,” said Reddit user u/Paigejust. Another, u/CourtShaw, added, “I’ve built my brand and business here. It’s devastating.”
Some users have even been wrongly accused of serious violations such as child sexual exploitation (CSE), which could cause irreversible personal and professional harm. Despite the growing outcry, Meta remains silent on the issue.