A wave of controversy has rippled through the indie game community this July after thousands of adult and NSFW games were delisted or hidden from view on the major platforms Steam and itch.io. The sweeping crackdown followed pressure from payment processors including Visa, Mastercard, Stripe, PayPal, and Payoneer, prompted by a campaign spearheaded by the Australian advocacy group Collective Shout.
While the campaign focused on extreme content like No Mercy, a now-removed title described as simulating sexual violence, its broader impact has raised significant questions among developers and audiences alike.
According to reports from Polygon and The Guardian, the policy updates issued by Steam have left many developers scrambling, as even narrative-rich or non-explicit games dealing with trauma, mental health, or LGBTQ+ themes were caught in the purge.
Meanwhile, itch.io initiated a less direct but equally sweeping change: a backend overhaul that de-indexed all content tagged "Not Safe for Work" (NSFW) from search and homepage discovery, effectively making adult games invisible to new users. While the platform has not issued a formal ban, developers are calling the move a shadow removal.
Critics, including game creators and digital rights advocates, say the response was overly broad, poorly communicated, and dangerously opaque. Games with artistic or therapeutic aims, often made by marginalized voices, have also vanished from storefronts without explanation, raising concerns about financial censorship.
As noted by PC Gamer and Inverse, creators are now rallying on forums like Reddit and Bluesky, circulating petitions, and exploring decentralized platforms and alternative payment systems.
While itch.io is reportedly reassessing its payment infrastructure and user tagging policies, the episode marks a watershed moment. It exposes how financial intermediaries, not governments or communities, are increasingly shaping the boundaries of digital expression, leaving many to wonder: where does moderation end, and censorship begin?