When we talk about platform liability, we mean the legal responsibility online services hold for content posted by users. Also known as online intermediary liability, it determines whether sites like Facebook, Instagram, or OnlyFans can be held accountable for removing sex workers’ posts, or even for shutting down their accounts without warning. This isn’t just a tech policy issue. It’s a safety issue. When platforms suddenly ban sex workers without notice, those workers don’t just lose income; they lose their only way to screen clients, share safety tips, and stay connected to support networks.
Sex work advertising, the act of promoting escort, cam, or adult services online, is often treated like illegal activity, even when no law is broken. Platforms claim they’re following ‘community standards,’ but those standards aren’t clear, consistent, or fair. A post about a massage service might get flagged as ‘adult content’ while a fitness influencer promoting bodywork gets no pushback. This double standard pushes workers offline, into riskier situations where they can’t vet clients, share location check-ins, or use safety apps. Online platform policies, the hidden rules that govern what content stays up and what gets deleted, are written by companies with no experience in sex work, and they rarely consult the people most affected.
And it’s not just about bans. Digital censorship, the suppression of information by tech companies or governments, also blocks access to vital resources. Workers can’t post about emergency contacts, safe locations, or legal aid. Banks freeze accounts linked to sex work platforms. Payment processors cut off services. Even search engines de-index pages that offer harm reduction advice. All of this is enabled by weak or nonexistent sex worker rights, the legal and human rights protections that should apply to people selling sexual services. The UN and WHO agree: criminalizing or censoring sex work increases violence and disease. Yet platforms act as if they’re doing the right thing by silencing workers.
What you’ll find in these posts isn’t theory; it’s real experience. From how to spot when a platform is about to shut you down to how workers in Australia and the U.S. are fighting back with legal challenges and public campaigns, this collection shows the human cost behind every algorithm change. You’ll read about safety tools that still work when ads are gone, how contracts protect touring escorts from liability, and why some sex workers now use coded language just to stay online. These aren’t just stories; they’re survival guides written by people who’ve lost everything because a platform decided their work didn’t belong there.
CDA 230, Section 230 of the U.S. Communications Decency Act, shields online platforms from legal liability for content their users post, including speech related to sex work. That protection is what allows sex workers to operate online safely, but fear and bad policy, most notably the 2018 FOSTA-SESTA carve-out, are eroding it.