Behind the Screen: Content Moderation in the Shadows of Social Media
Social media can be a nightmarish place. The primary shield against the hateful language, violent videos, and online cruelty uploaded by users is not an algorithm. It is people. Mostly invisible by design, more than 100,000 commercial content moderators evaluate posts on mainstream social media platforms: enforcing internal policies, training artificial intelligence systems, and actively screening and removing offensive material—sometimes thousands of items per day.