
Sources: former TikTok content moderators working for a third-party company say they were trained using CSAM stored in a shared document that was widely accessible internally.

A largely unsecured cache of pictures of children being sexually exploited has been made available to third-party TikTok content moderators …