How big tech and AI are putting trafficking survivors at risk
The tech industry’s privileging of ‘safety over privacy’ could get the most vulnerable killed
It’s very surreal to have my piece criticizing the tech industry’s use of AI and human trafficking data come out while I’m working to build an app for trafficking survivors, IPV/domestic violence survivors, and sex workers.
I’m so grateful to Open Democracy for working with me on this, because most American editors won’t let me write about human trafficking issues — they won’t even take op-eds I’ve pitched, saying I’d be too biased. As though there is a “both sides” argument in favor of exploitation? Umm, ok, go on and say the quiet part out loud, I guess.
Saying that survivors of human trafficking can’t report on trafficking issues with an informed lens — not even on elements of human trafficking they didn’t personally live through — reflects journalism’s strange death grip on the idea that cis/het white men are an inherently neutral lens on the world. White men are not neutral. We’ve seen this when Black journalists were barred from covering BLM protests; when women journalists who have come forward as survivors of sexual assault are told they can no longer cover any story with an element of sexual violence; and when stories about trans people, healthcare, and legislation are disproportionately assigned to cisgender reporters who are ill-informed on the issues and their history.
Follow this logic to its end and you’d conclude that editors want ChatGPT and an army of bots writing every news story about people, while human reporters are assigned exclusively to stories about animals, plants, and robots.
If I sound bitter about the publishing industry, it’s because I am. But I’m also immensely grateful for the work that Open Democracy is doing in challenging these violent status quos of our increasingly plutocratic world.