Privacy is Safety: Amazon's Tech Against Trafficking Summit
[ contains sensitive content about human trafficking ]
Holding a plate of untouched endive hors d'oeuvres, I gazed out the 16th floor window of Amazon HQ at the Tech Against Trafficking Summit. From the skyscraper’s towering view, I could see the street down below where I slept in warehouse doorways as a homeless teenager. A few blocks down I could see the Westin hotel where I was raped at age 14 by a group of investment bankers visiting from New York. I tried to be optimistic when I registered to attend the conference as an audience member and child trafficking survivor. Instead, my fears about tech moving fast and breaking things were confirmed. Only two trafficking survivors were invited to speak as panelists, out of approximately forty non-survivors including government leaders and bureaucrats, law enforcement, international NGO directors, tech executives, and project managers from Amazon, Facebook (Meta), Instagram, Google, and Microsoft.
During a Q&A, I asked what role unions and workers’ rights had in discussions about forced labor. The panelists from Amazon and Google wouldn’t look at me. A representative from the French Ministry of Labor explained that forced labor is illegal.
Panelists repeatedly emphasized prioritizing safety over privacy. Amazon and Ring presenters celebrated giving police video footage from neighborhoods without warrants. No panelist mentioned concerns about abuse of such surveillance. Another presenter lauded the ability to view and freeze bank accounts.
No panelists expressed concern that most anti-trafficking tech tools are built for police end users; how many law enforcement officers and border patrol agents are domestic violence abusers or human traffickers; or how tools to bypass warrants risk disproportionately targeting innocent BIPOC and LGBTQ citizens. No one discussed how our punitive systems make people more vulnerable to predatory human traffickers, as my sister endured.
After Jeffrey Epstein and Ghislaine Maxwell, I naively hoped anti-trafficking leaders would become more savvy. As a survivor whose trafficker is protected by law enforcement impunity, I would like tools like whistleblower apps for victims with abusers in positions of governmental power.
At the conference, one app for auditing forced labor supply chains looked promising, as did another for assessing risk factors if offered a job in another country. But both appear to only be available in Asian countries for now.
At its core, trafficking is the exploitation of vulnerable people. Yet our funding to “fight” human trafficking almost exclusively invests in traffickers and their punishment. The math is all wrong.
After one survivor panelist cautioned against the dangers of haphazardly collecting any and all data, the CEO of one of the world’s largest anti-trafficking NGOs glibly said “shit happens”. I told him that privacy is safety. Since testifying against my trafficker, I have to pay for a monthly service, a bot, to scrape my information from the internet because my trafficker has threatened to kill me. I asked if the “shit” he was referring to was death.
He responded that his NGO prioritizes survivors’ safety, and that while I might need to pay for data privacy in the US, there are likely more protections in his country.
I believe he is well-intentioned. I believe most people are well-intentioned. However, most tech companies are based in the US, and similar technology has been maliciously used worldwide: Cambridge Analytica’s influence in elections, Facebook’s role in genocides in Myanmar and Ethiopia, LexisNexis selling data to ICE. There is also the risk of data breaches, as seen at banks, at Experian, and in Uber’s recent hack in spite of (or because of) 2FA security. Tech giants seem to care little about safe and accurate statistical models, as illustrated when Google fired renowned AI researcher Timnit Gebru, PhD, after she raised concerns about bias and racism in large datasets.
As the aunt of teenagers, I felt hopeful at the conference when Facebook explained how Instagram algorithms flag potential pedophiles. And I also thought of the time their CEO, Mark Zuckerberg, said: “I don't know why. They ‘trust me’. Dumb fucks.”
I would have liked to see financial tools at the conference for survivors whose identities have been stolen by traffickers. I would also like tech tools to support people with disabilities because every survivor I know lives with chronic illness, autoimmune diseases, or other disabilities. For many human trafficking survivors, the first escape is not the hardest part.
Only the survivor panelists highlighted our need for education, healthcare, housing, and jobs, and for positions of leadership instead of low-paid consulting gigs.
I understand the urgency to stop human trafficking. But if our datasets are rushed and biased, and if conferences don’t invite survivors from all backgrounds, our AI and Machine Learning training data will be rife with bugs, harming the very people anti-trafficking leaders purport to protect.
Tools built with trafficking survivors are exponentially smarter and more effective than without us. We don’t need to be saved. We need to be listened to.