Crisis Text Line, a nonprofit that uses text messaging to help people struggling with suicide and other mental health issues, has been sharing data collected during those text conversations with a for-profit spinoff. The spinoff, Loris.ai, uses that data to create customer service software and has a revenue-sharing agreement with Crisis Text Line, according to a new POLITICO investigation.
But privacy experts say anonymized data can sometimes be reverse-engineered, and people rarely read terms of service — perhaps even less so when they're reaching out for help in the midst of a mental health crisis. “The fact that the data is transferred to a for-profit company makes this much more troubling and could give the FTC an angle for asserting jurisdiction,” Jessica Rich, former director of the Federal Trade Commission’s Bureau of Consumer Protection, told POLITICO.
A former volunteer for Crisis Text Line has been leading an advocacy campaign calling for changes to the organization's data practices, including through a Change.org petition.
If you or a loved one needs help: