Crisis Text Line, a nonprofit that uses text messaging to help people struggling with suicide and other mental health issues, has been sharing data collected during those text conversations with a for-profit spinoff. The spinoff, Loris.ai, uses that data to create customer service software and has a revenue-sharing agreement with Crisis Text Line, according to a new POLITICO investigation.
Crisis Text Line told POLITICO that all of the data it shares with Loris.ai is anonymized and that the organization describes data usage in its terms of service. “Crisis Text Line obtains informed consent from each of its texters,” the organization's general counsel told POLITICO. “The organization’s data sharing practices are clearly stated in the Terms of Service & Privacy Policy to which all texters consent in order to be paired with a volunteer crisis counselor.”
But privacy experts say anonymized data can sometimes be reverse-engineered, and people rarely read terms of service — perhaps even less so when they're reaching out for help in the midst of a mental health crisis. “The fact that the data is transferred to a for-profit company makes this much more troubling and could give the FTC an angle for asserting jurisdiction,” Jessica Rich, former director of the Federal Trade Commission’s Bureau of Consumer Protection, told POLITICO.
A former volunteer for Crisis Text Line has been leading an advocacy campaign calling for changes to the organization's data practices, including through a Change.org petition.
If you or a loved one needs help:
Call the National Suicide Prevention Lifeline at 800-273-8255 to reach a counselor at a locally operated crisis center, free of charge, 24 hours a day. Here is their privacy policy.