Trigger warning: This article discusses themes of suicide and mental illness.
Crisis Text Line is a globally recognized mental health support service that uses text messaging to help people through crises such as self-harm and suicidal ideation. Since launching in 2013, Crisis Text Line has exchanged 6.7 million conversations over text, Facebook Messenger and WhatsApp, and the organization has expanded from the United States to Canada, the U.K. and Ireland. Unaffiliated with the National Suicide Prevention Lifeline, Crisis Text Line has recently faced scrutiny for sharing user data with a for-profit company.
Loris.ai, the organization’s for-profit branch, has been using information from the nonprofit to produce and sell customer service software. “Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly ‘anonymized,’ stripped of any details that could be used to identify people who contacted the helpline in distress,” Politico stated. Despite these anonymization assurances, media outlets have questioned the ethics of a nonprofit using private crisis conversations to fuel a company’s growth.
Ethics and privacy experts have raised several concerns about this revelation. First, studies of other datasets have demonstrated that records can often be traced back to individuals despite anonymization efforts. Second, it is questionable whether people seeking help are meaningfully consenting to having their data shared, anonymized or not. Although the helpline provides a link to a 50-paragraph disclosure when individuals first reach out, experts have wondered whether people in immense mental turmoil are in the headspace to give full consent.
Responding to criticism, both organizations have emphasized that their goal is to improve support worldwide. Loris.ai said its aim was to make “customer support more human, empathetic and scalable.” On Feb. 1, several days after Politico published a widely shared article critiquing the data sharing, The Verge reported that Crisis Text Line had stopped sharing conversation data with Loris.ai.
“We hear you. Crisis Text Line has had an open and public relationship with Loris AI. We understand that you don’t want Crisis Text Line to share any data with Loris, even though the data is handled securely, anonymized and scrubbed of personally identifiable information,” Crisis Text Line explained. The organization added that Loris.ai will delete any data it has acquired from Crisis Text Line.
Sources: Crisis Text Line, Politico, The Verge