Crisis Text Line stops sharing conversation data with AI company

Crisis Text Line has decided to stop sharing conversation data with spun-off AI firm Loris.ai following scrutiny from data privacy experts. “Over the last few days, we have been listening closely to the concerns of our community,” the hotline service wrote in a statement on its website. “We hear you. Crisis Text Line has had an open and public relationship with Loris AI. We understand that you do not want Crisis Text Line to share any data with Loris, even though the data is handled securely, anonymized and scrubbed of personally identifiable information.” Loris.ai will also delete all data it has received from Crisis Text Line.

Politico recently reported on how Crisis Text Line (which is not affiliated with the National Suicide Prevention Lifeline) shares conversation data with Loris.ai, which builds AI systems designed to make customer service staff more empathetic. Crisis Text Line is a non-profit that operates a text line for conversations about mental health, but it is also a shareholder in Loris.ai and, according to Politico, at one point shared a CEO with the company.

Before talking to volunteer counselors, hotline users consent to data collection and can review the organization’s data-sharing policies. Politico quoted a volunteer who claimed that people contacting the line “have an expectation that the conversation is only between the two people speaking” and said he was fired in August after raising concerns over CTL’s handling of data. The same volunteer has since started a petition urging CTL “to reform its data ethics.”

Politico noted how Crisis Text Line describes the central role that data and AI play in its work:

“Data science and artificial intelligence are at the heart of the organization – ensuring, it says, that those in the most dire situations wait no more than 30 seconds before messaging with one of its thousands of volunteer counselors. It says it’s combing the data it collects for insights that can help identify the most pressing issues or de-escalate people’s problems, in much the same way Amazon, Facebook and Google mine trends from likes and searches.”

Following the report, Crisis Text Line published a statement on its website and via a Twitter thread. In the statement, Crisis Text Line said it does not “sell or share personally identifiable data with any organization or company.” It went on to argue that “[t]he only for-profit partner with whom we have shared fully scrubbed and anonymized data is Loris.ai. We founded Loris.ai to leverage the experience of operating our service to make customer support more humane and empathetic. Loris.ai is a for-profit company that helps other for-profit companies apply de-escalation techniques in some of their most notoriously stressful and painful moments between customer service representatives and customers.”

In its defense, Crisis Text Line said: “Our data scrubbing process has been validated by independent privacy watchdogs such as the Electronic Privacy Information Center, who called Crisis Text Line ‘a model steward of personal data.’” Now the Electronic Privacy Information Center (EPIC) has responded with its own statement, saying the quote was used outside of its original context:

“Our statements in that letter were based on a discussion with CTL about their data anonymization and scrubbing policies for academic research data sharing, not a technical review of their data practices. Our review was not related to, and we did not discuss with CTL, the commercial data transfer arrangement between CTL and Loris.ai. If we had, we could have raised the ethical concerns with commercial use of intimate messaging data directly with the organization and their advisors. But we were not, and the reference to our letter now, out of context, is incorrect.”

Loris.ai’s website claims that “protecting personal data is at the core of everything we do” and that “we derive our insights from anonymized, aggregated data that has been scrubbed of personally identifiable information (PII).” That isn’t enough for EPIC, which argues that Loris and CTL seek to “extract commercial value from the most sensitive, intimate and vulnerable moments in the lives [of] the individuals seeking mental health care and the hardworking volunteer respondents… No data scrubbing technique or statement in any Terms of Service can resolve the ethical breach.”

Update, 10:15PM ET: This story has been updated to reflect Crisis Text Line’s decision to stop sharing data with Loris.ai.
