The real harm of Crisis Text Line's data sharing


Another week, another privacy horror show: Crisis Text Line, a nonprofit text-messaging service for people experiencing serious mental health crises, was using "anonymized" conversation data to power a for-profit machine learning tool for customer service teams. (After the backlash, CTL announced it would stop.) Crisis Text Line's response to the backlash focused on the data itself and whether it included personally identifiable information. But that answer uses the data as a distraction. Imagine this: you text Crisis Text Line and get a message back saying, "Hey, just so you know, we'll use this conversation to help our for-profit affiliate build a machine learning tool for customer service companies." Would you keep texting?

This is the real travesty: when the price of getting mental health support in a crisis becomes fodder for a machine learning operation. And it's not just CTL's users who pay; it's anyone who seeks help when they need it most.

Americans need help and can't get it. The enormous unmet demand for critical advice and assistance has given rise to a new class of organizations and software tools that exist in a regulatory gray area. They help people through bankruptcies or evictions, but they are not lawyers; they help people through mental health crises, but they are not care providers. They invite ordinary people to rely on them, and they often deliver real help. But these services can also sidestep accountability for their advice, or even abuse the trust people place in them. They can make mistakes, push predatory advertising and misinformation, or simply sell data. And the consumer protections that usually shield people from abuse or mistakes by lawyers or doctors have not caught up.

This regulatory gray area can also constrain organizations offering new solutions. Take Upsolve, a nonprofit that builds software to guide people through bankruptcy. (It is careful to state that it does not offer legal advice.) Upsolve wants to train community leaders in New York to help others navigate the city's notorious debt courts. One problem: these would-be advisers are not lawyers, so under New York law (and that of nearly every other state), Upsolve's initiative would be illegal. Upsolve is now suing to carve out an exception for itself. The organization argues, rightly, that without access to legal help, people have no real rights under the law.

The legal profession's failure to give Americans access to help is well documented. But the Upsolve lawsuit also raises new and important questions. Who is ultimately responsible for the advice a program like this gives, and who is liable when it is wrong: the trainee, the trainer, or both? How do we inform people of their rights as clients of such a service, and how can they seek recourse? These are eminently answerable questions. There are plenty of policy tools for structuring relationships that carry heightened responsibilities: we could give these advisers a special legal status, impose a duty of loyalty on organizations that handle sensitive data, or create policy sandboxes to test and learn from new models for delivering advice.

But instead of using these tools, most regulators seem content to bury their heads in the sand. Officially, you cannot give legal or medical advice without a professional credential. Unofficially, people can get such advice from all sorts of tools and organizations operating at the margins. And while credentials matter, regulators are failing to grapple with how software has fundamentally changed the way we give advice and care for one another, and what that shift means for the responsibilities of advisers.

And we need that engagement more than ever. People who seek help from experts or caregivers are vulnerable. They may not be able to tell a good service from a bad one. They do not have time to parse terms of service dense with jargon, caveats, and disclaimers. And they have little or no bargaining power to negotiate better terms, especially when they reach out in the middle of a crisis. That is why the fiduciary duties of lawyers and doctors are necessary in the first place: not only to protect the individual who seeks help, but to give people confidence that they can bring their most critical and sensitive problems to experts at all. In other words, a lawyer's duty to their client is not simply about protecting that client from that particular lawyer; it is about protecting society's trust in lawyers.

And that's the real harm: when people don't text a suicide hotline because they don't trust it to have their best interests at heart. That mistrust can be contagious. Crisis Text Line's actions may not just stop people from using Crisis Text Line; they may stop people from using any similar service. What's worse than not being able to find help? Finding help you can't trust.
