Tech companies are marketing AI-based note-taking software to therapists as a new time-saving tool. But by signing up, providers may be unknowingly offering patients’ sensitive health information as data fodder to the multibillion-dollar AI therapy industry.
in your opinion what advantage would AI give to experienced therapists?
The specific application in this instance is that it creates “progress notes”. Admittedly, having no background in this field and only the information from the article to go on, I can only make assumptions about what those are like, but as the name implies, they chart a client’s progress through therapy, which suggests a lot of summarising of information gleaned during sessions. And since the tool would necessarily have to create a transcript to do this, I assume it provides that too. The creators of the service portray this as tedious and time-consuming work, and they obviously have a vested interest in casting it in that light, but taken at its word, I’d say the advantage would be in automating some of the tedious and time-consuming aspects of the job.
As I suspect you were driving at from the tenor of the question, there are a lot of ways this could go wrong, in particular the privacy concerns when the service is offered the way it is here: sessions are processed outside the therapist’s own clinic by third parties, and the information is shared with additional parties and used for many purposes, with only a flimsy promise of “de-identification” that appears to be hollow.

It could also affect how the therapy is conducted. The tool makes decisions about how to summarise the information, which will influence the decisions a therapist makes, and the therapist might have summarised it differently if they were taking the notes themselves. Then again, this all hinges on how effective the tool turns out to be; if it can be evaluated and found to be generally good, it tentatively seems like it could be a pretty helpful tool for a therapist.

But in general, my comment was really directed at what I feel is a sad state of affairs across the board with recent tech advances, including generative AI as applied to any aspect of life or work, and it’s something I think is often lost in these conversations: the technology shows real promise, or is quite impressive, but because of the manner of its development or the surveillance profit model, it’s basically tainted and ruined. I often come across commentary that fails to distinguish between the negative aspects of how these technologies came about and are monetised, and the technology itself, where the latter is simply cast as inherently undesirable even though there’s clearly enough reason for people to find it appealing in the first place for it to end up in use.
that’s plausible. in my opinion a therapist should make the effort to form their own interpretation of what has been said, instead of relying on a machine that digests the session in a uniform way. a patient’s words can mean a lot of things, depending even on things like their body language. but I have to admit I’m even more concerned about the privacy consequences you pointed out. data like that simply can’t go unabused, in my opinion. too tempting. I wouldn’t even want to run a business that just stores it without abusing it; it’s too risky, too.