Tech companies are marketing AI-based note-taking software to therapists as a new time-saving tool. But by signing up, providers may be unknowingly offering patients’ sensitive health information as data fodder to the multibillion-dollar AI therapy industry.
That’s plausible. In my opinion, a therapist should make the effort to form their own interpretation of what was said, instead of relying on a machine that digests every session in a uniform way. A patient’s words can mean many different things, depending even on cues like body language. But I have to admit I’m even more concerned about the privacy consequences you pointed out. Data like that simply can’t go unabused, in my opinion; it’s too tempting. I wouldn’t even want to run a business that merely stores it without abusing it; that’s too risky too.