Looks like a really promising approach to therapy as well... right up until they said they'd stop having a skilled psychologist voice it and have an AI do it instead, while putting the psychotic person into VR rather than in front of a screen... that was a big 'fuck no'.
I see where they're coming from, though: right now you have to be certified on this very specific program, meaning you only get the benefits if you have access to one of the 38 people currently trained for it in the UK.
I would definitely want a professional to be in charge but, as the article itself points out, "Joe recently went back to his GP in search of help with his anxiety (...) The GP put him on a waiting list for NHS talking therapy, and warned that he could be in for a very long wait". Given how bad access to mental health resources is, I may be willing to take "a community nurse, or a nursing assistant" now over "wait several months for a chance at a doctor who may not be the right fit for you".
I wouldn't dream of allowing an AI to roam free: as the article says, patients can get more psychotic, and arguably "you should end it" could very well be part of the training data. But if the AI suggests lines that a trained human can oversee... then maybe?
I think your proposal of AI therapists with human overseers would be okay if we were able to develop some kind of metrication and monitoring of the human oversight portion.
Without that control, what would inevitably happen is that the highly scalable part of the system (the AI) would be scaled, and the difficult-to-scale part (the human) would not. We would fairly quickly end up with a single human "overseeing" hundreds or thousands of AI conversations, at which point the oversight would become ineffective.
I don't know how to metricate and monitor human oversight of AI systems, but it feels like there are already other systems (like Air Traffic Control) where we manage to do similar things.
If they are going to get creative, perhaps apply the constructive effects of some mind-altering drugs? Under AI shaman supervision, of course!
I have never heard voices, but I experienced two forms of dissociation for a while after a trauma. One of them was that nothing was real: I couldn't trust any scene I was in or the chair I was going to sit on. Unending vertigo and the feeling of experiencing a fiction.
I think it’s absolutely weird that the proctor is voicing the avatar.
I’m imagining some Unreal Engine Skyrim deity on screen being voiced by my therapist, acting with a vocoder. Like, c’mon.
Definitely train a computer to do this part: generate your psychosis demon and have it actually say the abstractions you described. They're already shockingly scary in their realism even when they're not prompted to be.
A VR headset might be a little too immersive and triggering.
Yes, you definitely want a possibly suicidal person to be talking to an "AI" engine that talks back with the avatar they normally hear. (This was sarcasm, in case you missed it.)
If there were some evidence that the voices people hear are generated in a readable portion of the brain, and you could train the AI specifically on those parts, it could be a powerful therapy.
Hell, make it into a videogame RPG where the patient is the hero and the labyrinth they must conquer is their own mind. Their party could consist of good friends and trained psychologists who work together to probe, map, reveal, and defeat the demons in the patient's mind through teamwork, collaboration, and shared experiences.
However, it probably shouldn't be done outside of a clinical setting and there should be safeguards in place. The last thing I want is for some AI trained on my most psychotic thoughts to exist in any sort of reality, even virtual.
Look at the article: people are walking out because of how ridiculous it is, not because of how triggering it is, though the article is too much of a puff piece to say so.
And your criticism was the exact reason the article's therapist-as-puppet-master technique was dropped. So we are already past that point; let's get the roleplaying puppet master out of the way. I don't want some therapist who gets off on dissing me as a ventriloquist.