Your Kid’s Next Psychologist will be a Bot

As we all know, there’s been a lot of talk about artificial intelligence lately. All areas of life will be infected by AI, and psychology is not immune. AI apps already exist to do the job of psychologists.

Will these really work? Most of my clients wouldn’t dream of dumping me for a therapy-bot (I hope).

Well, I for one think that the job of a psychologist can not only be done by AI, but done better.

As readers of this blog will be aware, I believe that jobs involving the manipulation of ideas will be much more susceptible to AI than jobs that involve moving objects. Object-moving jobs include delivery drivers, bricklayers, surgeons and interpretive dancers. Idea-moving jobs include judges, journalists and psychologists.

This is a new realisation for most. In 2019, the CSIRO warned of job losses in the trades and in factories due to automation. In 2023, consultants Mandala Partners found that the other end of town is most susceptible: informers, consultants and carers (us guys!).

But I can confidently say that most of my clients wouldn’t dream of trading me in for a bot, because I’ve spoken to some of them about it. The people I’ve spoken to tell me that even if a bot could express the most perfect emotive response, they would still be aware that it is just an algorithm. This knowledge would make them feel that the empathy of the machine was insincere, destroying the magic of the therapy process.

The Turing Test assesses whether a machine can pass as a human, or whether we can tell the two apart. You are regularly asked to perform one when a website’s payments page asks “are you a robot?” via the CAPTCHA system (Completely Automated Public Turing test to tell Computers and Humans Apart).

Although many people think that beating the Turing Test is the key to AI therapists taking over from human ones, I disagree.

I think generational change will do it.

Gadgets Grooming Girls and Boys

“Give me a child until he is 7, and I will give you the man” (purported to be a Jesuit saying).

This year, the popular app Snapchat released an AI friend within the app.

Snapchat users are young: 48% of users are between 15 and 25. The app itself is restricted to those 13 and above, but take it from me, much younger kids are using it.

The AI friend can be given any name the child wants. The friend offers friendly prompts and learns from the child the more it interacts with him or her. The child is encouraged to share him or herself with the AI. And of course the AI, via Snapchat, shares the child’s data with commercial interests.

Imagine a lonely 14-year-old, let’s call her April. Perhaps April has experienced some negative comments online and is scared to interact with a group. Perhaps she is low status and being ignored by her peers. Imagine how much safer it would feel for April to interact with her AI friend.

Now imagine April as an adult. Who does she feel comfortable talking to? The human therapist, with their unknowable and unexpected opinions and thoughts? This therapist might secretly not like her. The therapist might have views or opinions that challenge how April feels, no matter how non-judgemental this therapist tries to be. And the human therapist could leave the practice, or go on maternity leave, or die.

What about the bot? The bot who instantly works to validate and understand April. Who April knows cannot secretly hold different and difficult opinions. Who will never leave. And who, most importantly, has an aura of deep security engendered by childhood experiences with her Snapchat friend. Clearly, April will choose the bot, and will have an intrinsic preference for the bot for life.

The Turing Test need not apply when we are dealing with digital natives. If anything, they’ll want a reverse Turing Test to weed out the humans.

And I’ve personally already witnessed a generational change occur, and so have you. Ask yourself honestly: would your grandfather have been happy seeing a psychologist at your age? I would guess not. I’ve been in private practice for 10 years, and in this short timeframe I’ve seen men of all ages become much more open to seeing a psychologist.

The Lord giveth, and the Lord taketh away.

Psychologist Seeks Job

I’m not too concerned about my work. My guess is that people aged from their 90s down to about their 20s will still want to see a human therapist. I’ve got at least 20 good years in this game. After that, I don’t mind throwing it in and becoming a driver or a gardener for a wealthy bot.

Jokes aside, I think that a human can fulfil one very important role that a bot cannot: to be the figure of blame.

I’ve noticed that when app makers have released therapy bots, they tend to say things like “of course, this isn’t meant to replace a regular therapist”. Statements like this lull psychologists like me into a false sense of security that we have some je ne sais quoi that the bot doesn’t. My guess, though, is that the real reason for statements like this is to avoid liability when something goes wrong.

Life is messy and things will go wrong no matter how good AI gets. Will AI take responsibility for bad advice? For deaths and injuries? The AI developers will always include the small print “not to be used as a replacement for a human therapist” to protect themselves from litigation.

At the end of the day, our human sense of justice insists on someone taking the fall. And I can’t see bots taking over that function from humans any time soon.
