The fact that ChatGPT has only been around for a relatively short amount of time, yet we can legitimately ask questions like this without it sounding completely implausible, is quite incredible - and a bit scary. It’s astonishing what it’s capable of, and it’s only going to get better.
P.S. love the idea of AI pep talks for tricky parenting tasks. I need one of those before bedtime every night!
Totally agree on both fronts!!
As ChatGPT continues to gain popularity, it is also bringing up so many potential conflicts and ethical questions. Thank you for lifting up this one and noting that it could possibly supplement but not supplant therapy.
I work as a school counsellor/psychologist, and a big aspect of the job is screening for risk of harm (to self or others). That is not something an AI therapist can necessarily do, nor would it be able to take the required action to keep the young person safe.
Why couldn't AI do it? We have seen (in many studies) that some users prefer to disclose these things to a bot rather than to a human (not all, of course). I would say that we should use both resources: humans and AI.
The fact that it is mimicking speech and people are out here asking it for therapeutic advice says a lot more about us than it does about ChatGPT. The technology in itself is not that scary, but people, on the other hand, are terrifying. The argument that AI will replace people has merit, but people will replace themselves long before it is ready for the task.
Therapeutic alliance with bots.
Dear Jacqueline, I just wanted to share that those doing research with chatbots have been reporting that many users establish a kind of bond with their chatbots. More specifically, there have been some studies reporting that the level of therapeutic alliance with the bots was similar to, or higher than, with a human therapist. This is a very interesting phenomenon that was unexpected to me. Some users report that they prefer to talk to a bot about certain things rather than with a human. I personally prefer human-to-human interactions, but it seems we are not all wired the same way. These studies were done with rule-based chatbots (which are not as good as ChatGPT), so I can imagine that with ChatGPT, or the future and better chatbots to come, people will actually start establishing a bond, and ultimately we will all have our own digital buddy.
I enjoy reading your blog and share it with my students in class :)
Here are some references:
Beatty, C., Malik, T., Meheli, S., & Sinha, C. (2022). Evaluating the therapeutic alliance with a free-text CBT conversational agent (Wysa): A mixed-methods study. Frontiers in Digital Health, 4, 847991.
Darcy, A., Daniels, J., Salinger, D., Wicks, P., & Robinson, A. (2021). Evidence of human-level bonds established with a digital conversational agent: Cross-sectional, retrospective observational study. JMIR Formative Research, 5(5), e27868. https://doi.org/10.2196/27868
Dosovitsky, G., & Bunge, E. L. (2021). Bonding with bot: User feedback on a chatbot for social isolation. Frontiers in Digital Health, 3, 138. https://doi.org/10.3389/fdgth.2021.735053
Thanks for sharing! Super interesting. In terms of the therapy efficacy piece, it will be really interesting to see if that perception of having a "bond" with the AI therapist is actually as effective as having a "bond" with a human therapist. Maybe it's just about the *feeling* of having an alliance with your therapist, and that's enough - but I do wonder if you actually need that human on the other side of the alliance to see the same efficacy results.
Thanks for replying. As I told you in my message, I am a big fan of this blog. I am teaching a class on screen time, tech, and mental health in children and adolescents, and I am recommending that my students follow you. I like how you explain the science in such a simple way.
In terms of the therapy bond: we are used to bonding with humans (or animals). What is happening now is something new. We may be bonding with "machines" that help us feel better. Bonding with machines means that those creating them have a big responsibility. I am working on a chatbot for parenting. Happy to chat offline.