It’s not a misnomer that lawyers are often called “counselors.” Tangling with the law–as a plaintiff, a defendant, or a participant in business and personal transactions of all kinds–could well bring us to call out for counsel. There are all those statutes and implications we don’t understand, the question of whether the lawyers involved understand our personal issues and context, and then those overwhelming feelings that are rarely acknowledged or discussed. A counselor, please!
Enter a recent study on whether a trained human therapist or an AI chatbot does a better job of counseling.
A team drawn from several universities randomly referred a relationship issue posed by 830 participants–half men and half women, in their 40s–to those two resources and examined their written therapeutic approaches.
For starters, the participants could rarely tell whether the advice had come from a human therapist or a chatbot. The chatbot’s written responses were generally rated higher on key psychotherapy principles by the researchers. And perhaps most importantly, in most cases the participants preferred the chatbot’s approach.
Why?
Five factors were identified: whether the proposed approach “understood the speaker, showed empathy, was appropriate for the therapy setting, was relevant for various cultural backgrounds, and was something a good therapist would say.” ChatGPT “came out ahead of human therapists particularly around understanding the speaker, showing empathy, and showing cultural competence.”
We have seen evidence of this sort of AI-generated understanding and empathy in a number of different realms over the last half century–the first therapeutic chatbot, ELIZA, was created in 1966–and increasingly so during the last decade, particularly in advice on medical care, mental health, and personal therapy.
In law, too, “robot lawyers” have proved effective in advising on various legal matters, in predicting legal outcomes in court, and in providing federal judges with bases for decisions.
Of course, lawyers are rarely trained counselors, and their reputations clearly suffer from how obvious that is to many clients. In fact, as several studies have demonstrated, empathy may well be where human lawyers can make a last stand against artificially intelligent lawyers capable of doing so much of human lawyers’ work, and increasingly more of it.
BUT, if AI can also generate the kind of personal understanding and empathy that human lawyers don’t… all bets are off.