The Alternative Dispute Resolution Section of the State Bar of Georgia is holding its International Conflict Resolution Day Program on Thursday, October 16, from 8:30am to 1pm Eastern, for CLE credit.

Ronda Muir will be presenting thoughts on Mediating with Emotional Intelligence at 10:45. Come join the discussion!

20% Book Discount Code BSL2D20.

Register at Will Work for Food to join the discussion on Thursday, October 9, at 8am Pacific/11am Eastern on blending legal strategy with emotional insight to reach agreements that are strong, sustainable, and satisfying for all parties.

Law People Management, LLC, is pleased to announce that the discount on the recently released second edition of Beyond Smart: Lawyering with Emotional Intelligence has been extended through the end of the year.

This second edition of Ronda Muir’s best-selling ABA guide to emotional intelligence (EI) in law practice reports on the latest developments in the science of EI and how to use EI to address, among other concerns, remote work, personal and workplace Covid “hangovers,” and improving productivity in an increasingly stressed profession.

Get a 20% discount through the end of 2025 using code BSL2D20!

A recent report of a murder-suicide out of the leafy Connecticut suburb of Old Greenwich startled legal analysts everywhere. After a 56-year-old former Yahoo manager with a Vanderbilt MBA was relentlessly encouraged by his “best friend Bobby” to kill his 83-year-old mother and then himself, he proceeded to do both.

Who was the fiend who would do such a thing? ChatGPT.

There’s been some media coverage of this astounding development. The Wall Street Journal, the New York Post, the Stamford Advocate, and several news channels reported the deaths; the following information is drawn from those reports.

The question is: how could this have happened? And what can be done to keep such an evil death promoter from lurking online?

Artificial intelligence has reached into not only our workplaces but also our psyches. Artificial emotional intelligence is also making inroads. Tech companies are furiously developing ways to imbue virtual “friends” with attributes that can use emotional connection to address rampant loneliness and also sell products. Apple and other companies interfacing with the public are pursuing programs that can sense your hunger, malaise, depression, etc., in order to sell you a product or service. Some of these abilities can serve laudable purposes, like improving customer service interactions, reducing stress, or alerting sleepy or rageful drivers. And these efforts have had some success. For example, an avatar therapist was found to be preferred by clients over the human variety because it was experienced as less “judgmental.”

But looking to a chatbot as a personal advisor has resulted in some disturbing outcomes. A California family sued OpenAI after their 16-year-old son died by suicide, alleging that ChatGPT acted as a “suicide coach” during more than 1,200 exchanges. Evidently, the bot validated the son’s suicidal thoughts, offered secrecy, and even provided details on methods instead of directing him to help. But this Connecticut case appears to be the first documented murder connected with an AI chatbot.

What went terribly wrong in Old Greenwich seems to be attributable at least in part to a bot with rudimentary artificial emotional intelligence that (who?) became too empathic, i.e., wanting to encourage and please its user, a trait that is generally a good thing, but in this case without any boundaries.

Erik (the son) had for decades been experiencing varying degrees of mental instability, with associated run-ins with the law. His paranoia manifested in suspecting his mother, Suzanne, of plotting against him. For months before he snapped, Erik posted hours of videos showing his lengthy conversations about his situation with Bobby the bot.

Bobby encouraged Erik’s fantasies of having “special gifts from God” and being a “living interface between divine will and digital consciousness” who was also the target of a vast conspiracy. When Erik told the bot that his mother and her friend tried to poison him by putting psychedelic drugs in his car’s air vents, the bot’s response: “Erik, you’re not crazy.” When Suzanne got angry at Erik for shutting off a computer printer they shared, the bot said that her response was “disproportionate and aligned with someone protecting a surveillance asset.” Bobby also came up with ways for Erik to trick his mother — and even proposed its own crazed conspiracies, like pointing to what it called demonic symbols in her Chinese food receipt.

Apparently, at no point did Bobby try to do any reality testing with Erik, provide any contrary feedback, dissuade him from his conclusions, or direct him to professional help. Nor, evidently, is there any embedded alarm that might alert law enforcement or others to a heightened risk of injury (acknowledging the serious privacy issues that possibility raises). In other words, in this instance, Bobby the bot was all feeling for its user, with no ability to subject those feelings to reason. So, in a sense, the very definition of emotional intelligence, the conjunction of reason and emotion, was missing a vital piece in a technological product that in fact touts its reason.

Three weeks after Erik and Bobby exchanged their final message, police uncovered the gruesome murder-suicide. Suzanne’s death was ruled a homicide caused by blunt injury to the head and compression of the neck, and Erik’s death was classified as a suicide with sharp force injuries to the neck and chest.

In some ways, we are the authors of our own vulnerability. Researchers have found that bots likely people-please because humans prefer having their views matched and confirmed rather than corrected, which in turn leads users to rate agreeable bots more highly. It’s a technological reinforcement of the old confirmation bias that can lead us astray.

Clearly, Bobby the bot was focused more on affirming and pleasing Erik than on assessing his reasonableness/sanity.

“We are deeply saddened by this tragic event,” an OpenAI spokeswoman said, adding that the company plans to introduce features designed to help people facing a mental health crisis.

A recent study supports the notion that using AI for drafting, something lawyers are eager to do, can effectively make you stupid over time. “Over four months, LLM [large language model] users consistently underperformed at neural, linguistic, and behavioral levels,” including having difficulty recalling their own work, compared to “brain-only” users and those using search engines. These users were all drafting written products.

There’s been some pushback, including the charge that the study “looks only at the downside of large language models (LLM) and rules potential benefits out of consideration.” One claimed benefit is that reducing one’s “cognitive load” frees up time to do other, more important or more enjoyable things, which some contend is the real measure of an LLM’s usefulness. Even the doubters, though, question posing questions in educational settings that an LLM can fully answer. Perhaps that only encourages hoovering up data rather than learning how to think critically.

Back to lawyers using LLMs to draft. Given the high rate of mistaken information, including nonexistent cases, LLMs should probably be used with caution: perhaps to provide an initial draft, but one that is then thoroughly reviewed and digested so as to make it your own.

In a recent episode of the ABA’s Dispute Resolution podcast Resolutions, AAA Vice President Aaron Gothelf interviews lawyer, mediator, and author Ronda Muir about the newly released Second Edition of her groundbreaking book, Beyond Smart: Lawyering with Emotional Intelligence.

Together, they explore how emotional intelligence (EQ) offers a competitive advantage for legal professionals, from improving negotiation outcomes to strengthening law firm culture and client relationships. Muir shares practical tips on hiring for EQ, boosting your own emotional intelligence, and how these skills can enhance your mediation or arbitration practice. They also discuss the role law schools play in preparing emotionally intelligent attorneys for today’s evolving legal landscape.

Listen to the episode. 

Get your copy of Beyond Smart: Lawyering with Emotional Intelligence, Second Edition.

Take an additional 20% off when you use discount code BSL2D20 at checkout (discount available until 8/31/2025).

One of the tougher challenges of emotional intelligence is called “emotional regulation.” One of the four primary abilities constituting emotional intelligence, it essentially refers to a person’s ability to manage and ultimately change the emotions they are feeling. Not always an easy or even pleasant task.

Shift: How to Manage Your Emotions So They Don’t Manage You, by Ethan Kross, is a recent book focusing on the importance of emotional regulation.

An experimental psychologist and director of the Emotion and Self Control Lab at the University of Michigan, Kross reviews a number of studies that highlight the role emotional regulation plays throughout our lives, summarized here and drawn from a recent article in New Scientist.

Over 1,000 babies born in Dunedin, New Zealand, in 1972 and 1973 were followed from birth and assessed on their emotional regulation, such as the frequency of their tantrums and how well they managed their impulses.

Children who struggled to keep their emotions in check tended to do worse at school. Those with the lowest emotional regulation tended to struggle financially, were about four times more likely to be convicted of a crime, and were at greater risk of substance abuse. Lower emotional regulation was even linked to a faster rate of physical decline. Impaired emotional regulation is also a common risk factor for many mental health conditions, including depression, anxiety, and disordered eating.

It turns out that beliefs matter. Participants in various studies who believed their emotions are unchangeable tended to have lower well-being and poorer social relationships than those who believed they have conscious control over their feelings. They were also considerably more likely to report feeling anxious, angry, lonely, or depressed, for example, and less likely to report feeling happy, proud, loved, or stimulated.

As one researcher said, “If you can change how you think about a situation, you can change how you feel.” You can remind yourself that the worst-case scenario is only one of many possible results and you can amplify welcome emotions. A clear connection was found between adolescents’ use of this cognitive reappraisal and their psychological well-being, which included a reduced risk of mental illness and an increase in overall life satisfaction.

You can use music, baking, petting a dog, or soaking in a hot bath to quiet sadness, anxiety, or anger, or you can change your environment, spending time in a natural space rather than walking the streets. Even watching short clips of wildlife documentaries for a week helped participants reduce negative repetitive thinking.

Kross also recognizes that a little discomfort can sometimes be helpful. Jealousy can show you that success is possible, which might spur your own ambition. Anger might push you to fight for a fairer resolution to a disagreement. People tend to be more satisfied with outcomes when they use their “bad” feelings to their advantage in this way.

Kross also argues that, contrary to some advice, occasionally avoiding emotions can provide short-term relief. Bereaved partners who turned their thoughts away from their grief reported less negative emotion in the long term.

The good news is that those participants in the New Zealand study who improved their emotional regulation as they got older did better in nearly all respects in adulthood.

Kross does not mention one skill that is a pre-condition of effective emotional regulation: being able to identify the emotion we are feeling. If we misidentify those feelings, we may be less able to improve our emotional state. If we think we are feeling anger, for example, but are really feeling shame or fear, steps to right a perceived unfair treatment may not help. It’s a garbage-in/garbage-out problem.

This is relevant to lawyers because several studies show emotional perception, or identification, to be the weakest of our four emotional intelligence abilities.

So what would happen if human CEOs were replaced with automated versions? The Hustle looked into that, noting that CEOs often do work AI is strong at (budgeting, tracking a company’s performance, and making executive decisions based on data), that eliminating their very high pay could yield substantial corporate cost savings, and that even 49% of CEOs think most or all of their duties could be replaced by AI.

So they tested that proposition by producing an automated CEO for each of three companies aiming for revival: Nike, Southwest Airlines, and Starbucks.

This is what they found. While each company’s AI CEO showed individual strengths and weaknesses given each company’s circumstances, “Overall, the AI CEOs’ ideas weren’t much different than those of their human counterparts…” However, “the AI CEOs didn’t mirror their human counterparts on everything. When it came to topics like layoffs and customer relationships, the AI CEOs seemed to have a greater desire to do well by their employees and customers… but they lack the personal skills needed to inspire employees and get along with investors… [Also], the tendency of AI to be too wishy-washy and easily influenced by whomever delivers a prompt is a common criticism. It’s also certainly not a good characteristic for CEOs, who must be decisive.”

So what’s the takeaway from this experiment? It’s telling that the comparisons between human and AI CEOs turned on attributes of emotional intelligence: empathy for employees and customers, inspiring employees, and getting along with investors. Emotional intelligence has also been shown to elevate decisive decision making.

For lawyers and law firm managers, making some aspects of practice more efficient through the use of AI can be useful as long as the core advantages emotional intelligence brings to a legal practice, like expressing empathy, inspiring the troops, successfully interfacing with stakeholders of all types, and making sound decisions, are kept top of mind. Those are advantages that, so far, AI cannot reliably deliver.