An interesting confluence is happening in the area of mental health. Technology is offering ways to recognize mental health issues that could benefit from remediation and also paths along which help can be delivered. And lawyers are in need of just such assistance.
Our last post was about CIMON, the first autonomous free-floating astronaut-assistant robot aboard the International Space Station, which is programmed to “detect and help alleviate . . . the social issues” that can arise in such close quarters.
Many law firms have the same objectives, particularly in light of devastating data about the emotional state of legal practitioners and the health and liability risks posed. Among the most pro-active firms is Reed Smith, which has launched a global Mental Health Task Force to help address those issues.
“The mission of this task force is to ensure that our lawyers and professional staff have access to help whenever they or their family members experience or are at risk of experiencing mental health or substance use issues,” according to Partner Kimberly Gold, who will serve as the inaugural chair.
The problem that arises profession-wide, however, is the difficulty in identifying who is having those issues and which issues they are having. Lawyers tend to be close-mouthed, particularly about their own vulnerabilities, and supervisors are no more likely to “rat” on their supervisees. “Full steam ahead until death (or disability) do us part” tends to be the unspoken slogan.
Reed Smith recognizes that problem, noting their intention to “cultivate a workplace culture that promotes psychological wellness and positive help-seeking behaviors” and “to develop a comprehensive strategy for assessing and addressing these outcomes.”
That culture and those strategies are in fact the keys to a healthier, more productive legal workforce. And technology may be able to play a part, particularly given that earlier attempts at cultural change and new strategies have not been able to crack the nut.
A recent article in Time, “Artificial Intelligence Could Help Solve America’s Impending Mental Health Crisis,” is aimed at expanding tools for psychiatrists, who in the near future are likely to be too few in number to care for all those who need them. Artificial intelligence (AI) offers the tantalizing prospect of being able to analyze data and pick up on warning signs so subtle that humans might not notice them, and then alert others to the need for help. In some instances, that can be literally lifesaving, “since research has shown that checking in with patients who are suicidal or in mental distress can keep them safe.”
This is, of course, not a new concept. The Apple Watch and other wearables are being ballyhooed as medical assistants on your wrist, able to track your sleep and physical activity and potentially recognize all sorts of physical warning signs, from heart arrhythmias to blood pressure spikes. Researchers suggest that AI could also analyze answers given periodically to questions about one’s emotional state, as well as the mood, pace and word choice in a person’s soundbites and written work, in order to identify signs of mental distress. Not that long ago, a Harvard researcher was looking into building a computer mouse that could detect high stress levels through touch during the workday and trigger alarms and suggested remediations.
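To make that word-choice idea concrete, here is a minimal sketch, assuming a purely illustrative approach: a crude lexicon of distress-associated terms and weights, a score normalized per 100 words, and a threshold that prompts a human check-in. Every term, weight, and threshold below is invented for this example; a real system would rest on validated clinical research, not a hand-built word list.

```python
# Minimal, purely illustrative sketch of lexicon-based distress scoring.
# The lexicon, weights, and threshold are invented for this example and
# have no clinical validity; real tools would use validated instruments.
import re

DISTRESS_LEXICON = {
    "overwhelmed": 2.0,
    "exhausted": 1.5,
    "hopeless": 3.0,
    "worthless": 3.0,
    "can't cope": 3.0,
}

def distress_score(text: str) -> float:
    """Weighted lexicon hits, normalized per 100 words of text."""
    lowered = text.lower()
    words = re.findall(r"[a-z']+", lowered)
    if not words:
        return 0.0
    hits = sum(weight for term, weight in DISTRESS_LEXICON.items()
               if term in lowered)
    return 100.0 * hits / len(words)

def should_check_in(text: str, threshold: float = 5.0) -> bool:
    """Flag text for a *human* follow-up, never an automated diagnosis."""
    return distress_score(text) >= threshold

sample = "I feel overwhelmed and exhausted, like I can't cope anymore."
print(distress_score(sample), should_check_in(sample))  # 65.0 True
```

Even in this toy form, the design choice matters: the output is a nudge for a human to check in, not a diagnosis, which is consistent with the research finding above that checking in is what keeps people safe.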
Some medical apps and programs already claim to incorporate AI, such as Woebot, an app-based mood tracker and chatbot that combines AI and principles from cognitive behavioral therapy. But the medical world recognizes that providing the best mental health assistance requires a level of emotional intelligence that technology can’t yet simulate.
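For a rough sense of what a scripted check-in might look like, here is a hypothetical sketch. The mood keywords and prompts are invented for illustration and are not Woebot’s actual design; real CBT-based tools are built and validated by clinicians.

```python
# Hypothetical sketch of a scripted, CBT-flavored mood check-in.
# The keywords and prompts are invented; this is not how any real
# product, Woebot included, is actually implemented.

NEGATIVE_MOODS = {"sad", "anxious", "stressed", "down", "overwhelmed"}

REFRAME_PROMPT = ("That sounds hard. One CBT technique is to examine the "
                  "thought behind the feeling: what evidence supports it, "
                  "and what evidence cuts against it?")

def check_in(reply: str) -> str:
    """Route a free-text mood reply to a scripted follow-up prompt."""
    if set(reply.lower().split()) & NEGATIVE_MOODS:
        return REFRAME_PROMPT
    return "Glad to hear it. Want to log one thing that went well today?"

print(check_in("pretty anxious about a filing deadline"))
```

The gap between a script like this and real empathy is exactly the point: pattern-matching on keywords is easy, while emotional intelligence is not.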
There are a couple of obvious caveats. We are still talking years, perhaps as much as a decade or more, to fine-tune these technological helpers. Research has to be completed, analyzed and loaded into algorithms, and the end product has to be tested and marketed.
The other caveat is the perennial one: privacy. Are you really willing to use the firm’s stress-sensing mouse? To have your pulse and blood pressure and skin temperature recorded for Big Managing Partner to look over? Despite the potential good? Hard to answer that question now.
But there is a tremendous need. Let’s not quibble about that. If cultures can be changed, without simply waiting for 30% of the firm to leave or succumb, then now is the time to figure that out. I’m betting that the strategy of marinating every lawyer in principles of emotional intelligence, aimed at benefiting their own and their colleagues’ health, is the best way forward.