According to Dr. Mary Czerwinski, manager and principal researcher of the Visualization and Interaction for Business and Entertainment group at Microsoft Research, “Emotions are fundamental to human interaction, but in a world where humans are increasingly interacting with AI systems, emotions may be fundamental to our interactions with machines as well.”

Several recent announcements illustrate her point. In June, BPU Holdings announced an advanced ZimOS Operating System Cloud service that “allows the individual or enterprise to invert AI and AEI to a more personalized, synthetic, emotional emulation.” We’re pretty sure that means the products being developed, such as ZimGo Polling (the first AEI-based political forecasting platform), ZimGo Neil (a new kind of personal AI news curator that made Apple’s list of top apps), and ZimGo aiMei (a personal AEI app designed to increase self-awareness and emotional intelligence on your smartwatch or smartphone), make unprecedented use of “sentiment analysis” to improve forecasting and produce other efficiencies.
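To make the idea concrete, here is a minimal, hypothetical sketch of how per-post sentiment scores might be rolled up into a polling-style signal. ZimGo Polling’s actual pipeline is proprietary and undisclosed; the candidate names, scores, and the simple averaging below are illustrative assumptions only.

```python
from statistics import mean

# Hypothetical inputs: (candidate, polarity) pairs, where polarity is a
# sentiment score in [-1, 1] produced by some upstream sentiment model.
scored_posts = [
    ("Candidate A", 0.8), ("Candidate A", -0.2), ("Candidate A", 0.5),
    ("Candidate B", -0.6), ("Candidate B", 0.1),
]

def sentiment_signal(posts):
    """Average sentiment per candidate, a crude proxy for voter mood."""
    by_candidate = {}
    for candidate, polarity in posts:
        by_candidate.setdefault(candidate, []).append(polarity)
    return {candidate: mean(scores) for candidate, scores in by_candidate.items()}

print(sentiment_signal(scored_posts))
# e.g. {'Candidate A': 0.3666..., 'Candidate B': -0.25}
```

In practice a forecasting product would weight such signals by post volume, account credibility, and recency rather than taking a flat average.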

As the company’s CTO says, “We are teaching the machine to synthetically emulate emotional intelligence to better relate to how you and I feel. So many exciting applications present themselves to enhance healthcare analytics, market assessment, consumer and voter sentiment, and delivering customized content in the Internet of Things.”

Then, starting this past July, all UK Government organizations (over 39,000 workplaces) have had access to a cloud service that combines artificial emotional intelligence with emotion-recognition AI to analyze social media. It detects more than 20 distinct emotions in any digital content, helping organizations measure and understand how people feel about a topic, whether a company, a brand, a person, or a concept. It is specifically designed to give European businesses and government organizations a better understanding of client and citizen emotions on topics such as immigration, the strategic direction of healthcare services, and wider societal issues in preparation for a post-Brexit world.
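For readers wondering what “detecting emotions in digital content” looks like in code, here is a deliberately simplified sketch. It uses a tiny hand-written emotion lexicon and word counting; the actual service’s models, emotion taxonomy, and training data are proprietary and far more sophisticated, so treat everything below as an illustrative assumption.

```python
from collections import Counter
import re

# Hypothetical, highly simplified emotion lexicon; real services train
# statistical models on large labeled corpora and cover 20+ emotions.
EMOTION_LEXICON = {
    "joy": {"happy", "delighted", "glad", "love", "wonderful"},
    "anger": {"furious", "outraged", "hate", "angry", "unacceptable"},
    "fear": {"worried", "afraid", "scared", "anxious", "uncertain"},
    "sadness": {"sad", "disappointed", "loss", "sorry", "unhappy"},
}

def score_emotions(text: str) -> dict:
    """Return a normalized score per emotion for one piece of text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for emotion, words in EMOTION_LEXICON.items():
        counts[emotion] = sum(1 for token in tokens if token in words)
    total = sum(counts.values()) or 1  # avoid division by zero for neutral text
    return {emotion: counts[emotion] / total for emotion in EMOTION_LEXICON}

# Example: score a small batch of social media posts
posts = [
    "Worried and anxious about what Brexit means for the NHS.",
    "Delighted with how the new service handled my request!",
]
for post in posts:
    print(score_emotions(post))
```

Aggregating such per-post scores over time is what lets an organization track how feeling about a topic shifts, which is the use case the UK service is pitched at.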

Then this month, Affectiva and Nuance Communications, Inc. announced that they are working together to “humanize” automotive assistants and in-car experiences. The goal is to deliver the industry’s first interactive automotive assistant that understands drivers’ and passengers’ complex cognitive and emotional states from facial, voice, and body cues and adapts its behavior accordingly. It will be able to identify facial expressions of emotions such as joy, anger, and surprise; vocal expressions of anger, engagement, and laughter; indicators of drowsiness such as yawning, eye closure, and blink rates; and physical and mental distraction caused by cognitive load or anger, all in real time. Nuance’s Dragon Drive already powers assistants in more than 200 million cars on the road, in more than 40 languages, for Audi, BMW, Daimler, Fiat, Ford, GM, Hyundai, SAIC, Toyota, and other brands, and the partnership expects to expand that reach appreciably.
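One of the cues mentioned above, eye closure over time, has a well-known drowsiness proxy called PERCLOS (the fraction of a recent time window during which the eyes are closed). The sketch below shows that single metric in isolation; it is not the Affectiva/Nuance implementation, whose models fuse many more facial, vocal, and cognitive-load signals, and the frame format and alert threshold here are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class EyeFrame:
    """One camera frame's eye state (hypothetical upstream vision output)."""
    timestamp: float   # seconds
    eyes_closed: bool  # whether the eyes are judged closed in this frame

def perclos(frames: list[EyeFrame], window_seconds: float = 60.0) -> float:
    """PERCLOS-style metric: fraction of the recent window with eyes closed."""
    if not frames:
        return 0.0
    latest = frames[-1].timestamp
    recent = [f for f in frames if latest - f.timestamp <= window_seconds]
    closed = sum(1 for f in recent if f.eyes_closed)
    return closed / len(recent)

def drowsiness_alert(frames: list[EyeFrame], threshold: float = 0.15) -> bool:
    """Flag possible drowsiness when recent eye closure exceeds a threshold."""
    return perclos(frames) > threshold

# Example: a simulated 60-second stream at ~10 fps with frequent eye closure
stream = [EyeFrame(timestamp=i / 10, eyes_closed=(i % 5 == 0)) for i in range(600)]
print(perclos(stream), drowsiness_alert(stream))  # 0.2 True
```

A production assistant would combine a metric like this with blink rate, yawning detection, and voice cues before deciding to intervene, which is exactly the multimodal fusion the partnership is promising.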

These are only the latest and greatest examples of the wisdom in Dr. Czerwinski’s statement. Knowing emotions (how to express, recognize, understand, and manage them) is vital to the most successful human interactions, and as the era of artificial emotional intelligence surges forward, those same skills will be the ones that help us interact successfully with expert machines as well.