The rise of emotion AI in business software raises concerns about its impact and ethical implications
As businesses experiment with embedding AI across their systems, one unexpected trend has emerged: using AI to help all of these new bots understand human emotion.
According to PitchBook's latest Enterprise SaaS Emerging Tech Research report, this technology, classified as "emotion AI," is on the rise.
The rationale goes like this: if businesses assign AI assistants to executives and employees and deploy AI chatbots as front-line marketers and customer service representatives, how can an AI perform well if it cannot tell the difference between an angry "What do you mean by that?" and a perplexed "What do you mean by that?"
Emotion AI is billed as the more sophisticated sibling of sentiment analysis, the pre-AI technology that attempts to distill human emotion from text-based interactions, particularly on social media.
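For a sense of what that older, text-only approach looks like in practice, here is a minimal sketch using NLTK's VADER analyzer, a common lexicon-based tool for social media text (the tool choice and example sentences are mine, not the report's):

```python
# Minimal sketch of pre-AI, text-only sentiment analysis using NLTK's VADER,
# a lexicon-based scorer tuned for social media text. Illustrative only;
# this is not any vendor's emotion AI pipeline.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # fetch the lexicon once
analyzer = SentimentIntensityAnalyzer()

for text in ["What do you mean by that?!", "Thanks, that really helped."]:
    scores = analyzer.polarity_scores(text)  # neg/neu/pos plus a compound score
    print(text, "->", scores["compound"])    # compound ranges from -1 to 1
```

Notably, a text-only score cannot distinguish an angry "What do you mean by that?" from a perplexed one, which is precisely the gap emotion AI claims to close.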
Emotion AI is often called multimodal, as it combines machine learning and psychology with sensors for visual, auditory, and other inputs to identify human emotion during an interaction.
Major AI cloud providers offer services that give developers access to emotion AI capabilities, including Microsoft Azure Cognitive Services' Emotion API and Amazon Web Services' Rekognition service. (The latter has been the subject of some controversy over the years.)
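As a concrete illustration of what such a cloud service exposes, the sketch below calls Amazon Rekognition's face detection via boto3 and reads back the per-face emotion labels it returns; the file name and the assumption that AWS credentials are already configured are mine, not details from the report:

```python
# Minimal sketch: asking Amazon Rekognition for the emotions it infers from
# faces in a single image. Assumes AWS credentials are configured and that
# "frame.jpg" (a hypothetical file name) exists locally.
import boto3

client = boto3.client("rekognition")

with open("frame.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include emotion estimates, not just bounding boxes
    )

for face in response["FaceDetails"]:
    # Rekognition returns a list of candidate emotions with confidence scores
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(top["Type"], round(top["Confidence"], 1))
```

The confidence scores describe facial movements the model recognizes, which, as the research discussed below suggests, is not the same thing as what a person actually feels.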
Although emotion AI, even delivered as a cloud service, is not new, the sudden proliferation of bots in the workforce gives it more of a future in the business world than it has ever had, according to PitchBook.
In the report, Derek Hernandez, senior analyst of emerging technology at PitchBook, writes that as AI assistants and fully automated human-machine interactions proliferate, emotion AI promises to enable more human-like interpretations and responses.
The hardware side of emotion AI comprises cameras and microphones, which may sit in a physical space or on a laptop or phone.
Hernandez also tells TechCrunch that wearable hardware will likely offer another avenue for putting emotion AI to work beyond those devices. (Which may be why that customer service chatbot asks for camera access.)
A growing number of startups are being founded to pursue this goal. Among them are Uniphore (which has raised $610 million in total, including $400 million in 2022 in a round led by NEA), as well as MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, each of which has raised more modest sums from various VCs, according to PitchBook estimates.
Emotion AI is, of course, a very Silicon Valley approach: use technology to solve a problem created by using technology with humans in the first place.
But even if most AI bots eventually gain some form of automated empathy, that does not mean this solution will actually work.
The last time emotion AI was a topic of intense interest in Silicon Valley was back when the AI/ML world was focused primarily on computer vision rather than generative language and art.
Researchers, however, threw a wrench in the concept: a meta-review of studies published at the time concluded that human emotion cannot reliably be inferred from facial movements.
In other words, the idea that we can teach an AI to detect a human's emotions by having it mimic how other humans try to do so (by reading facial expressions, body language, and vocal tone) rests on somewhat shaky assumptions.
The concept may also be stifled by AI regulation: the European Union's AI Act, for instance, prohibits the use of computer-vision emotion detection systems for certain purposes, such as in education.
(Some state laws, such as Illinois' BIPA, also forbid collecting biometric readings without permission.)
All of this offers a fuller picture of the AI-filled future that Silicon Valley is busy building.
Either these AI bots will attempt to use emotional intelligence to do jobs like customer service, sales, and HR, along with whatever other tasks humans assign them, or they may not be very good at any job that requires that capability.
Perhaps we are looking at an office life populated by AI bots roughly on the level of Siri circa 2023. Compared with one in which management mandates a program that predicts every meeting attendee's emotions in real time, who is to say which is worse?