
Emotion AI Trend Raises Concerns in Business Software

The rise of emotion AI in business software raises concerns about its impact and ethical implications

One unexpected trend to emerge as businesses experiment with integrating AI into their systems is the use of AI to help their many new algorithms understand human emotion.

According to the latest Enterprise SaaS Emerging Tech Research report from PitchBook, this technology is on the rise and is classified as “emotion AI.”

The rationale is as follows: If businesses assign AI assistants to executives and employees and utilize AI chatbots as front-line marketers and customer service representatives, how can an AI perform effectively if it is unable to differentiate between an angry “What do you mean by that?” and a perplexed “What do you mean by that?”

Emotion AI is said to be the more sophisticated sibling of sentiment analysis, the pre-AI technology that attempts to extract human emotion from text-based interactions, particularly on social media.

Emotion AI is often called multimodal, as it combines machine learning and psychology with sensors for visual, auditory, and other inputs to identify human emotion during an interaction.
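To see why text alone falls short, consider a minimal sketch (assuming the Hugging Face transformers library and its default English sentiment model): a text-only classifier receives the identical string for the angry and the perplexed “What do you mean by that?”, so it cannot tell them apart, which is exactly the gap that emotion AI’s audio and visual signals are meant to close.

```python
# Minimal sketch: a text-only sentiment model sees the same string twice,
# so it returns the same label either way -- it has no access to tone of voice
# or facial expression, which is the gap multimodal emotion AI tries to fill.
# Assumes the Hugging Face `transformers` library and its default sentiment model.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

angry_utterance = "What do you mean by that?"      # said with a raised voice
perplexed_utterance = "What do you mean by that?"  # said with a puzzled frown

print(classifier(angry_utterance))      # e.g. [{'label': 'NEGATIVE', 'score': ...}]
print(classifier(perplexed_utterance))  # identical output: the text is identical
```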

Major AI cloud providers offer services that give developers access to emotion AI capabilities, including the Emotion API in Microsoft Azure Cognitive Services and the Rekognition service from Amazon Web Services. (The latter has been the subject of some controversy over the years.)
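As a rough illustration of how such a cloud service is consumed, here is a minimal sketch using the AWS SDK for Python (boto3); it assumes configured AWS credentials, and the image file name is hypothetical. Rekognition’s DetectFaces call returns per-face emotion labels when all facial attributes are requested.

```python
# Minimal sketch of querying a cloud emotion AI service, assuming AWS credentials
# are configured and the boto3 SDK is installed. The image file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("meeting_frame.jpg", "rb") as image_file:  # hypothetical local image
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # includes the Emotions attribute in each FaceDetail
    )

for face in response["FaceDetails"]:
    # Emotions is a list of {"Type": ..., "Confidence": ...} entries
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(top_emotion["Type"], round(top_emotion["Confidence"], 1))
```

Note that the service returns coarse emotion labels with confidence scores rather than anything resembling a ground-truth emotional state.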

Although emotion AI, including in its cloud-service form, is not novel, the abrupt proliferation of bots in the workforce gives it more of a future in the business world than it has ever had, according to PitchBook.

In the report, Derek Hernandez, senior analyst of emerging technology at PitchBook, writes that, given the increasing prevalence of AI assistants and fully automated human-machine interactions, emotion AI can enable more human-like interpretations and responses.

“The hardware component of emotion AI comprises cameras and microphones,” says Hernandez. These may be located in a physical space, on a laptop, or on a phone.

Beyond those devices, Hernandez tells TechCrunch, wearable hardware will likely offer another avenue for emotion AI. (That may be why the customer service chatbot requests camera access.)

A growing number of ventures are being established to that end. They include Uniphore (with a total of $610 million raised, including $400 million in 2022 led by NEA), as well as MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, each of which has raised more modest sums from various VCs, according to PitchBook estimates.

Emotion AI is, of course, a quintessentially Silicon Valley approach: use technology to solve a problem created by using technology with humans.

However, even if most AI bots eventually gain some form of automated empathy, that does not mean this solution will actually work.

The last time emotion AI generated intense interest in Silicon Valley, the AI/ML world was still focused primarily on computer vision rather than generative language and art.

However, researchers threw a wrench into the concept. A meta-review of studies published at the time concluded that facial movements cannot reliably predict human emotion.

In other words, the notion that we can instruct an AI to identify a human’s emotions by teaching it to emulate how other humans attempt to do so (by interpreting facial expressions, body language, and vocal tone) is somewhat naive in its assumptions.

This concept may be stifled by AI regulation, such as the European Union’s AI Act, which prohibits using computer-vision emotion detection systems for specific purposes, such as education.

(In addition, specific state laws, such as Illinois’ BIPA, forbid the collection of biometric readings without authorization.)

All of this provides a more comprehensive understanding of the AI-filled future that Silicon Valley is currently constructing.

These AI bots will either attempt to apply emotional intelligence to customer service, sales, HR, and whatever other responsibilities humans assign them, or they simply will not be very good at any task that requires this ability.

We may be looking at an office environment populated by AI programs on the level of Siri circa 2023. Compared with one in which a program is mandated by management to gauge every attendee’s emotions in real time during meetings, who is to say which is worse?

Hillary Ondulohi

Hillary is a media creator with a background in mechanical engineering. He leverages his technical expertise to craft informative pieces on protechbro.com, making complex concepts accessible to a wider audience.

