Emotion Detection in Marketing Is About to Go Mainstream

January 18, 2023

Companies continue to look for every possible edge to connect with, market to, and close sales with prospects and repeat customers.

And technologies continue to evolve to help improve marketing messages delivered via traditional and digital media by either bots or human employees. Those technologies have moved from basic text and speech analysis to sentiment analysis and have only recently begun to advance even further into the more complicated realm of emotion analysis.

Emotion detection is a level above sentiment analysis. Whereas sentiment analysis involves the matching of words to feelings, emotion detection involves behavioral clues as well, according to Rebecca Wettemann, CEO and founder of Valoir.

Sentiment analysis, though good for identifying some very basic feelings at a given moment in time, can have severe limitations, especially when trying to understand sarcasm, whether a customer’s threat to churn is just talk or a real possibility, and similar nuances. Emotion detection, on the other hand, uses many more sources of data to more accurately determine a person’s emotional state. The most advanced emotion detection engines incorporate facial recognition and artificial intelligence, but even with those additional clues, emotion detection is still highly subjective.
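
To make the distinction concrete, here is a minimal sketch in Python. The tiny lexicon, the behavioral signals, and the weights are invented for illustration and do not reflect any vendor’s model; the point is simply that word matching alone misses what the added cues catch.

```python
# Illustrative only: a word-matching sentiment score vs. a score that
# also folds in behavioral cues. Lexicon, signals, and weights are made up.

SENTIMENT_LEXICON = {"love": 1.0, "great": 0.8, "fine": 0.2,
                     "slow": -0.5, "cancel": -0.8, "terrible": -1.0}

def sentiment_score(text: str) -> float:
    """Match words to feelings: average the polarity of known words."""
    words = [w.strip(",.!?") for w in text.lower().split()]
    hits = [SENTIMENT_LEXICON[w] for w in words if w in SENTIMENT_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def emotion_estimate(text: str, pitch_rise: float, facial_tension: float) -> float:
    """Blend the word signal with behavioral cues (hypothetical weights)."""
    return (0.4 * sentiment_score(text)
            - 0.3 * pitch_rise        # agitated, rising intonation
            - 0.3 * facial_tension)   # furrowed brow, tightened jaw

# "Great, another outage" looks positive on words alone, but the
# behavioral cues push the combined estimate negative (sarcasm).
print(sentiment_score("Great, another outage"))             # 0.8
print(emotion_estimate("Great, another outage", 0.7, 0.6))  # roughly -0.07
```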

The more data points that can be used, the more accurate emotion detection can be. So advances in data and analytics, along with the merging of once-separate technologies, have led to tremendous improvement in emotion detection capabilities in the past few years, according to Wettemann. “I can remember when we were trying to do this stuff 10 or 15 years ago, it was really hard. You needed a team of people managing the models. Now it is baked into the software and into the marketing.”

“The ability to gauge consumer feelings on marketing materials in real time is already at our fingertips,” agrees James Brooks, founder and CEO of GlassView, a video advertising platform provider. “Through drastic improvements in AI, the use of emotion detection technologies has grown rapidly in the last year, especially within advertising and marketing. Now, we’re able to start integrating neurotechnology into the mix, promoting solutions that can read emotions to varying degrees.”

That’s important in marketing, where the better and faster that solutions can recognize and react to prospects’ or customers’ emotions, the more quickly the marketing message can be changed to elicit better responses. It’s not an easy task.

“One of the main challenges in emotion detection is that human emotions are complex and can be difficult to accurately interpret,” says Shri Ganeshram, CEO and founder of Awning.com, a San Francisco-based technology platform provider and brokerage exclusively for real estate investing.

“However, recent advances in artificial intelligence and machine learning have made it possible to create algorithms that can recognize a wider range of emotional states with increasing accuracy,” Ganeshram adds, noting that in the past year alone, emotion detection technology has evolved and improved greatly. New algorithms, he says, “can detect emotions even when they are not explicitly expressed, such as through facial expressions or vocal cues. This has the potential to provide even more detailed and accurate insights into consumer emotions.”

Also key to better emotion detection and analysis is the amount of data that companies can collect from internal and third-party sources, according to Wettemann.

For example, hotel giant Hilton used Zenus’s Emotion AI technology at one of its trade shows to help determine the relative success of different marketing efforts, using discreetly placed cameras to detect each attendee’s movements and to record and analyze subtle contractions in their facial muscles at five to 10 frames per second. Based on the data collected, Zenus concluded that a “puppies and ice cream” event Hilton staged was more engaging than the open bar.
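
For a sense of what that sampling step can look like in code, here is a rough sketch. Only the OpenCV frame-reading calls are real library calls; score_expression is a hypothetical stand-in for the kind of facial-coding model a vendor such as Zenus actually runs.

```python
import cv2  # OpenCV, used here only to read and subsample video frames

TARGET_FPS = 8  # sample within the five-to-10-frames-per-second range


def score_expression(frame) -> float:
    """Hypothetical stand-in for a facial-coding model that scores
    engagement from subtle facial-muscle contractions."""
    return 0.0  # replace with a real model's inference call


def engagement_over_time(video_path: str) -> list[float]:
    """Subsample a video to roughly TARGET_FPS and score each kept frame."""
    capture = cv2.VideoCapture(video_path)
    native_fps = capture.get(cv2.CAP_PROP_FPS) or TARGET_FPS
    step = max(1, round(native_fps / TARGET_FPS))  # keep every Nth frame
    scores, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % step == 0:
            scores.append(score_expression(frame))
        index += 1
    capture.release()
    return scores
```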

HOW FAR EMOTION DETECTION HAS COME

Marketers today have the technology to understand how customers are reacting in real time, supplemented with external data that adds context to how customers are responding to marketing content, Wettemann says. Marketers can work with readily available technology to get very good, if not complete, intelligence on customers’ emotions in response to marketing materials.

But “for companies to be successful today, it’s not so much about building an AI engine; it’s about taking advantage of software-as-a-service offerings that are pre-trained models and tweaking them to improve their performance,” she adds. The changes are designed to capture customers’ true emotions more fully than purely out-of-the-box solutions can.
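
As a concrete, if simplified, picture of what that tweaking can look like, the sketch below fine-tunes an off-the-shelf checkpoint with the open-source Hugging Face transformers and datasets libraries. The checkpoint, label set, and two-example dataset are stand-ins; in practice the adjustment would happen through a vendor’s SaaS tooling and far more labeled data.

```python
# Minimal fine-tuning sketch; checkpoint, labels, and examples are illustrative.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

LABELS = ["frustrated", "neutral", "delighted"]  # assumed label set

examples = Dataset.from_dict({
    "text": ["Third call about the same bug.", "Thanks, that fixed it!"],
    "labels": [0, 2],
})

checkpoint = "distilbert-base-uncased"  # generic pre-trained starting point
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=len(LABELS))

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=64)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="emotion-tweak", num_train_epochs=3),
    train_dataset=examples.map(tokenize, batched=True),
)
trainer.train()  # the "tweak": adapt the generic model to company-specific data
```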

But even today, despite all of the recent advances, how close technologies can come to true emotion detection is still a matter of debate, executives at Deepgram note in a recent blog post. Even with in-person interactions, it can be difficult to determine a person’s true emotions. The challenge becomes even more pronounced when using technology to make those interpretations.

“Not only are you trying to make a system do something that’s tricky for humans, you’re going to do so using a dataset that humans have labeled based on the emotion that they think is present, even though they might not agree and even though their labels might not accurately match the emotion the [person] was actually feeling,” the company officials go on to say. “This is further complicated by the fact that the audio used to train the model might be acted data, not people actually expressing the emotion that they are feeling.”
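
One common way teams cope with that disagreement is to collect several labels per clip and keep only the ones with a clear majority. The sketch below shows the idea with invented clip IDs and labels.

```python
from collections import Counter

# Hypothetical labels from three annotators listening to the same clips.
annotations = {
    "clip_01": ["angry", "angry", "frustrated"],
    "clip_02": ["happy", "neutral", "surprised"],  # no agreement at all
}

def consensus(labels: list[str], min_share: float = 0.5):
    """Return the majority-vote label, or None when annotators disagree."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count / len(labels) > min_share else None

for clip, labels in annotations.items():
    print(clip, consensus(labels))  # clip_01 -> angry, clip_02 -> None
```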

Another advanced technology that has just begun to make its way into the field of emotion detection is video, though today most technologies still rely on audio cues, according to Deepgram. “It’s also the case that we do more with our voices than express emotion. For example, sarcasm in English carries a particular type of intonation that is recognizable, but sarcasm isn’t an emotion. This creates an added complication for emotion detection systems.”
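
As a rough illustration of the audio cues in question, the sketch below pulls a pitch contour and loudness from a clip with the open-source librosa library. The feature set is an assumption chosen for the example, not a description of Deepgram’s system; a downstream model would combine cues like these with the words themselves.

```python
import librosa  # open-source audio analysis library
import numpy as np

def prosodic_features(path: str) -> dict:
    """Extract simple intonation and energy cues from an audio clip."""
    audio, sr = librosa.load(path, sr=16000)
    # Fundamental frequency contour: how the pitch rises and falls.
    f0 = librosa.yin(audio, fmin=65, fmax=400, sr=sr)
    # Loudness (root-mean-square energy) over time.
    rms = librosa.feature.rms(y=audio)[0]
    return {
        "pitch_mean": float(np.mean(f0)),
        "pitch_range": float(np.max(f0) - np.min(f0)),  # wide swings can mark sarcasm
        "energy_mean": float(rms.mean()),
    }
```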

TAKING A CUE

Despite the challenges with emotion detection, Deepgram notes, there are advances in this type of technology in sales, education, and other disciplines to which marketers are paying close attention.

ZoomIQ, for example, offers a form of emotion detection technology specifically for sales, collecting data from and analyzing customer interactions during sales meetings.

SupportLogic is another company developing AI and related technology to understand customer escalation and churn management, advancing it to the level that it can understand the context of both customer and agent frustration and determine when companies need to take the next step, such as offering a discount to prevent a customer from churning.

Such capabilities are especially important in telecommunications and insurance, which have extremely high churn rates compared to other industries.

SupportLogic “has been able to do that because they focused on training the models with a specific set of data, then structured that data model for a particular case,” Wettemann says.
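
A simplified sketch of that kind of escalation logic follows. The frustration scores are assumed to come from an upstream emotion model, and the threshold and actions are illustrative assumptions, not SupportLogic’s actual rules.

```python
from dataclasses import dataclass

@dataclass
class Message:
    author: str         # "customer" or "agent"
    frustration: float  # 0.0-1.0, assumed output of an upstream emotion model

ESCALATE_AT = 0.7  # hypothetical threshold

def next_best_action(thread: list[Message]) -> str:
    """Pick a next step once either party's frustration crosses the threshold."""
    customer = [m.frustration for m in thread if m.author == "customer"]
    agent = [m.frustration for m in thread if m.author == "agent"]
    if customer and max(customer) >= ESCALATE_AT:
        return "escalate: offer retention discount"  # churn risk
    if agent and max(agent) >= ESCALATE_AT:
        return "escalate: coach or reassign agent"
    return "continue: no action needed"

thread = [Message("customer", 0.4), Message("agent", 0.2), Message("customer", 0.8)]
print(next_best_action(thread))  # escalate: offer retention discount
```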

Another significant development this year is an emotion detection algorithm developed by researchers at MIT that can detect emotions from facial expressions using deep learning techniques, according to Zeeshan Arif, founder and CEO of Whizpool, a software outsourcing and development firm based in Pakistan. The system outperformed previous approaches, producing more accurate results when predicting emotions from facial expressions; however, it also produced more false positives, flagging emotions that were not actually present.
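
That trade-off, better overall accuracy but more false alarms, is easiest to see with a quick precision-and-recall calculation on made-up confusion counts:

```python
def precision_recall(true_pos: int, false_pos: int, false_neg: int) -> tuple[float, float]:
    precision = true_pos / (true_pos + false_pos)  # share of detections that were right
    recall = true_pos / (true_pos + false_neg)     # share of real emotions that were caught
    return precision, recall

# Hypothetical counts for an older system and a newer, more sensitive one.
print(precision_recall(true_pos=70, false_pos=10, false_neg=30))  # (0.875, 0.70)
print(precision_recall(true_pos=90, false_pos=25, false_neg=10))  # (~0.78, 0.90)
```

In other words, a more sensitive system can catch more genuine emotions while also being more willing to see an emotion where there is none.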

Affectiva, a smaller company that specializes in emotion detection software, recently partnered with Microsoft on research to advance the technology.

And just to show how vital—and versatile—the technology is, eye tracking vendor Smart Eye acquired Affectiva, an MIT spinoff, for $73.5 million in May 2021.

Affectiva’s technology uses deep learning, computer vision, speech science, and Big Data to detect human emotions and cognitive states. Coupled with Smart Eye’s eye tracking technology, it creates a powerful tool. “Computer vision technology powered by AI is powerful enough to detect emotion in faces just like humans [can]. In fact, it could be more accurate because it can be trained on multiple datasets vs. a single human experience,” Mike Gualtieri, a Forrester Research analyst, said at the time.

Not all of these technologies were initially designed for marketing or sales. In fact, most are being used in education, healthcare, and other verticals. But if they prove to be successful in those industries, using this technology to determine customer emotions in marketing and sales won’t be far behind.

DEVELOPMENT CHALLENGES ABOUND

While visual clues can provide additional data about a person’s emotions, the use of such information brings with it serious privacy concerns. The American Civil Liberties Union (ACLU) and several other privacy groups have sought to limit the advancement of emotion analysis software, expressing concerns about inaccuracy, discrimination, privacy invasion, and other issues.

“Harvesting deeply personal data could make any entity that deploys this tech a target for snooping government authorities and malicious hackers. This move to mine users for emotional data points based on the false idea that AI can track and analyze human emotions is a violation of privacy and human rights,” the group said in a letter to Zoom CEO Eric Yuan in May. “This is an opportunity to show people you care about your users and your reputation.”

Microsoft said in June that it would retire its facial analysis capabilities that attempt to detect people’s emotional states (as well as gender, age, and other attributes) due to privacy concerns. But at the same time, its Xiaoice AI system, developed by the Microsoft Asia Software Technology Center (STCA) in 2014 based on an emotional computing framework, is being used in Chinese schools, and a Find Solution AI software package that detects tiny movements of muscles in students’ faces is being used by teachers to track students’ emotional, motivational, and focus changes.

Privacy concerns aside, though, most experts expect emotion detection technology to continue to advance and become more widespread in marketing. “As this technology becomes more sophisticated and refined, it will provide marketers with valuable insights into consumer emotions and help them to create more effective and engaging marketing materials,” Ganeshram predicts.

Biometrics and other neurotech will come to define the future of marketing and ad placement, Brooks suggests. “However, as we look forward, it’s not about automatically propelling into something that works better. It’s about changing the conversation with what companies will measure to get there. Most companies are stuck focusing their attention on media and marketing to last-click conversion optimization. The company that will be able to influence consumer motivation from the source will tap into unprecedented territory and wind up on top.”

Further development of the technology will depend in large part on companies understanding the value that it can offer, Wettemann says. It’s not the underlying technology that needs to evolve; it’s the way that it is packaged and marketed to a particular industry.

Technology companies serving industries with high levels of churn are much further along in developing true emotion detection solutions, according to Wettemann, whereas industries such as retail, with a lower concern over customer churn, are further behind.

“Companies today should not be buying AI to understand emotion but should be looking at the models that are already trained and built so that managers and marketers can use them and get up and running quickly, without data scientist intervention and with a feedback loop so that they continue to improve over time,” she concludes.