Emotion Tracking AI is the Latest Evolution in Affective Computing

Emotions have always been a fundamental aspect of human existence. Yet technology has largely ignored them, both because emotions are hard to quantify and because no technology existed that could interpret them. Times are changing, however, and new developments are likely in the near future. Affective computing itself is a relatively young concept, emerging long after researchers first set themselves the goal of scientifically quantifying emotion.

That goal of scientifically quantifying emotion dates back to 1872, when the first efforts were made to decode the face as a source of information. In that year, Charles Darwin published his book “The Expression of the Emotions in Man and Animals,” in which he details how individuals display emotions through expressions on their face. Take the example of a crying child: the emotion can be read from facial cues such as the tight closing of the eyes, contraction of the nose, and raising of the upper lip.

Development of the FACS System

Other researchers took this work further. In the 1970s and 80s, Paul Ekman and Wallace Friesen developed the Facial Action Coding System (FACS), inspired by Darwin’s ideas. FACS works by coding the face so that every tiny movement can be described in terms of Action Units.
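As a rough illustration of how Action Unit codes can be read, the sketch below maps a few simplified, commonly cited AU combinations to prototypical emotions. The combinations and the Python implementation are illustrative only, not the full FACS specification.

```python
# Illustrative mapping of FACS-style Action Units (AUs) to prototypical
# emotions. These are simplified versions of commonly cited combinations,
# not the complete FACS tables.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
}

def match_emotions(detected_aus: set[int]) -> list[str]:
    """Return emotions whose full AU prototype appears among the detected AUs."""
    return [emotion for emotion, prototype in EMOTION_PROTOTYPES.items()
            if prototype.issubset(detected_aus)]

# Example: AUs 6, 12 and 25 detected on a face -> read as happiness
print(match_emotions({6, 12, 25}))  # ['happiness']
```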

Origins of affective computing

Though the field’s origins trace back to these early inquiries into emotion, affective computing as a discipline began with Rosalind Picard’s 1995 paper on the subject. Affective computing, also called emotional artificial intelligence or Emotion AI, is the study and development of systems programmed to simulate, interpret, and process human emotions. It is an interdisciplinary field blending computer science, psychology, and cognitive science. The concept was proposed by Dr. Rosalind Picard, Founder and Director of the Affective Computing Research Group at MIT, who published a book laying out the framework in 1997.

In this field, the hardware or software driving a process forms a notion of the user’s response from various data inputs such as a camera or a microphone. Instead of picking up only a user’s typical responses, it can pick up signals such as tone of voice, facial expressions, body language, or any electrodermal activity accessible to the computer. The aim is technology advanced enough to determine a person’s emotions far more accurately.
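To make the idea concrete, here is a minimal, hypothetical sketch of the kind of late fusion such a system might perform, combining per-modality emotion scores from the face, the voice, and electrodermal activity into a single estimate. The modality weights and the upstream models that would produce the scores are assumptions made purely for illustration.

```python
# Hypothetical late-fusion sketch: combine per-modality emotion probabilities
# (face, voice, electrodermal activity) into one overall estimate. The weights
# and the example readings are assumed values, not from any real system.
from collections import defaultdict

MODALITY_WEIGHTS = {"face": 0.5, "voice": 0.3, "eda": 0.2}  # assumed weights

def fuse_emotion_scores(per_modality: dict[str, dict[str, float]]) -> str:
    """Weighted sum of each modality's emotion scores; return the top emotion."""
    combined = defaultdict(float)
    for modality, scores in per_modality.items():
        weight = MODALITY_WEIGHTS.get(modality, 0.0)
        for emotion, probability in scores.items():
            combined[emotion] += weight * probability
    return max(combined, key=combined.get)

# Example readings from hypothetical upstream face, voice and EDA models
readings = {
    "face":  {"joy": 0.7, "neutral": 0.3},
    "voice": {"joy": 0.4, "stress": 0.6},
    "eda":   {"stress": 0.9, "neutral": 0.1},
}
print(fuse_emotion_scores(readings))  # 'joy' under these assumed weights
```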

New possibilities in the field of affective computing

The advent of modern computing opened up new possibilities for affective computing: the development of facial recognition technology allowed researchers to start encoding emotions into machines. The concept has continued to evolve and is being adopted across many domains, such as healthcare, media and advertising, market research, automotive, retail, communication, and education. Affective computing is now a major trend, with a growing number of applications.

The following sections look at the application of emotion tracking AI in different industries.

Emotion tracking AI assists people on the autism spectrum

The medical field benefits from affective computing through applications developed to assist people on the autism spectrum in social interactions. According to Rosalind Picard, the motivation behind these projects came from people on the spectrum themselves. Emotion tracking AI was applied here because people with autism often have a hard time reading the facial expressions of the person they are conversing with. To address this, a tiny wearable camera was built into eyeglasses and paired with a program that scans the other person’s facial features and classifies their expression, such as confusion, interest, thinking, or other feelings that people subtly signal to each other in conversation.
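A heavily simplified version of such a pipeline might detect a face in each camera frame and hand the cropped region to an expression classifier. The sketch below uses OpenCV’s bundled Haar cascade for face detection; classify_expression is a hypothetical stand-in for the trained model a real device would run.

```python
# Simplified per-frame pipeline: detect a face, crop it, classify its expression.
# classify_expression is a placeholder for a trained model (hypothetical).
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_img) -> str:
    """Stand-in for a trained expression classifier."""
    return "interest"  # a real system would run its model here

capture = cv2.VideoCapture(0)  # wearable or webcam feed
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        label = classify_expression(gray[y:y + h, x:x + w])
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("expressions", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```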

This is how the medical and healthcare industry has benefited from emotion tracking AI. The next industry to adopt and benefit from the technology was the automotive industry.

Ensuring the safety of the passengers by tracking human emotions

The push to include emotion tracking AI in cars arose as carmakers looked for ways in which emotion recognition could improve vehicle safety and leverage emotional data to better understand the experience of drivers and passengers.

Companies such as Affectiva are on a mission to improve their technology by incorporating emotional intelligence into devices, systems, and digital experiences so that they can not only sense human emotions but adapt to them as well. The next generation of cars will have sophisticated infotainment systems that offer personalized experiences to their passengers. These conversational interfaces connect with the devices customers use in daily life and interact with the user not only to ensure safety but to deliver a meaningful experience.

How vehicles use emotion tracking AI to ensure safety

The following are ways in which emotion tracking AI is used in vehicles to ensure occupant safety:

Preventing Accidents: A vehicle equipped with emotion tracking artificial intelligence can monitor the driver’s attention. By keeping an eye on the driver, the technology identifies fatigue, distraction, and frustration, and can intervene before an accident actually happens (a minimal sketch of one such fatigue signal follows this list).
Perception of Emotional Stress: Emotion tracking AI can assist with controlling the operation of the vehicle or determine in real time when to introduce hints and narratives to counter the perceived emotional stress of passengers. These functions are carried out with the AI integrated with IoT devices and other data inputs such as traffic and weather patterns.

Personalized Experiences: Beyond emotional awareness, infotainment systems can make highly personalized recommendations and adjustments based on the emotions and moods of the car’s occupants, such as alternate routes, interesting stops along the way, and climate control.
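One widely used drowsiness signal in driver-monitoring research is PERCLOS, the proportion of time the eyes are closed over a sliding window. The sketch below shows the idea with assumed thresholds; it is not any vendor’s implementation, and the per-frame eye-openness values would come from an upstream landmark or gaze model.

```python
# PERCLOS-style fatigue check: fraction of recent frames in which the eyes
# were closed. Thresholds and window size are illustrative assumptions.
from collections import deque

class FatigueMonitor:
    def __init__(self, window_size=300, closed_threshold=0.15, perclos_alert=0.4):
        self.samples = deque(maxlen=window_size)   # ~10 s of frames at 30 fps
        self.closed_threshold = closed_threshold   # openness below this = closed
        self.perclos_alert = perclos_alert         # alert if closed this often

    def update(self, eye_openness: float) -> bool:
        """Add one frame's eye openness (0 = closed, 1 = open); True = alert."""
        self.samples.append(eye_openness < self.closed_threshold)
        if len(self.samples) < self.samples.maxlen:
            return False                           # not enough history yet
        perclos = sum(self.samples) / len(self.samples)
        return perclos >= self.perclos_alert

monitor = FatigueMonitor()
# In a real car, each frame's openness would come from the camera pipeline.
for openness in [0.05] * 300:                      # simulated drowsy driver
    alert = monitor.update(openness)
print("fatigue alert:", alert)                     # True
```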

To build these systems, automotive companies have collected videos of people driving in order to understand and interpret their emotions behind the wheel. In doing so they have not only worked to ensure the safety of the people in the car but also provided them with a personalized, meaningful experience. Beyond healthcare and automotive, emotion tracking technology is not limited to providing a safe and meaningful user experience; companies also use it for entirely different purposes.

Using emotion tracking AI in recruiting processes

Emotion tracking AI can also be used by a company for an entirely different purpose, such as its internal process for recruiting new employees. Take the example of Unilever, which used HireVue’s AI technology to screen prospective candidates based on parameters such as body language and mood. By applying HireVue’s emotion tracking AI to the recruitment process, Unilever can recruit people whose personalities and characteristics suit the job. HireVue’s algorithms predict a candidate’s likely reactions and behavior in certain situations from body language and facial expressions. The technology can also estimate whether a candidate is lying, as well as general confidence levels, based on how emotions change across responses during the interview.

These are some examples of how companies and industries have used emotion tracking AI in their own ways, whether to give users a meaningful experience or to support internal processes. In the same vein, other companies have used emotion tracking AI to directly shape their advertising and digital marketing.

Forecasting audience reactions by using emotion tracking AI

Emotion tracking AI was also used by Kellogg’s, a high-profile brand that used Affectiva’s software to test audience reactions to ads for its cereal. Consumers were shown multiple versions of an ad; one version produced the most laughs on first viewing but generated little engagement on the second viewing. With the help of emotion tracking AI, Kellogg’s chose the version that generated steadier levels of engagement over the course of multiple views. This is how some brands use emotion tracking AI to test audience responses and determine how viewers react to the marketing of their ads and products.
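The selection logic described here can be sketched as a simple comparison: reward versions whose engagement stays high across repeat viewings and penalize those that spike and then fall away. The numbers below are invented for illustration, not Kellogg’s or Affectiva data.

```python
# Pick the ad version with the steadiest engagement across repeat viewings.
# Engagement scores are invented for illustration.
from statistics import mean, pstdev

ad_engagement = {
    "version_a": [0.90, 0.40, 0.25],  # big laugh on first view, then drops off
    "version_b": [0.60, 0.58, 0.55],  # steadier across repeat views
}

def steadiness_score(scores):
    """Reward high average engagement, penalize variation across viewings."""
    return mean(scores) - pstdev(scores)

best = max(ad_engagement, key=lambda version: steadiness_score(ad_engagement[version]))
print(best)  # 'version_b'
```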

Future developments in the field of affective computing

There have been many developments in the field of affective computing, but the field is still at its dawn. More is to come in healthcare, where research labs are working on devices programmed to sense conditions ranging from pain to depression, which are difficult to diagnose today. An experiment at Ohio State University found that AI can recognize certain emotions more accurately than human participants, which signals the expansion of artificial intelligence into tasks previously handled by human beings. By 2022, the affective computing market is expected to be worth $41 billion.

To sum up, affective computing has developed across many industries over the years. Different industries have used it in different ways to determine individuals’ emotions and give them a meaningful experience. Since artificial intelligence can recognize some emotions more accurately than human participants, it is likely to be incorporated into many more jobs and sectors. Further developments, and deeper analysis of human emotion by AI, can be expected in the years ahead.
