AFFECTIVE COMPUTING
SUBMITTED BY:
NAVEED S
ROLL NO.-15
S7 CSE
COLLEGE OF ENGINEERING
PERUMON
ABSTRACT
Affective computing is computing that relates to, arises from, or deliberately influences emotions. Neurological studies indicate that emotions are essential to human cognition and play a critical role in rational decision-making, perception, human interaction and human intelligence. With the growth of human-computer interaction (HCI), it has become important that, for proper and full interaction between humans and computers, computers should at least be able to recognize and react to different user emotional states.
Emotion is difficult to classify and study fully, so replicating or detecting emotions in agents is a challenging task. In human-human interaction it is often easy to see whether a person is angry, happy, or frustrated; it is not easy to replicate such an ability in an agent. In this seminar I will deal with different aspects of affective computing, including a brief study of human emotions, theory and practice related to affective systems, challenges to affective computing, and systems that have been developed to support this type of interaction. I will also make a brief foray into the ethics of this field, as well as the implications of computers that have emotions of their own. Affective computing is an emerging, interdisciplinary area addressing a variety of research, methodological, and technical issues pertaining to the integration of affect into human-computer interaction.
The specific research areas include recognition of distinct affective states, adaptation of the user interface and its functions in response to changes in the user's affective state, and supporting technologies such as wearable computing for improved affective state detection and adaptation.
INTRODUCTION
Affective computing aims at developing computers with understanding capabilities
vastly beyond today’s computer systems. Affective computing is computing that
relates to, or arises from, or deliberately influences emotion. Affective computing also
involves giving machines skills of emotional intelligence: the ability to recognize and
respond intelligently to emotion, the ability to appropriately express (or not express)
emotion, and the ability to manage emotions. The latter ability involves handling both
the emotions of others and the emotions within oneself.
Today, more than ever, the role of computers in interacting with people is of
importance. Most computer users are not engineers and do not have the time or desire
to learn and stay up to date on special skills for making use of a computer’s
assistance. The emotional abilities given to computers are intended to help address the problem of interacting with complex systems, leading to smoother interaction between the two. Emotional intelligence, that is, the ability to respond to one's own and others' emotions, is often viewed as more important than mathematical or other forms of intelligence. Equipping computer agents with such intelligence will be a keystone of their future.
Emotions in people consist of a constellation of regulatory and biasing mechanisms,
operating throughout the body and brain, modulating just about everything a person
does. Emotion can affect the way you walk, talk, type, gesture, compose a sentence,
or otherwise communicate. Thus to infer a person’s emotion, there are multiple
signals you can sense and try to associate with an underlying affective state.
Depending on which sensors are available (auditory, visual, textual, physiological, biochemical, etc.), one can look for different patterns of emotion's influence. The most active areas for machine emotion recognition have been automated facial expression recognition, vocal inflection recognition, and reasoning about emotion given text input about goals and actions. The sensed signals are then processed using pattern recognition techniques such as hidden Markov models (HMMs), hidden decision trees, auto-regressive HMMs, support vector machines and neural networks.
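As a rough illustration of how such techniques are applied, the following Python sketch trains a support vector machine on a matrix of physiological features. The feature layout, labels and random data are placeholders for this example only, not the actual experimental setup described in this report.

# Illustrative sketch: classifying affective states from pre-computed
# physiological features with a support vector machine, one of the techniques
# named above. Features, labels and the data itself are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row per recording; columns might be mean GSR, GSR variance,
# mean heart rate, respiration rate, EMG power, etc. (stand-in values here).
X = np.random.rand(200, 5)
y = np.random.choice(["neutral", "anger", "joy", "sadness"], size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Scale the features, then fit an RBF-kernel SVM classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))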
The response of such an affective system is also a very important consideration. It could have a preset response to each user emotional state, or it could learn over time by trying out different strategies on the user to see which are most pleasing. Indeed, a core property of such learning systems is the ability to sense positive or negative feedback – affective feedback – and incorporate it into the learning routine. A wide range of uses has been identified and implemented for such systems, ranging from systems that detect the stress level of car drivers to toys that sense the mood of a child and react accordingly.
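As a minimal sketch of this idea, assuming the sensed affective feedback has already been reduced to a single numeric reward, a system could learn which of several preset response strategies a user finds most pleasing. The strategy names and the feedback source below are hypothetical.

# Illustrative sketch only: choosing among preset response strategies using
# affective feedback as a reward signal (a simple epsilon-greedy scheme).
import random

strategies = ["apologize", "offer_help", "tell_joke", "stay_quiet"]
value = {s: 0.0 for s in strategies}   # running estimate of how pleasing each strategy is
count = {s: 0 for s in strategies}
epsilon = 0.1                          # fraction of the time a random strategy is explored

def choose_strategy():
    if random.random() < epsilon:
        return random.choice(strategies)            # explore
    return max(strategies, key=lambda s: value[s])  # exploit the best strategy so far

def update(strategy, reward):
    # reward: positive if the sensed affective response improved, negative otherwise
    count[strategy] += 1
    value[strategy] += (reward - value[strategy]) / count[strategy]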
A GENERAL OVERVIEW
A general affective system, consisting of users who have affect or emotion and the surrounding world, can be described in terms of a few key components, outlined below.
The most important component of the system is the emotive user: any user or being who has emotions and whose actions and decisions are influenced by those emotions. The user forms the core of any affective system, and this affect enables the user to communicate with other humans, with computers, and with himself or herself. Human-to-human affective communication is a widely studied branch of psychology and was one of the base subjects explored when affective computing was first considered.
In general terms, a human user displays an emotion. This emotion is sensed by one of the interfaces to the affective application, which might be a wearable computer or any other device designed for capturing affective signals. A pattern recognition algorithm is then applied to recognize the affective state of the user, and the recognized state is modeled. This information is passed to an affective application or affective computer, which uses it to communicate back with the emotive user. Research is also going on into synthesizing affect in computers, which will add a further dimension to human-computer interaction. Each of the dimensions of affective interaction is discussed below.
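The sense, recognize and respond loop described above can be summarized in a small Python outline. Every component named in it (read_sensors, recognize_state, choose_response) is a hypothetical placeholder rather than part of an actual system.

# Illustrative outline of the sense -> recognize -> respond loop described above.
# All component names and values are stand-ins for the example.

def read_sensors():
    """Collect raw affective signals, e.g. from a wearable device."""
    return {"gsr": 4.2, "heart_rate": 78, "respiration": 14}   # stand-in values

def recognize_state(signals):
    """Apply a (stand-in) pattern recognition step to map signals to a state."""
    return "stressed" if signals["gsr"] > 4.0 else "calm"

def choose_response(state):
    """Let the affective application decide how to communicate back."""
    return "slow down and offer help" if state == "stressed" else "carry on"

signals = read_sensors()
state = recognize_state(signals)
print(state, "->", choose_response(state))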
Affective Mediation
Technology supporting machine-mediated communication continues to grow and
improve, but much of it still remains impoverished with respect to emotional
expression. While much of the current research in the group focuses on sensing and
understanding the emotional state of the user or the development of affective
interfaces, research in Affective Mediation explores ways to increase the "affective
bandwidth" of computer-mediated communication through the use of graphical
visualization. By graphical visualization, it means the representation of emotional
information in an easy-to-understand, computer graphics format. Currently the focus
is on physiological information, but it may also include behavioral information (such
as if someone is typing louder or faster than usual.) Building on traditional
representations of physiological signals – continuously updating line graphs -- one
approach is to represent the user's physiology in three-dimensional, real-time
computer graphics, and to provide unique, innovative, and unobtrusive ways to collect
the data. This research focuses on using displays and devices in ways which will help
humans to communicate both with themselves and with one another in affect-
enhanced ways.
Human-to-Human Communication
From email to full-body videoconferencing, virtual communication is growing rapidly
in availability and complexity. Although this richness of communication options
improves our ability to converse with others who are far away or not available at the
precise moment that we are, the sense that something is missing continues to plague
users of current methodologies. Affective Communication seeks to provide new
devices and tools for supplementing person-to-person communications media.
Specifically, through the use of graphical displays viewable by any or all members of
a mediated conversation, researchers hope to provide an augmented experience of
affective expression that supplements, but also challenges, traditional computer-mediated communication.
Human-to-Self (Reflexive) Communication
Digitized representation of affective responses creates new possibilities for our relationship to our own bodies and affective response patterns. Affective communication with oneself – reflexive communication – explores the exciting
possibilities of giving people access to their own physiological patterns in ways
previously unavailable, or available only to medical and research personnel with
special, complex, or expensive equipment. The graphical approach creates new
technologies with the express goal of allowing the user to gain information and
insight about his or her own responses.
Computer expression of emotion
This work represents a controversial area of human-computer interaction, in part since
attributing emotions and emotional understanding to machines has been identified as a
philosophical problem: what does it mean for a machine to express emotions that it
doesn't feel? What does it mean for humans to feel "empathized with" by machines
that are simply unable to really "feel" what a person is going through? Currently, few
computer systems have been designed specifically to interact on an emotional level.
An example is the smile that Macintosh users are greeted with, indicating that "all is
well" with the boot disk. If there is a problem with the boot disk, the machine displays
the "sad Mac".
Humans are experts at interpreting facial expressions and tones of voice, and making
accurate inferences about others' internal states from these cues. Controversy rages over anthropomorphism: should researchers leverage this expertise in the service of computer interface design, given that attributing human characteristics to machines often means setting unrealistic, even unfulfillable, expectations about the machine's capabilities? Show a human face, and users may expect human capabilities that far outstrip the machine's. Yet the fact remains that faces have been used effectively to represent a
wide variety of internal states. And with careful design, researchers regard emotional
expression via face and sound as a potentially effective means of communicating a
wide array of information to computer users. As systems become more capable of
emotional communication with users, researchers see systems needing more and more
sophisticated emotionally-expressive capability.
SENSING HUMAN EMOTIONS
Sensors are an important part of an Affective Computing System because they
provide information about the wearer's physical state or behavior. They can gather
data in a continuous way without having to interrupt the user. Many types of sensors are being developed to detect different types of emotional responses. Some are listed below.
The Galvanic Skin Response (GSR) Sensor
Galvanic Skin Response is a measure of the skin's conductance between two
electrodes. Electrodes are small metal plates that apply a safe, imperceptibly tiny
voltage across the skin. The electrodes are typically attached to the subject's fingers or
toes using electrode cuffs or to any part of the body using a Silver-Chloride electrode
patch. To measure the resistance, a small voltage is applied to the skin and the skin's
current conduction is measured. Skin conductance is considered to be a function of
the sweat gland activity and the skin's pore size. An individual's baseline skin
conductance will vary for many reasons, including gender, diet, skin type and
situation. Sweat gland activity is controlled in part by the sympathetic nervous
system. When a subject is startled or experiences anxiety, there will be a fast increase in the skin's conductance (within a period of seconds) due to increased activity in the sweat glands (unless the glands are already saturated with sweat).
After a startle, the skin's conductance will decrease naturally due to reabsorption.
There is a saturation to the effect: when the duct of the sweat gland fills there is no
longer a possibility of further increasing skin conductance. Excess sweat pours out of
the duct. Sweat gland activity increases the skin's capacity to conduct the current
passing through it and changes in the skin conductance reflect changes in the level of
arousal in the sympathetic nervous system.
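As a rough illustration of how this property can be exploited, the following sketch flags a startle-like skin conductance response as a rapid rise over a short window. The sampling rate, threshold and synthetic signal are assumptions made only for the example.

# Illustrative sketch: flagging startle-like skin conductance responses (SCRs)
# as a rapid rise over a short window. Sampling rate and threshold are
# hypothetical values chosen for the example.
import numpy as np

FS = 32                 # samples per second (assumed)
WINDOW_S = 2            # look for rises over a 2-second window
RISE_THRESHOLD = 0.05   # microsiemens; a rise larger than this counts as a response

def detect_scr(conductance):
    """Return sample indices where conductance rose quickly above its recent level."""
    window = int(WINDOW_S * FS)
    events = []
    for i in range(window, len(conductance)):
        rise = conductance[i] - conductance[i - window]
        if rise > RISE_THRESHOLD:
            events.append(i)
    return events

# Synthetic signal: flat baseline with one abrupt rise around t = 10 s.
t = np.arange(0, 20, 1 / FS)
signal = 2.0 + 0.2 / (1 + np.exp(-(t - 10) * 3))
print("first response at ~", detect_scr(signal)[0] / FS, "seconds")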
The Blood Volume Pulse Sensor
The blood volume pulse (BVP) sensor uses photoplethysmography, a process of applying a light source and measuring the light reflected by the skin, to measure the blood volume pulse in the extremities. At each contraction of the heart, blood is forced through the peripheral vessels, producing engorgement of the vessels under the light source and thereby modifying the amount of light reaching the photosensor. The resulting waveform is recorded. Since vasomotor activity (activity which controls the size of
the blood vessels) is controlled by the sympathetic nervous system, the BVP
measurements can display changes in sympathetic arousal. An increase in the BVP
amplitude indicates decreased sympathetic arousal and greater blood flow to the
fingertips.
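For illustration, pulse rate and BVP amplitude can be estimated from such a waveform by simple peak detection. The sampling rate, synthetic waveform and minimum peak spacing in the sketch below are assumptions made for the example.

# Illustrative sketch: estimating pulse rate and BVP amplitude from a
# photoplethysmographic waveform by peak detection. Sampling rate and the
# synthetic waveform are stand-ins for a real recording.
import numpy as np
from scipy.signal import find_peaks

FS = 64                                   # samples per second (assumed)
t = np.arange(0, 30, 1 / FS)              # 30 seconds of signal
bvp = np.sin(2 * np.pi * 1.2 * t)         # stand-in waveform: ~72 beats/min

# Require peaks to be at least 0.4 s apart (no more than 150 beats/min).
peaks, _ = find_peaks(bvp, distance=int(0.4 * FS))

beats_per_min = len(peaks) / (t[-1] - t[0]) * 60
amplitude = bvp[peaks].mean() - bvp.min()   # crude peak-to-trough estimate
print(f"pulse rate ~ {beats_per_min:.0f} bpm, BVP amplitude ~ {amplitude:.2f}")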
The Respiration Sensor
The respiration sensor can be placed either over the sternum for thoracic monitoring or over the diaphragm for diaphragmatic monitoring. In all experiments so far we have used diaphragmatic monitoring. The sensor consists mainly of a large Velcro belt that extends around the chest cavity and a small elastic element that stretches as the subject's chest cavity expands. The amount of stretch in the elastic is measured as a voltage change and recorded. From the waveform, the depth of the subject's breath and the subject's rate of respiration can be determined.
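As a small illustration, respiration rate and breath depth can be estimated from the recorded stretch waveform. The sampling rate and stand-in waveform below are assumptions made for the example.

# Illustrative sketch: estimating respiration rate from the belt's stretch
# waveform by counting zero crossings of the mean-centred signal, and breath
# depth as its peak-to-trough range. Sampling rate and waveform are stand-ins.
import numpy as np

FS = 32                                         # samples per second (assumed)
t = np.arange(0, 60, 1 / FS)                    # one minute of signal
stretch = np.sin(2 * np.pi * 0.25 * t + 0.5)    # stand-in waveform: ~15 breaths/min

centred = stretch - stretch.mean()
crossings = np.sum(centred[:-1] * centred[1:] < 0)   # sign changes
duration_min = len(stretch) / FS / 60
breaths_per_min = crossings / 2 / duration_min       # two crossings per breath
depth = stretch.max() - stretch.min()

print(f"respiration rate ~ {breaths_per_min:.0f} breaths/min, depth ~ {depth:.2f}")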
The Electromyogram (EMG) Sensor
The electromyographic sensor measures the electrical activity produced by a muscle when it contracts, amplifies the signal and sends it to the encoder, where a band-pass filter is applied. For all our experiments, the sensor has used the 0–400 microvolt range and a 20–500 Hz filter, which are the most commonly used settings.
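The analog filtering described above can be approximated digitally. The sketch below applies a 20–500 Hz Butterworth band-pass filter to a sampled EMG signal; the sampling rate, filter order and stand-in signal are chosen only for illustration.

# Illustrative sketch: a digital approximation of the 20-500 Hz band-pass
# filtering described above, applied to a sampled EMG signal. The sampling
# rate and filter order are assumptions made for the example.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000                         # samples per second (assumed; must exceed 2 x 500 Hz)
LOW_HZ, HIGH_HZ = 20, 500         # pass band from the sensor description

def bandpass_emg(raw, fs=FS, order=4):
    """Return the EMG signal with content outside 20-500 Hz attenuated."""
    b, a = butter(order, [LOW_HZ, HIGH_HZ], btype="bandpass", fs=fs)
    return filtfilt(b, a, raw)    # zero-phase filtering

# Example: a noisy stand-in signal with a slow 1 Hz drift that the filter removes.
t = np.arange(0, 2, 1 / FS)
raw = 0.5 * np.sin(2 * np.pi * 1 * t) + 0.05 * np.random.randn(len(t))
clean = bandpass_emg(raw)
print("std before:", raw.std(), "after:", clean.std())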
RECOGNIZING AFFECTIVE INPUT
The research work mainly involves efforts to understand the correlation between emotion and its behavioral and physiological expressions, which a computer can potentially identify. Because we can measure physical events but cannot read a person's thoughts, research in recognizing emotion is limited to correlates of emotional expression that a computer can sense, including physiology, behavior, and even word selection when talking. Emotion
modulates not just memory retrieval and decision-making (things that are hard for a
computer to know), but also many sense-able actions such as the way you pick up a
pencil or bang on a mouse (things a computer can begin to observe). In assessing a
user's emotion, one can also measure an individual's self-report of how they are
feeling. Many people have difficulty recognizing and/or verbally expressing their
emotions, especially when there is a mix of emotions or when the emotions are
nondescript. In many situations it is also inappropriate to interrupt the user for a self-
report. Nonetheless, researchers think it is important that if a user wants to tell a
system verbally about their feelings, the system should facilitate this. Researchers are
interested in emotional expression through verbal as well as non-verbal means, not
just how something is said, but how word choice might reveal an underlying affective
state.
Our focus begins by looking at physiological correlates, measured both during lab
situations designed to arouse and elicit emotional response, and during ordinary (non-
lab) situations, the latter via affective wearable computing.
Our first efforts toward affect recognition have focused on detecting patterns in
physiology that we receive from sensing devices. To this end, we are designing and conducting experiments to induce particular affective responses. One of our primary goals is to determine which signals are related to which emotional states – in other words, how to find the link between a user's emotional state and the corresponding physiological state. We are hoping to use, and build upon, some of the
work done by others on coupling physiological information with affective states.
Current efforts that use physiological sensing are focusing on:
• GSR (Galvanic Skin Response),
• ECG (Electrocardiogram),
• EMG (Electromyogram),
• BVP (Blood Volume Pulse),
• Respiration, and
• Temperature.
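As one simple illustration of what "detecting patterns in physiology" can mean in practice, per-channel summary statistics can be extracted from these signals and passed to a classifier such as the one sketched in the introduction. The channel names, sampling rate and window length here are assumptions.

# Illustrative sketch: turning raw physiological channels into a small feature
# vector (per-channel summary statistics) that a classifier could consume.
# Channel names, sampling rate and window length are stand-ins for the example.
import numpy as np

CHANNELS = ["gsr", "ecg", "emg", "bvp", "respiration", "temperature"]

def extract_features(window):
    """window: dict mapping channel name -> 1-D numpy array of samples."""
    features = []
    for name in CHANNELS:
        x = window[name]
        features.extend([
            x.mean(),                  # average level
            x.std(),                   # variability
            np.abs(np.diff(x)).mean()  # mean absolute first difference
        ])
    return np.array(features)

# Example: one 10-second window of stand-in data sampled at 32 Hz.
window = {name: np.random.randn(320) for name in CHANNELS}
print(extract_features(window).shape)   # -> (18,) : 3 features x 6 channels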
Affective Tutor
Another good application of affective computing is education. Computers are widely used to impart quality education to students, but most of these CBTs, or computer-based tutorials, are either linear – that is, they follow a fixed course – or they adapt only to the student's ability as gauged from responses to test situations. Even such adaptation is very limited.
An affective tutor, on the other hand, would be able to gauge the student's understanding as well as whether he or she is bored, confused, strained or in any other psychological state that affects learning, and consequently change its presentation or tempo, just as a human teacher would. This would increase the student's grasp of the subject and improve the overall output of the system.
Affective DJ
Another application that has been developed is a digital music delivery system that plays music based on the user's current mood and listening preferences. The system is able to detect that the user is currently experiencing a feeling of sorrow or loneliness and consequently select a piece of music that it expects will help change that mood. It can also alter the current playlist if it senses that the user is getting bored of it, or that the music has already shifted the user's affect to another state. Another promising development is a video retrieval system that might help identify not just scenes with a particular actor or setting, but scenes with a particular emotional content: fast-forward to the "most exciting" scenes. This would allow the user to watch scenes that feature his or her favorite actors and also suit the user's current mood.
Affective Toys
In an age when robotic toys are the craze, affective toys will soon enter the toy world to fill the void left by robots that cannot show or have emotions, emotions that must instead be attributed to them by an imaginative child. Affective toys, on the other hand, will have emotions of their own and will be able to exchange these emotions with the child, much as a human playmate would. The most famous affective toy is The Affective Tigger, a reactive, expressive toy: the stuffed tiger reacts to a human playmate with a display of emotion based on its perception of the mood of play.
Affective Avatars
Virtual reality avatars that accurately, and in real time, represent the physical manifestations of their users' real-world affective state are a dream of hard-core game players. They would enjoy the game more, and feel more a part of it, if their avatars behaved just as they would in a similar scenario. They would like their avatar to be scared when they are scared, angry when they are angry, and excited whenever they feel excited. Work has been progressing in this direction.
For example, AffQuake is an attempt to incorporate signals that relate to a player's
affect into id Software's Quake II in a way that alters game play. Several
modifications have been made that cause the player's avatar within Quake to alter its
behaviors depending upon one of these signals. For example, in StartleQuake, when a
player becomes startled, his or her avatar also becomes startled and jumps back.
Many other applications have arisen with continuing research in this field, and more and more possibilities are opening up every day.
CONCLUSION
In this seminar I have tried to provide a basic framework of the work done in the field of affective computing. Over the years, scientists have aimed to make machines that are intelligent and that help people use their native intelligence. However, they have almost completely neglected the role of emotion in intelligence, leaving emotions almost entirely ignored. This does not mean that newer research should aim solely to increase the affective ability of computers; too much emotion is as bad as, and possibly worse than, no emotion. A great deal of research is therefore needed to learn how affect can be used in a balanced, respectful, and intelligent way; this should be the aim of affective computing as we develop new technologies that recognize and respond appropriately to human emotions. The science is still very young, but it shows great promise and should contribute more to HCI than even the advent of the GUI and speech recognition. The research is promising and will make affective computing an essential tool in the future.
REFERENCES
• J. Scheirer, R. Fernandez, J. Klein, and R. W. Picard (2002),
"Frustrating the User on Purpose: A Step Toward Building an
Affective Computer"
• Jonathan Klein, Youngme Moon and Rosalind W. Picard (2002), "This
Computer Responds to User Frustration"
• Rosalind W. Picard, Jonathan Klein (2002), "Computers that
Recognise and Respond to User Emotion: Theoretical and Practical
Implications"
• Rosalind W. Picard and Jocelyn Scheirer (2001), "The Galvactivator:
A Glove that Senses and Communicates Skin Conductivity"
• Carson Reynolds and Rosalind W. Picard (2001), "Designing for
Affective Interactions"