Review
TRENDS in Cognitive Sciences Vol.8 No.12 December 2004
Emotions: from brain to robot
Michael A. Arbib¹ and Jean-Marc Fellous²
¹Computer Science, Neuroscience and USC Brain Project, University of Southern California, Los Angeles, CA 90089-2520, USA
²Biomedical Engineering Department and Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0281, USA
Some robots have been given emotional expressions in
an attempt to improve human–computer interaction. In
this article we analyze what it would mean for a robot to
have emotion, distinguishing emotional expression for
communication from emotion as a mechanism for the
organization of behavior. Research on the neurobiology
of emotion yields a deepening understanding of inter-
acting brain structures and neural mechanisms rooted in
neuromodulation that underlie emotions in humans and
other animals. However, the chemical basis of animal
function differs greatly from the mechanics and compu-
tations of current machines. We therefore abstract from
biology a functional characterization of emotion that
does not depend on physical substrate or evolutionary
history, and is broad enough to encompass the possible
emotions of robots.
Not all emotions need be like human emotions. We analyze emotion in two main senses:
(1) Emotional expression for communication and social
coordination.
(2) Emotion for organization of behavior (action
selection, attention and learning).
The first concerns the ‘external’ aspects of emotions; the
second, the ‘internal’ aspects. In animals, these aspects have
co-evolved. How might they enter robot design? Both
robots and animals need to survive and perform efficiently
within their ‘ecological niche’ and, in each case, patterns of
coordination will greatly influence the suite of relevant
emotions (if such are indeed needed) and the means
whereby they are communicated.
A key function of emotion is to communicate simplified
but high impact information. A scream is extremely poor
in information (it says nothing about the cause for alarm),
but its impact on others is high. Moreover, neurobiology
shows that simplified but high impact information is
communicated between brain areas, through the very
different ‘vocabulary’ of neuromodulation.
The similarity in facial expressions between certain animals and humans prompted classic evolutionary analyses [2], which support the view that mammals
(at least) have emotions (although not necessarily the
same as human emotions), and work reviewed below
explores their (neuro)biological underpinnings. What of
robots? Robots are mechanical devices with silicon
‘brains’, not products of biological evolution. But as we
better understand biological systems we will extract ‘brain
operating principles’ that do not depend on the physical
medium in which they are implemented. These principles
might then be instantiated for particular robotics archi-
tectures to the point where we might choose to speak of
robot-emotions.
Many researchers (see for example [3,4]) have proposed
explicit functions for emotions: coordinating behavioral
response systems; shifting behavioral hierarchies; com-
munication and social bonding; short-cut cognitive proces-
sing; and facilitating storage and recall of memories.
However, emotions are not always beneficial [5] – if one is
caught in a traffic jam, anger can easily set in, but anger in
this case has no apparent usefulness. How does the brain
maximize the benefits of emotion yet minimize its occa-
sional inappropriateness? And how would the understand-
ing of such tradeoffs affect our ideas about robot designs?
Interest in the creation of robots with emotions is fourfold.
First, current technology already shows the value of pro-
viding robots with ‘emotional’ expressions (e.g. computer
tutors) and bodily postures (e.g. robot pets) to facilitate
human–computer interaction. Second, this raises the
question of the possible value (or inevitability) of future
robots not only simulating emotional expression but
actually ‘having emotions’. Third, this in turn requires
us to re-examine the neurobiology of emotion to generalize
concepts first developed for humans and then extended to
animals so that the question of robot emotions becomes
meaningful. And fourth, this suggests in turn that build-
ing ‘emotional robots’ could also provide a novel test-bed
for theories of biological emotion.
This article samples the state of the art on current robot
technology, and examines recent work on the neurobiology
of emotions, to ground our suggestions for a scientific
framework in which to approach robot ‘emotions’. The
question of ‘emotional robots’ being used to test theories of
biological emotion is of great interest, but beyond the
scope of this article.
Different kinds of emotions
There is a wide spectrum of feelings, from the ‘motivation’ afforded by drives such as hunger, which prompts the search for food [1], to ‘emotions’ in which, at least in humans,
cognitive awareness might be linked to feeling the ‘heat’
of love, sorrow or anger, and so on. But as we have no
criterion for saying that a robot has ‘feelings’, we will seek
here to understand emotions in their functional context.
From ethology to robot motivation and emotion
We advance our discussion by reviewing work that has added emotion-like features to robotic systems [6], some
(Corresponding authors: Michael A. Arbib (arbib@pollux.usc.edu), Jean-Marc Fellous (fellous@duke.edu). Available online 2 November 2004. © 2004 Elsevier Ltd. All rights reserved. doi:10.1016/j.tics.2004.10.004)
work inspired by ethology (the study of animal behavior),
and we then sample attempts to analyze the role of
emotion in general ‘cognitive architectures’ at the inter-
face between artificial intelligence and cognitive science.
Ortony et al. [18] distinguish three levels of information processing:
(1) Reactive: a hard-wired releaser of fixed action patterns and an interrupt generator. This level has only the most rudimentary affect.
(2) Routine: the locus of unconscious well-learned
automatized activity and primitive and unconscious
emotions.
(3) Reflective: the home of higher-order cognitive
functions, including metacognition, consciousness, self-
reflection, and full-fledged emotions.
Ortony et al. address the design of emotions in
computational agents (these include ‘softbots’ as well as
embodied robots) that must perform unanticipated tasks
in unpredictable environments. They argue that such
agents, if they are to function effectively, must be endowed
with curiosity and expectations, and a ‘sense of self’ that
reflects parameter settings that govern the agent’s
functioning.
Sloman [19] also offers a three-level view of central
processes:
(1) Reactive: producing immediate actions. When
inconsistent reactions are simultaneously activated one
may be selected by a competitive mechanism.
(2) Deliberative: using explicit hypothetical represen-
tations of alternative possible predictions or explanations,
comparing them and selecting a preferred option.
(3) Meta-management: allowing internal processes to
be monitored, categorized, evaluated and controlled or
modulated.
Sloman also notes the utility of an ‘alarm’ system – a
reactive component that gets inputs from and sends
outputs to all the other components and detects situations
where rapid global redirection of processing is required.
To reconcile the two schemes, we suggest using four
levels: reactive, routine, reflective–deliberative, and
reflective–meta-management.
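As an illustration, the four-level scheme with Sloman’s ‘alarm’ component can be caricatured in code. This is a minimal sketch of our own: the class names, the single threat score, and the alarm threshold are all invented for exposition, not part of any published architecture.

```python
# Toy sketch of a layered control scheme with an alarm component that
# forces rapid global redirection, bypassing the slower levels.
from dataclasses import dataclass


@dataclass
class Percept:
    label: str
    threat: float = 0.0   # 0..1, how urgently this percept demands action


class Level:
    def __init__(self, name: str, latency_s: float):
        self.name, self.latency_s = name, latency_s

    def propose(self, percept: Percept) -> str:
        # A real system would run level-specific machinery here
        # (releasers, learned routines, deliberation, self-monitoring).
        return f"{self.name}-response to {percept.label}"


class Agent:
    def __init__(self):
        # Ordered from fastest/most reflexive to slowest/most reflective.
        self.levels = [Level("reactive", 0.01),
                       Level("routine", 0.1),
                       Level("reflective-deliberative", 1.0),
                       Level("reflective-meta", 10.0)]
        self.alarm_threshold = 0.8

    def act(self, percept: Percept) -> str:
        if percept.threat >= self.alarm_threshold:
            # Alarm fires: the reactive level preempts everything else.
            return self.levels[0].propose(percept)
        # Otherwise the most deliberative level responds.
        return self.levels[-1].propose(percept)


agent = Agent()
print(agent.act(Percept("looming object", threat=0.95)))  # reactive wins
print(agent.act(Percept("chess position", threat=0.0)))
```

The point of the sketch is only that the alarm is a cheap, content-poor signal (a single number) whose effect is a wholesale change in which machinery controls behavior.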
Finally, recent work in multi-agent teamwork suggests
that in virtual organizations, agents that simulate
emotions would be more believable to humans and could
anticipate human needs by appropriate modeling of
others [15,20].
Robot ethology
In the ‘ethological robots’ reviewed here, each drive, perceptual releaser, motor- and emotion-related process is modeled as a different ‘specialist’ [7] or ‘schema’ [8].
Each schema computes on the basis of its inputs to update
its ‘activation level’ and internal state, and sends on
appropriate outputs. A robot controller based on the
ethology of the praying mantis [9] provided four different
motivated visuomotor behaviors: prey acquisition, pre-
dator avoidance, mating and chantlitaxia (coined by
Rolando Lara, from the Nahuatl word chantli for ‘shelter’
and the Latin taxia for ‘attraction’). For prey acquisition,
hunger is the primary motivator; for predator avoidance,
fear serves similarly; for mating the sex drive dominates.
The behavioral controller was implemented on a small
hexapod robot.
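The schema-style control described above can be sketched as follows. The structure (an activation level driven by a drive and its triggering stimulus, with the most activated schema taking control) follows the text; the specific update rule and all numbers are our own invented placeholders, not the published controller.

```python
# Minimal sketch of schema-based action selection for motivated behaviors.
class Schema:
    def __init__(self, name: str):
        self.name = name
        self.activation = 0.0

    def update(self, drive: float, stimulus: float) -> None:
        # Activation grows with the relevant drive and its triggering
        # stimulus (a deliberately simple multiplicative rule).
        self.activation = drive * stimulus


def select_behavior(schemas):
    # Winner-take-all: the most activated schema controls the robot.
    return max(schemas, key=lambda s: s.activation)


prey = Schema("prey-acquisition")       # motivated by hunger
escape = Schema("predator-avoidance")   # motivated by fear
mate = Schema("mating")                 # motivated by the sex drive

# Hunger is high, but a predator is also detected: fear dominates.
prey.update(drive=0.7, stimulus=0.6)
escape.update(drive=0.9, stimulus=0.9)
mate.update(drive=0.2, stimulus=0.0)

print(select_behavior([prey, escape, mate]).name)  # predator-avoidance
```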
In Bowlby’s [10] theory of attachment, infants view
certain individuals as sources of comfort, with the ‘comfort
zone’ depending on the circumstances. Likhachev and Arkin [11] extrapolated these ideas to produce useful
behavior in autonomous robots, rather than to model the
human child. The ‘comfort zone’ ensures that the robot
does not stray from a given task or area; it can also provide
a basis for creating a robot pet that can relate to a
particular human being.
Studies of canine ethology support work on human–robot bonding for Sony’s AIBO [12], a speaking dog-like entertainment robot. Ekman’s model [13], with its six basic emotional states, has been influential in work on emotional expression in robots. The Kismet robot [14], for
example, can communicate an emotive state and other
social cues to a human through facial expressions, gaze,
and quality of voice. The computations needed to commu-
nicate an ‘emotional state’ to a human might also improve
the way robots function in the human environment.
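In the spirit of Ekman’s six basic states as used by expressive robots such as Kismet, the mapping from a discrete emotional state to actuator commands can be sketched as a lookup of canonical expression parameters scaled by intensity. The parameter names and all numeric values below are invented for illustration; they are not Kismet’s actual control variables.

```python
# Illustrative mapping from a discrete emotional state to (hypothetical)
# face and voice actuator parameters, scaled by the emotion's intensity.
EXPRESSIONS = {
    # state:     (eyebrow_raise, mouth_curve, voice_pitch_shift)
    "happiness": ( 0.3,  0.8,  0.2),
    "sadness":   (-0.4, -0.6, -0.3),
    "anger":     (-0.7, -0.3,  0.1),
    "fear":      ( 0.8, -0.4,  0.4),
    "surprise":  ( 0.9,  0.2,  0.3),
    "disgust":   (-0.2, -0.7, -0.1),
}


def express(state: str, intensity: float = 1.0) -> dict:
    """Scale the canonical posture for `state` by the emotion's intensity."""
    brow, mouth, pitch = EXPRESSIONS[state]
    return {"eyebrow_raise": brow * intensity,
            "mouth_curve": mouth * intensity,
            "voice_pitch_shift": pitch * intensity}


print(express("fear", intensity=0.5))
```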
In order to interact with another agent, it is essential to have a good conceptual model for how this agent operates [15]. As complexity of environment and interactions
. As complexity of environment and interactions
increases, the social sophistication of a robot interacting
with humans must be scaled accordingly. Some would
argue that this entails that the robot ‘has’ emotions, but
others would distinguish having a model of emotions of the
other agent from having emotions oneself. This in some
sense reverses the simulation theory of human empathy [16,17]: in this theory, there is no question that the human
‘has’ emotion, and the proposal is that the system for
expressing one’s own emotions drives the ability to
recognize those of others.
The neurobiological roots of emotion
We turn now from outlining a functional view of robots
to review research linking brain and emotion [3,21–23],
before proposing a functional framework for synthesis. We
shall see that: (i) emotion is not computed by a centralized
neural system; (ii) emotions operate at many time scales
and at many behavioral levels; and (iii) there is no easy
separation between emotion and cognition.
Behavioral control columns
An animal comes with a set of basic ‘drives’ that provide
the ‘motor’ (motivation) for behavior. Most of the neural
circuitry underlying these drives involves specific nuclei of
the hypothalamus. Swanson [24] introduced the notion of the ‘behavioral control column’ (Figure 1), comprising
interconnected sets of nuclei in the hypothalamus under-
lying specific survival behaviors: spontaneous locomotion, exploration, and ingestive, defensive and reproductive behaviors. The hypothalamus sends this information to higher
centers such as the amygdala and the orbitofrontal cortex.
Cognitive architectures
We now turn to general ‘cognitive architectures’, in
which the role of emotion can be situated at several
levels. Ortony et al. [18] analyze the interplay of affect (value), motivation (action tendencies), cognition (meaning), and behavior at three levels of information processing: reactive, routine and reflective.
Figure 1. Major features of interactions of the behavioral control column and cerebral cortex in regulation of motivated behavior, as seen on flatmaps of the rat central nervous system. (a) The neuroendocrine motor zone shown in black and three subgroups of hypothalamic nuclei: the periventricular region (PR), which contains visceromotor pattern generators; the medial nuclei (MN); and the lateral zone (LZ). (b) Almost all nuclei in the behavioral control column generate a dual projection, descending to the motor system and ascending to thalamocortical loops. (c) The embedding of the column in cortical computations. This prototypical circuit element consists of an excitatory projection from cortex with a collateral to the striatum (the input system for the basal ganglia, which play a key role in the sequencing and interleaving of actions). The striatum then generates an inhibitory projection to the motor system with a collateral to the pallidum (the output system for the basal ganglia). Finally, the pallidum generates an inhibitory projection to the brainstem motor system, with a collateral to the dorsal thalamus (which projects back to cerebral cortex). This pallidal projection is disinhibitory because it is inhibited by the striatal input. (Adapted from Swanson [24], Figs 8, 10, 14, respectively, whose captions explain abbreviations not needed in this article.)
Amygdala and cerebral cortex
The human ability to plan behaviors on the basis of future
possibilities rather than present contingencies alone has
been linked to the increased size and differentiation of
cerebral cortex [25,26]. Kelley stresses that feedforward
hypothalamic projections provide the motivational net-
work with access to associative and cognitive cortical
areas [27]. The amygdala can influence cortical areas via
feedback from proprioceptive, visceral or hormonal sig-
nals, via projections to various ‘arousal’ networks, and
through interaction with the medial prefrontal cortex [28] (Figure 2a). The prefrontal cortex, in turn, sends distinct
projections back to several regions of amygdala, allowing
elaborate cognitive functions to regulate the amygdala’s
roles in emotion. For example, our ability to create
heuristics and general rules from our everyday experiences has been shown to depend on prefrontal cortices [29] and their ability to bias activity in target structures [30].
Because of the tight interactions between amygdala and
prefrontal cortex, it is likely that our ability to generalize
and abstract is directed by (and influences, in turn) some
aspects of our emotional state. How this is done, and how
robots could take advantage of it remains an open
question. Some functional connections between the amyg-
dala, thalamus and cortical areas allow for both fast
elicitation of emotion and more refined emotion control
based on memory and high-level sensory representations
(Figure 2a). The role of the amygdala in the experience
and expression of fear has received particular study [21,31]. Stimuli that elicit fear reactions can be external
(e.g. a loud noise) or internal, from the behavioral control
columns or from memory structures such as hippocampus
or prefrontal cortex. Decision-making ability in emotional
situations is also impaired in humans with damage to the
medial prefrontal cortex, and abnormalities in prefrontal
cortex might predispose people to develop fear and anxiety
disorders [32].
Activation of the human amygdala can be produced
by observing facial expressions, and lesions of the
human amygdala can cause difficulty in the identification of some such expressions [33,34]. The amygdala
and prefrontal cortices are therefore involved in social
as well as internal aspects of emotion, and together
play a crucial role in the regulation of emotion, a key
Figure 2. (a) Interaction of the amygdala with cortical areas in the mammalian brain (adapted from [32]). (b) Lateral view of part of the macaque monkey brain emphasizing how the orbitofrontal cortex (involved in social emotions) links to amygdala and to sensory cortices. V4 is visual area 4 (adapted from [3]).
component of affective style, affective disorders [35] and social interactions [36].
Figure 2b provides a view of how orbitofrontal cortex links to the amygdala in macaque monkey. Damage to
monkey caudal orbitofrontal cortex produces emotional
changes that include the tendency to respond inappropri-
ately. Orbitofrontal neurons also serve as part of a
mechanism that evaluates whether a reward is expected,
and different subregions of the prefrontal cortex are
selectively involved during positive rewards or punishments [37]. Dolcos et al. have shown that different
subregions of the medial temporal lobe memory system
are selectively and differentially involved for emotional
and neutral stimuli in humans, and that this area was strongly correlated with amygdala activations during emotional stimuli [38]. Many other brain areas have been implicated in the experience and expression of
emotions in humans, including the anterior cingulate
cortex, insula, hippocampus and fusiform gyrus [39,40].
The mirror system, language and empathy
Going beyond the hypothalamo-amygdala-cortical inter-
actions, we note that language plays a unique role in
shading and refining human emotions. It is therefore
interesting that recent research suggests that ‘mirror
neurons’ might provide the substrate for both ‘empathy’ –
the ability to recognize the emotional dispositions of
others – and communication through language.
In monkey, parietal area AIP [41] processes visual
information concerning objects to extract possibilities
for manual interaction and is reciprocally connected
with the so-called ‘canonical neurons’ of ventral premotor
area F5 [42], whose discharge correlates with more-or-less
specific hand actions. Certain F5 neurons, called mirror
neurons [43,44], discharge when the monkey observes the
experimenter make a gesture similar to one that, when
actively performed by the monkey, involves activity of
that neuron. PET experiments in humans showed that
superior temporal sulcus (STS), the inferior parietal
lobule, and the inferior frontal gyrus (area 45) in the left
hemisphere were significantly activated for both grasp
execution and grasp recognition [45,46]. Area 45 is part of
Broca’s area, a major component of the human brain’s
language mechanisms. F5 is considered to be the monkey
homologue of Broca’s area.
These findings grounded the hypothesis that the mirror
system provided the basis for the evolution of human
language via intermediate stages involving ‘complex’
imitation (acquiring novel sequences of abstract actions
in a few trials), protosign (manually-based communi-
cation, enabled by freeing action from praxis to be used in
pantomime and then conventionalized communication)
and protospeech (vocally-based communication exploiting
the brain mechanisms that support protosign) [47,48].
However, mirror neurons have also been implicated in
empathy – but with the emphasis now on recognizing
facial expressions instead of manual actions. Adolphs [49] and Ochsner [36] stress the important role of social
interaction in the forming of emotions. Clearly, human
emotions are greatly influenced by our ability to
empathize with the behavior of other people [50]. Indeed,
some have suggested that mirror neurons can contribute
not only to ‘simulating’ other people’s actions as the basis
for imitation [51], but also ‘simulating’ other people’s feelings as the basis for empathy [16,17,52].
Box 1. Three main neuromodulatory systems involved in
emotion
Dopamine
In the mammalian brain, dopamine appears to play a major role in
motor activation, appetitive motivation, reward processing and
cellular plasticity, and might be important in emotion. Dopamine is
contained in two main pathways that ascend from the midbrain to
innervate many cortical regions. Dopamine neurons in the monkey
have been observed to fire to predicted rewards [67,68]. Moreover,
dopamine receptors are essential for the ability of prefrontal
networks to hold neural representations in memory and use them
to guide adaptive behavior. Therefore, dopamine plays essential
roles all the way from ‘basic’ motivational systems to working
memory systems essential for linking emotion, cognition and
consciousness.
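The reward-prediction observation can be illustrated with a textbook delta-rule update, in which the ‘dopamine-like’ signal is the error between received and predicted reward. This is standard reinforcement-learning material, not code from the cited studies, and the learning rate is arbitrary.

```python
# Sketch of reward-prediction learning: the prediction error (analogue of
# phasic dopamine) is large for unexpected rewards and vanishes once the
# reward is fully predicted.
def update_prediction(predicted: float, reward: float,
                      learning_rate: float = 0.3):
    prediction_error = reward - predicted
    return predicted + learning_rate * prediction_error, prediction_error


predicted = 0.0
for trial in range(20):
    predicted, error = update_prediction(predicted, reward=1.0)

# After repeated pairings the reward is predicted and the error is near
# zero, mirroring the finding that dopamine neurons stop firing to fully
# predicted rewards while still firing to unexpected ones.
print(round(predicted, 3), round(error, 3))
```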
Serotonin
Serotonin has been implicated in behavioral state regulation and
arousal, motor pattern generation, sleep, learning and plasticity,
food intake, mood and social behavior [69]. The cell bodies of
serotonergic systems are found in midbrain and pontine regions in
the mammalian brain and have extensive descending and ascending
projections. Serotonin plays a crucial role in the modulation of
aggression and in agonistic social interactions in many animals. In
crustaceans, serotonin plays a specific role in social status and
aggression; in primates, with the system’s expansive development
and innervation of the cerebral cortex, serotonin has come to play a
much broader role in cognitive and emotional regulation, particu-
larly control of negative mood or affect. The serotonin system is the
target of many widely used anti-depressant drugs.
Opioids
The opioids, which include endorphins, enkephalins and dynor-
phins, are found particularly within regions involved in emotional
regulation, responses to pain and stress, endocrine regulation and
food intake. Increased opioid function is associated with positive
affective states such as relief of pain, and feelings of euphoria, well-
being or relaxation. Activation of opioid receptors promotes
maternal behavior in mothers and attachment behavior and social
play in juveniles. Separation distress, exhibited by archetypal
behaviors and calls in most mammals and birds, is reduced by
opiate agonists and increased by opiate antagonists in many species [70]. Opiates can also reduce or eliminate the physical sensation
induced by a painful stimulus, as well as the negative emotional
state it induces. Opioids and dopamine receptors are two major
systems affected by common drugs of abuse.
Neuromodulation
We now switch structural levels, turning from specific
brain structures to systems of neuromodulation. Neuro-
modulation refers to the action on nerve cells of endo-
genous substances called neuromodulators. These are
released by a few specialized brain nuclei that have
somewhat diffuse projections throughout the brain and
receive inputs from brain areas that are involved at all
levels of behavior from reflexes to cognition. Each
neuromodulator typically activates specific families of
receptors in neuronal membranes. The receipt of its own
neuromodulator by a receptor has very specific effects on
the neuron at various time scales, from a few milliseconds
to minutes and hours [53]. Each neuron has its own
mixture of receptors, depending on where it is located
in the brain.
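The idea that a neuromodulator carries little detailed content yet changes how neurons compute can be made concrete with a toy unit whose transfer function is reshaped by a modulatory level. The gain formula and numbers are our own construction for illustration only.

```python
# Toy sketch: a neuromodulator scales the gain of a sigmoidal unit, so the
# same input produces different outputs under different modulatory states.
import math


def unit_output(net_input: float, modulator_level: float) -> float:
    # Higher modulator level sharpens the unit's response (invented rule).
    gain = 1.0 + 4.0 * modulator_level
    return 1.0 / (1.0 + math.exp(-gain * net_input))


for level in (0.0, 0.5, 1.0):
    print(f"modulator={level}: output={unit_output(0.5, level):.3f}")
```

Because neurons carry different mixtures of receptors, a single broadcast modulator level would reshape each region’s computation differently, which is the point of the passage above.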
Kelley’s [27] analysis of motivation and emotion
emphasizes three neuromodulatory systems – those for dopamine, serotonin and opioids (Box 1). Strikingly,
although these three neuromodulatory systems seem to
be distinct from each other in their overall functionalities,
they each exhibit immense diversity and synergism of
behavioral consequences. The different effects depend on
both molecular details (the receptors) and global arrange-
ments (the circuitry within the modulated brain region,
and the connections of that region within the brain).
Neuromodulation thus provides a simple but high-
impact signal that can fundamentally change the way
single neurons and synapses ‘compute’ and in this sense is
akin to the alarm system of Sloman [19]. Earlier, we said
that reconciling the cognitive architectures of Ortony et al.
and Sloman led us to consider four architectural levels.
Fellous [54] has also produced a four-level hierarchy, but
this time on the basis of a review of data on hypothalamus,
amygdala and orbitofrontal cortex, and the suggestion
that the neural basis for emotion involves both compu-
tations in such structures and their current state of
neuromodulation (see Figure 3, and [55,56]). Others have
suggested that neuromodulation might be a key to meta-learning [57].
Towards a functional view of emotions
Emotions are, of course, far more complex than a few brain
structures and three ascending modulatory systems that
interact with them. We can only outline the lessons that
neurobiology offers for the study of robotic emotions, not
provide all the details. However, we stress that the
richness of human emotion is in part due to the linkage