Copy That: When We’re More Likely to Imitate People Than Robots

May 14, 2014

When we are around different people, our behavior changes. Some of it is intentional – like talking about things we have in common – but much of it happens spontaneously, without our even realizing it, such as folding our arms or scratching our noses when others do the same. But what about around robotic or computer-generated behavior? New research has found that imitating others depends both on the other agent looking human and on our believing its behavior to be human.

“Our research seeks to understand if there are dedicated neural circuits for controlling social interactions with other humans compared to non-human agents,” says Richard Ramsey of Bangor University in North Wales. While scientists have previously determined the brain networks responsible for “automatic imitation,” they have yet to understand all the factors that determine when it occurs.

Ramsey’s interest in the topic is rooted in his distinctive accent from growing up in the northern English city of Leeds. “On arriving at university, I noticed that the way I spoke was being copied by other students that I met,” he explains. “This struck me as odd since many students and friends were making fun of the distinctive nature of my accent at the same time as spontaneously copying the way I made vowel sounds.”

With the ever-increasing presence of robotic and computer-generated behaviors in our lives (think Siri), Ramsey and his colleagues decided to test how interacting with such non-human agents affects our social imitation. “An interesting twist we added to our design was not only whether another agent looked like a human or a robot but also whether participants believed the agent they were seeing had human or non-human origins,” Ramsey says.

In the experiment, just published in the Journal of Cognitive Neuroscience, the researchers showed participants videos of a moving hand to induce automatic imitation while they were in the fMRI scanner. Participants had two cues as to whether the hand was human: the hand itself looked either human or robotic, and, although the movements were identical across the videos, the researchers told participants that some movements were generated by a computer algorithm while others were copied from people wearing motion-capture suits.

Before watching the videos, the participants watched a 10-minute custom-made documentary that described the two techniques for producing digital movements. “The video incorporated a lot of video footage from Hollywood movies, professional motion capture labs, and even some stimuli and footage from our own lab to make it all more believable,” Ramsey says. “Participants believed that movements they would see later in the experiment could be made in fundamentally different ways.”

As predicted, both the human appearance and the belief that the movements were human were important in triggering automatic imitation. “For the first time, we show that one node in the imitation control circuit – the right temporoparietal junction – is tuned to regulate interactions with human more than non-human interaction partners,” Ramsey says. Imitation was the same whether one or both cues to humanity were present, but it dropped off when both were absent.

“In the future, as it is very likely that robots will play a larger part in everyone’s lives, these data provide evidence for the neurobiological mechanism that may control how we may respond and interact differently with humans compared to robots,” Ramsey says.

-Lisa M.P. Munoz

The paper, “The Control of Automatic Imitation Based on Bottom–Up and Top–Down Cues to Animacy: Insights from Brain and Behavior” by André Klapper, Richard Ramsey, Daniël Wigboldus, and Emily S. Cross, was published online on April 17, 2014, in the Journal of Cognitive Neuroscience.

