
Copy That: When We’re More Likely to Imitate People Than Robots

May 14, 2014

When we are around different people, our behavior changes. Some of it is intentional, like talking about things we have in common, but much of it happens spontaneously, without our even realizing it, such as folding our arms or scratching our noses when others do the same. But what happens when the behavior we observe is robotic or computer-generated? New research finds that automatically imitating others depends both on the other agent looking human and on our believing its behavior to be of human origin.

“Our research seeks to understand if there are dedicated neural circuits for controlling social interactions with other humans compared to non-human agents,” says Richard Ramsey of Bangor University in North Wales. While scientists have previously determined the brain networks responsible for “automatic imitation,” they have yet to understand all the factors that determine when it occurs.

Ramsey’s interest in the topic is rooted in his distinctive accent from growing up in the northern English city of Leeds. “On arriving at university, I noticed that the way I spoke was being copied by other students that I met,” he explains. “This struck me as odd since many students and friends were making fun of the distinctive nature of my accent at the same time as spontaneously copying the way I made vowel sounds.”

With the ever-increasing presence of robotic and computer-generated behaviors in our lives (think Siri), Ramsey and his colleagues decided to test how interacting with such non-human agents affects our social imitation. “An interesting twist we added to our design was not only whether another agent looked like a human or a robot but also whether participants believed the agent they were seeing had human or non-human origins,” Ramsey says.

In the experiment, just published in the Journal of Cognitive Neuroscience, the researchers showed participants videos of a moving hand to induce automatic imitation while they lay in the fMRI scanner. Participants had two cues as to whether the hand was human: the hand looked either human or robotic, and, although the movements were identical across the videos, the researchers told participants that some movements were generated by a computer algorithm while others were copied from people wearing motion-capture suits.

Before watching the videos, the participants watched a 10-minute custom-made documentary that described the two techniques for producing digital movements. “The video incorporated a lot of video footage from Hollywood movies, professional motion capture labs, and even some stimuli and footage from our own lab to make it all more believable,” Ramsey says. “Participants believed that movements they would see later in the experiment could be made in fundamentally different ways.”

As predicted, both the human appearance and the belief that the movements were of human origin were important in triggering automatic imitation. “For the first time, we show that one node in the imitation control circuit – the right temporoparietal junction – is tuned to regulate interactions with human more than non-human interaction partners,” Ramsey says. Imitation was equally strong when one or both cues to humanity were present, but dropped off when both were absent.

“In the future, as it is very likely that robots will play a larger part in everyone’s lives, these data provide evidence for the neurobiological mechanism that may control how we may respond and interact differently with humans compared to robots,” Ramsey says.

-Lisa M.P. Munoz

The paper, “The Control of Automatic Imitation Based on Bottom–Up and Top–Down Cues to Animacy: Insights from Brain and Behavior” by André Klapper, Richard Ramsey, Daniël Wigboldus, and Emily S. Cross, was published online on April 17, 2014, in the Journal of Cognitive Neuroscience.
