Using Fruits and Veggies to Break Down How We Remember and Identify Objects

September 4, 2014

credit: http://commons.wikimedia.org/wiki/File:Fruit_and_vegetable_vendor_at_St._Jacobs_Farmers_Market,_2011_July_7.jpg

Guest Post by Marc Coutanche, Yale University 

From a young age, we learn the differences between a lemon and a lime and dozens of other fruits, which makes shopping for fruit at a farmers' market seem like a simple task. But despite appearances, very little is simple about holding what you want in mind and then identifying it in the world, whether that is a lime at the market or keys on a cluttered counter. It's a testament to the evolution of the brain that, past childhood, it's hard to imagine object identification as anything other than effortless. But if you've known someone with Alzheimer's disease, or certain other neurological disorders, its fallibility can become all too clear.

This is perhaps most strikingly apparent from observing patients who have developed “associative visual agnosia” after damage to the brain. The impairment can leave a person unable to identify previously recognized objects, even though their vision may be perfectly intact. These patients can even draw an object that’s placed in front of them, with little recognition of what it is; for example, they might draw a perfect replica of a carrot, with no idea that it’s a food. How does our brain store our knowledge of the thousands upon thousands of objects that we encounter in our lifetime, so that we can recognize them effortlessly from their features?

In a new research study, my coauthor, Sharon Thompson-Schill, and I found evidence that our knowledge of objects draws on a centralized hub in the brain. This hub pulls together dispersed pieces of information about an object’s particular shape, color, and so on, from sensory areas. Understanding these hubs, and how they integrate features, could prove critical to ultimately understanding cases where memory can fail, such as in Alzheimer’s disease.

In the past decade, new machine learning approaches that can ‘decode’ brain activity from fMRI scans have provided opportunities to tackle questions about the brain in new and exciting ways. The approach itself — seeing if brain activity patterns alone can be used to predict what someone is perceiving or thinking about — can sound like something from the pages of a sci-fi novel, but asking this question can tell us a lot about how the brain encodes information. The impressive success of decoding methods comes from their ability to pool together information from distributed populations of neurons.
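For readers who want a concrete picture of what a “decoder” does, here is a minimal sketch in Python using an off-the-shelf linear classifier from scikit-learn. The array names and sizes (voxel_patterns, labels) are hypothetical stand-ins, and this illustrates the general approach rather than the analysis pipeline used in the study.

```python
# A minimal sketch of multivoxel pattern decoding (not the study's actual pipeline).
# Assumes `voxel_patterns` is an (n_trials, n_voxels) array of fMRI activity from a
# brain region, and `labels` lists which object was held in mind on each trial.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 500                      # hypothetical sizes
voxel_patterns = rng.standard_normal((n_trials, n_voxels))
labels = np.repeat(["carrot", "celery", "lime", "tangerine"], n_trials // 4)

# Train a linear classifier to predict the object from distributed voxel activity,
# and estimate accuracy with cross-validation (chance here is 25%).
clf = LinearSVC()
accuracy = cross_val_score(clf, voxel_patterns, labels, cv=5).mean()
print(f"Cross-validated decoding accuracy: {accuracy:.2f}")
```

With real data, decoding accuracy reliably above chance is taken as evidence that the region’s activity patterns carry information about the decoded objects.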

Imagine activity in the brain as a symphony. Previously, fMRI methods allowed us to listen to one instrument at a time, but machine learning methods let us hear the whole orchestra: in this case, distributed patterns of brain activity. Just as it’s easier to identify a musical piece when all the instruments are playing, we can now identify what the brain is processing with far more specificity than we previously thought possible.

credit: Marc Coutanche

Can you spot the lime in this noise? Researchers used machine-learning methods to use brain activity patterns alone to predict the fruit a person was looking for — before it appeared in noise like this.

In our recent study published in Cerebral Cortex, we investigated how knowledge is organized in the brain by having people visually search for fruits and vegetables. Previously, researchers have decoded memories for very distinct categories of items, such as distinguishing faces from vehicles. Decoding different fruits and vegetables is a lot more specific, and this category of objects has properties, such as systematic variations in shape and color, that are well suited to studying “semantic memory” — our knowledge of the objects we’ve encountered throughout our lives.

Some theories suggest that semantic memory has no central location — that it is distributed across the sensory and motor brain areas involved in seeing, hearing, touching, and manipulating objects. For example, your knowledge of a telephone would be spread across your auditory, visual, and motor cortices. Other theories suggest that one or more centralized hubs are important. One such idea is that our brain contains “convergence zones,” each of which integrates information converging from other brain areas. So your knowledge of limes might come from the successful integration of shape, color, and taste information at a convergence site. A key motivation for our study was to test for evidence of such a convergence zone, and for evidence of converging object properties.

In our experiment, we recorded participants’ brain activity with an fMRI scanner, while asking them to look for one of four fruits and vegetables — carrots, celery, limes, or tangerines. We wanted to probe memory, rather than current perception, so we couldn’t just show images of the fruits and vegetables. Instead, we asked participants to look for objects hidden within colorful visual noise (which looks like static on a screen).

In each trial, we first told our participants which fruit or vegetable to look for, and then showed them images of random visual noise. After some time, an object appeared, concealed inside the static. Importantly, we only looked at the brain activity recorded before the object appeared: while our participants were still looking at totally random noise. Focusing on the brain activity collected when they were holding an object in mind (without seeing it) let us truly probe internally-driven brain activity. We wanted to see if this activity would lead us to the location of a centralized hub.

Sure enough, we found that we could decode object identity in just one location: the left anterior temporal lobe, which lies a few inches above and to the front of the left ear. This finding is consistent with previous studies that point to the anterior temporal lobes as being important for semantic memory. For example, the conceptual errors made by dementia patients, including mistakes in naming fruits and in matching fruit names to pictures, are associated with deterioration in this brain region.

Interestingly, the memory-generated activity patterns that we found were very similar to the activity patterns we observed when the participants were actually viewing images of each fruit or vegetable. To continue the musical analogy, we found a similar symphony whether our participants were seeing the objects or merely thinking about them.
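One simple way to picture this kind of comparison is to correlate a memory-trial pattern with the average perception pattern for each object and see which object it matches best. The sketch below does this on toy data; the variable names and numbers are hypothetical, not the study’s analysis.

```python
# A sketch of comparing memory-driven and perception-driven activity patterns
# with simple correlations (illustrative only; variable names are hypothetical).
import numpy as np

def pattern_similarity(memory_pattern, perception_templates):
    """Correlate one memory-trial voxel pattern with the average
    perception pattern for each object, and return the best match."""
    scores = {obj: np.corrcoef(memory_pattern, template)[0, 1]
              for obj, template in perception_templates.items()}
    return max(scores, key=scores.get), scores

rng = np.random.default_rng(1)
perception_templates = {obj: rng.standard_normal(500)
                        for obj in ["carrot", "celery", "lime", "tangerine"]}
# Toy "memory" pattern: the lime template plus noise.
memory_pattern = perception_templates["lime"] + 0.5 * rng.standard_normal(500)

best_match, scores = pattern_similarity(memory_pattern, perception_templates)
print(best_match)   # with this toy data, the memory pattern matches "lime"
```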

We next wanted to see which brain regions supply the information that converges to ultimately build object identity in the anterior temporal lobe. For this, we turned to the visual processing areas responsible for shape and color. We had chosen our fruits and vegetables deliberately: two are green (lime and celery); two are orange (tangerine and carrot); two are elongated (carrot and celery); and two are near-spherical (lime and tangerine).

The idea was to “train” our machine learning decoder to look specifically for brain activity patterns in regions associated with identifying shape and color, without picking up on activity associated with other distinguishing features, such as taste. We used the pairs of objects to our advantage here, by training the decoder to distinguish two of the fruits and vegetables (limes versus tangerines for color), and asking how it would classify other items with similar features (celery versus carrots). When using activity from a brain region associated with processing color, our decoder ‘mistook’ limes for celery, and tangerines for carrots. And the decoders that used data from the shape-processing area confused carrots with celery, and tangerines with limes. Those results made us confident that the decoders were correctly identifying color and shape information in the early visual regions.
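The logic of this cross-classification step can be sketched in a few lines: train a classifier on one pair of objects, then test it on a different pair that shares the feature of interest, relabeling the test items by that shared feature. The simulated data and names below are hypothetical; this is not the study’s code, just the transfer logic.

```python
# A sketch of the cross-classification logic: train a decoder on one pair of
# objects and test it on a different pair that shares a feature (here, color).
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
n_trials, n_voxels = 40, 300

def simulate_color_patterns(color_signal):
    """Toy voxel patterns from a 'color region': noise plus a color-driven component."""
    return color_signal + rng.standard_normal((n_trials, n_voxels))

green_signal = rng.standard_normal(n_voxels)      # shared by lime and celery
orange_signal = rng.standard_normal(n_voxels)     # shared by tangerine and carrot

# Train on lime vs. tangerine ...
X_train = np.vstack([simulate_color_patterns(green_signal),      # lime trials
                     simulate_color_patterns(orange_signal)])    # tangerine trials
y_train = ["green"] * n_trials + ["orange"] * n_trials

# ... and test on celery vs. carrot, relabeled by their shared colors.
X_test = np.vstack([simulate_color_patterns(green_signal),       # celery trials
                    simulate_color_patterns(orange_signal)])     # carrot trials
y_test = ["green"] * n_trials + ["orange"] * n_trials

clf = LinearSVC().fit(X_train, y_train)
print("Cross-object transfer accuracy:", clf.score(X_test, y_test))
# Above-chance transfer suggests the region carries color information that
# generalizes across objects, rather than object identity per se.
```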

We then reasoned that if shape and color really do converge on the left anterior temporal lobe, our object-decoders should find it easier to identify the searched-for object (e.g. tangerine) when both its color (orange) and shape (spherical) brain activity patterns are found in their respective regions. We found exactly this: a decoder could better identify an object from brain activity in the left anterior temporal lobe when both its color and shape were identified from converging feature regions.
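The convergence test amounts to a simple trial-by-trial comparison, sketched below on made-up data: is the hub decoder more often correct on trials where both the color and shape decoders also identified the searched-for object? All arrays and accuracy values here are hypothetical illustrations of the logic, not results from the study.

```python
# A sketch of the convergence test: compare hub decoding accuracy on trials where
# both feature decoders (color and shape) identified the searched-for object
# versus the remaining trials. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
n_trials = 100
hub_correct = rng.random(n_trials) < 0.45          # hub decoder correct on each trial?
color_correct = rng.random(n_trials) < 0.6         # color-region decoder correct?
shape_correct = rng.random(n_trials) < 0.6         # shape-region decoder correct?

both_features = color_correct & shape_correct
acc_when_converging = hub_correct[both_features].mean()
acc_otherwise = hub_correct[~both_features].mean()

print(f"Hub accuracy when both features decoded: {acc_when_converging:.2f}")
print(f"Hub accuracy otherwise:                  {acc_otherwise:.2f}")
# In the study's logic, the first number should be reliably higher if shape and
# color information truly converge in the anterior temporal lobe.
```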

The results of this study give support to theories that our brain contains one or more convergence zones that integrate object properties. This work is also the first to identify and link together the distinct brain patterns associated with both an object and its specific properties (color and shape). As part of the next steps in our research, we are now looking at how this knowledge becomes integrated in our brain during learning.

–

Marc Coutanche is a postdoctoral fellow at Yale University. He conducted this research with Sharon Thompson-Schill while at the University of Pennsylvania.

Are you a member of CNS with an interest in blogging? Consider contributing a guest post about your work or trends in the field. Email your ideas to CNS Public Information Officer, Lisa M.P. Munoz (cns.publicaffairs@gmail.com). 


By lmunoz Filed Under: Uncategorized Tagged With: language, memory, visual
