
Poster C46

Get a grip: Seeing objects activates grip representations automatically and quickly

Poster Session C - Sunday, April 14, 2024, 5:00 – 7:00 pm EDT, Sheraton Hall ABC

Heath Matheson², Tanvi Vora¹; ¹Memorial University of Newfoundland

Humans live in a world of meaningful objects, which we categorize in the service of action. Understanding the neural processes involved in object categorization is an important step towards understanding the relationship between the brain and behaviour. Neurocognitive models from the framework of grounded and embodied cognition suggest that the reuse of motor information associated with object use partially constitutes our categorization abilities. To investigate this question, we applied spatiotemporal representational similarity analysis (RSA) to electroencephalographic (EEG) responses from the publicly available THINGS database, in which participants passively viewed over 20,000 pictured objects in rapid serial visual presentation. This technique allows us to characterize the representational geometry of neural activity recorded from the scalp across time. Using this technique, we decoded grasp information, captured by ratings of object graspability, for a brief period starting at about 200 ms after the presentation of the object. This provides support for the notion that object categorization is constituted, in part, by the motor information associated with object use, and shows that such activation occurs automatically and quickly in the presence of objects in the world.
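The core logic of spatiotemporal RSA as described above can be sketched in a few lines: build a model dissimilarity matrix from behavioural graspability ratings, build a neural dissimilarity matrix from scalp patterns at each timepoint, and correlate the two across time. The sketch below uses synthetic data and generic choices (correlation distance for neural patterns, Spearman correlation between matrices) that are common in RSA work but are assumptions here, not the authors' exact pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Synthetic stand-ins for the real data (hypothetical sizes)
rng = np.random.default_rng(0)
n_objects, n_channels, n_times = 20, 32, 100
eeg = rng.standard_normal((n_objects, n_channels, n_times))  # objects x channels x timepoints
graspability = rng.uniform(1, 7, n_objects)                  # behavioural ratings per object

# Model RDM: pairwise differences in graspability ratings (condensed vector)
model_rdm = pdist(graspability[:, None], metric="euclidean")

# Neural RDM at each timepoint, correlated with the model RDM
rsa_timecourse = np.empty(n_times)
for t in range(n_times):
    neural_rdm = pdist(eeg[:, :, t], metric="correlation")   # dissimilarity of scalp patterns
    rsa_timecourse[t], _ = spearmanr(neural_rdm, model_rdm)
```

A reliable rise in `rsa_timecourse` in a given window (here, around 200 ms post-stimulus in the real data) would indicate that graspability information is decodable from the scalp topography at that latency.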

Topic Area: LONG-TERM MEMORY: Semantic

 


April 13–16  |  2024