Poster D86

Action-specific representations of visual task features: Computation on demand

Poster Session D - Monday, April 15, 2024, 8:00 – 10:00 am EDT, Sheraton Hall ABC

Nina Lee1, Lin Lawrence Guo1, Adrian Nestor1, Matthias Niemeier1,2; 1University of Toronto, 2Centre for Vision Research, York University

Previous work has found that computations underlying goal-directed behaviour are facilitated by conjunctive neural representations of task features. However, these conclusions are drawn from paradigms that often use arbitrary combinations of task affordances and features that necessitate working memory. Therefore, in the present study, we used a task that requires minimal working memory representations to investigate the temporal dynamics of feature representations and their potential integration in the brain. Specifically, we recorded electroencephalography data from human participants while they first viewed, and then grasped, objects or touched them with a knuckle. Objects had different shapes and were made of intuitively light or heavy materials. Importantly, shape and weight were features relevant for grasping but not for knuckling. Using multivariate analysis, we found that representations of object shape were similar for grasping and knuckling. However, only for grasping did early shape representations reactivate at later phases of grasp planning, suggesting that sensorimotor control signals feed back to early visual cortex. Grasp-specific representations of weight/material emerged only during grasp execution after object contact, during the load phase. A trend toward grasp-specific integrated representations of shape and material arose, but only briefly at movement onset. These results argue against the view that goal-directed actions automatically integrate all features of a task into a sustained and unified neural representation. Instead, our results suggest that the brain creates action-specific representations of relevant features as required for the separate subcomponents of its action computations.

Topic Area: PERCEPTION & ACTION: Motor control

CNS 2024 | April 13–16, 2024