Big Theory versus Big Data: What Will Solve the Big Problems in Cognitive Neuroscience?


Saturday, March 24, 3:30 – 5:30 pm, Grand Ballroom
Chair: David Poeppel, Max Planck Institute & New York University
Speakers: Eve Marder, Gary Marcus, Alona Fyshe, Jack Gallant.
All areas of the sciences are excited about the innovative new ways in which data can be acquired and analyzed. In the neurosciences, there exists a veritable orgy of data – but is that what we need? Will the colossal datasets we now enjoy solve the questions we seek to answer, or do we need more ‘big theory’ to provide the necessary intellectual infrastructure? Four leading researchers, with expertise in neurophysiology, neuroimaging, artificial intelligence, language, and computation, will debate these big questions, arguing for what steps are most likely to pay off and yield substantive new explanatory insight.
Talk 1: The importance of the Small for understanding the Big.
Eve Marder, Brandeis University
The brain employs highly degenerate systems that allow for resilience and robustness. These can be found in studies of large ensembles of neurons, and are likely to show up in all kinds of large-scale simulations and theoretical studies. Nonetheless, if one ever wishes to account for the behavior of large numbers of neurons, at some point it is necessary to go down to the cellular level of analysis to see which biological mechanisms are consistent with the conclusions proposed at higher levels of analysis.
Talk 2: Which presents the biggest obstacle to advances in cognitive neuroscience today: lack of theory or lack of data?
Jack Gallant, University of California, Berkeley
Science is a collection of methods and processes for constructing elegant theories that can explain and predict high-dimensional data. It is obvious that both theory and data are required. But at any point in time, progress is likely to be limited more by a lack of theory or more by a lack of data. It is my contention that at the current time, progress in human cognitive neuroscience — our ability to construct powerful explanatory, predictive models — is limited more by a lack of data than by a lack of theory. This is because the human brain data that are currently available offer such a coarse view of brain function that they do not provide sufficient information to develop and test rich cognitive theories. Thus, most current cognitive theories do not predict human brain data or complex behavior under naturalistic conditions well. The development of new devices, new methods of measurement, and new experimental paradigms is required in order to support cognitive models that respect the complexity of brain structure and function.
Talk 3: Data Driven Everything
Alona Fyshe, University of Victoria, British Columbia
The structure of every organism, including humans, is the product of adaptation and evolution in the face of data. Clearly data is a powerful force, but in practice we will not have eons of data at our disposal. Does that necessarily mean we will need strong model priors? How far can we get with big-but-finite data?
Talk 4: Neuroscience, deep learning, and the urgent need for an enriched set of computational primitives
Gary Marcus, New York University
Large strands of AI and contemporary neuroscience are dominated by a quest to find a single computational primitive (or canonical cortical circuit) to rule them all, typically some version of hierarchical feature detection, first made popular by Hubel and Wiesel, and more recently by deep learning. At first glance, the apparent success of deep learning seems to be an argument in favor of a homogeneous computational system. I argue, however, that deep learning is far more superficial than widely believed, and that both deep learning and models of neuroscience must be supplemented by a broad range of elementary computational devices.