
Released

Journal Article

Brain songs framework used for discovering the relevant timescale of the human brain

MPS-Authors

Deco, Gustavo
Computational Neuroscience Group, Department of Information and Communication Technologies, Center for Brain and Cognition, University Pompeu Fabra, Barcelona, Spain;
Catalan Institution for Research and Advanced Studies (ICREA), University Pompeu Fabra, Barcelona, Spain;
Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society;
School of Psychological Sciences, Monash University, Melbourne, Australia;

Fulltext (public)

Deco_Cruzat_2019.pdf
(Publisher version), 5 MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Deco, G., Cruzat, J., & Kringelbach, M. L. (2019). Brain songs framework used for discovering the relevant timescale of the human brain. Nature Communications, 10: 583. doi:10.1038/s41467-018-08186-7.


Cite as: https://hdl.handle.net/21.11116/0000-0002-FF96-6
Abstract
A key unresolved problem in neuroscience is to determine the relevant timescale for understanding spatiotemporal dynamics across the whole brain. While resting state fMRI reveals networks at an ultraslow timescale (below 0.1 Hz), other neuroimaging modalities such as MEG and EEG suggest that much faster timescales may be equally or more relevant for discovering spatiotemporal structure. Here, we introduce a novel way to generate whole-brain neural dynamical activity at the millisecond scale from fMRI signals. This method allows us to study the different timescales through binning the output of the model. These timescales can then be investigated using a method (poetically named brain songs) to extract the spacetime motifs at a given timescale. Using independent measures of entropy and hierarchy to characterize the richness of the dynamical repertoire, we show that both methods find a similar optimum at a timescale of around 200 ms in resting state and in task data.
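The abstract describes studying different timescales "through binning the output of the model", i.e. coarse-graining millisecond-resolution activity into progressively wider time bins (such as the ~200 ms optimum reported). As a minimal illustrative sketch (not the authors' code; the signal, function name, and bin widths here are hypothetical), non-overlapping bin-averaging of a fine-timescale multi-region signal could look like:

```python
import numpy as np

def bin_signal(x, dt_ms, bin_ms):
    """Coarse-grain a fine-timescale signal into non-overlapping time bins.

    x      : array of shape (n_samples, n_regions), sampled every dt_ms
    bin_ms : target bin width in milliseconds
    Returns the bin-averaged signal, shape (n_bins, n_regions).
    """
    samples_per_bin = int(bin_ms / dt_ms)
    n_bins = x.shape[0] // samples_per_bin
    trimmed = x[: n_bins * samples_per_bin]          # drop any incomplete final bin
    return trimmed.reshape(n_bins, samples_per_bin, -1).mean(axis=1)

# Hypothetical example: 10 s of simulated 1-ms activity across 5 regions
rng = np.random.default_rng(0)
activity = rng.standard_normal((10_000, 5))
coarse = bin_signal(activity, dt_ms=1, bin_ms=200)   # 200 ms bins, as in the paper
print(coarse.shape)  # (50, 5)
```

Sweeping `bin_ms` over a range of values and characterizing the resulting repertoire at each width (e.g. with an entropy measure) is the kind of analysis the abstract refers to.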