In the 21st century, the growth of data acquisition has greatly outpaced the growth of data analysis, rendering current computational and statistical tools insufficient for extracting meaning from large datasets. Neuroscience is particularly susceptible to these big data challenges, as experimental neuroscientists have devised methods that collect terabytes of data per hour. Without statistical and computational frameworks for analyzing and organizing these data, the field will be unable to fully reap the benefits of this enormous potential. Our expertise in (i) computer science, (ii) data curation, (iii) statistical science, and (iv) neuroscience enables us to fill this gap. Below we elaborate on these four threads.
Joshua Vogelstein, PhD
Assistant Professor of Biomedical Engineering
Specialization: Big Neuro Statistics
Contact
Johns Hopkins Whiting School of Engineering
3400 N. Charles Street
Clark Hall 317C
Baltimore, MD 21218
The fundamental driving force of science is the discovery of latent structure that converts myriad disparate data into understanding.