Flux Blog

Reproducibility, Open Science, and Big Data

Jul 24, 2023 | For society members

Replication is a foundational principle for the credibility of science [Romero, 2019]. However, psychology and neuroscience are currently experiencing a replicability and reproducibility crisis, triggered by the realization that some previously popularized findings failed to replicate or reproduce [Klein et al., 2014; Malich & Rehmann-Sutter, 2022], such as the marshmallow effect [De Posada & Singer, 2005], the Mozart effect [Campbell, 2000], “feeling the future” using extrasensory perception [Bem, 2011; Rabeyron, 2020; Ritchie, Wiseman, & French, 2012], trait construct priming [Bargh, Chen, & Burrows, 1996; Doyen, Klein, Pichon, & Cleeremans, 2012; Klein et al., 2014; Stroebe, 2019], or the power pose effect [Carney, Cuddy, & Yap, 2010].

Neuroscientific findings have been shown to hold a certain allure for the general public, such that non-experts trust statements more when allegedly backed by neuroscience [Weisberg, Keil, Goodstein, Rawson, & Gray, 2008]. Scientists thus have the responsibility to exert caution when communicating single findings to avoid sensationalism and to perform rigorous research despite external pressure to produce results.

Replication/Replicability: independent investigators repeat a previously conducted study, collecting and analyzing their own data with their own analytical methods, laboratories, and instruments, to see whether they obtain the same results. Scientific evidence is strengthened when multiple research groups replicate results in this manner.

Reproduction/Reproducibility: an independent group of researchers (re-)analyzes the data from a previously conducted study to see whether they obtain the same results.

[National Academies of Sciences, Engineering, and Medicine, 2019; Peng, Dominici & Zeger, 2006]

 

Considering sample size is important for generalizable and sound inferences

Neuroscientists and psychologists fundamentally rely on statistical theories and methods [Chén, 2019], whether to map brain function to structure, study the connectivity between brain regions, model brain dynamics, or detect abnormalities. A single finding might arise by chance: sometimes we observe an unusual result in a sample that does not accurately reflect the behavior of the population. The outcome of a single experiment can stray quite far from the expected or theoretical average, especially with a small sample size or high variability.

However, according to Bernoulli’s law of large numbers, when we gather data from a large sample of individuals, the whole population’s average for a certain trait can be estimated more accurately [Blume & Royall, 2003; Bolthausen & Wüthrich, 2013]. In simpler terms, the larger the sample size, the better the average of the population’s characteristics can be represented.
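This convergence is easy to see in a short simulation (a minimal Python sketch; the normal distribution and its parameters are arbitrary illustrative choices, not taken from any cited study):

```python
import random

random.seed(42)  # for a reproducible illustration

def sample_mean(n, mu=100.0, sigma=15.0):
    """Mean of n draws from a normal distribution (IQ-like scores)."""
    return sum(random.gauss(mu, sigma) for _ in range(n)) / n

# Repeat many hypothetical "studies" at two sample sizes and compare
# how far the observed sample means stray from the true mean of 100.
small = [sample_mean(10) for _ in range(1000)]
large = [sample_mean(1000) for _ in range(1000)]

print("max deviation from true mean, n=10:  ",
      round(max(abs(m - 100.0) for m in small), 1))
print("max deviation from true mean, n=1000:",
      round(max(abs(m - 100.0) for m in large), 1))
```

With ten observations per study, individual sample means routinely stray by ten or more points from the true mean of 100; with a thousand observations they stay within roughly a point or two, which is exactly Bernoulli's point.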

Therefore, it is advisable to record many observations, either within an individual or across a large sample, to draw conclusions that generalize to that individual or to a population, respectively. However, in human studies, limited time, funding, and human resources often restrict sample sizes to modest numbers, which limits statistical power and can lead to spurious and non-replicable results [Button et al., 2013; Nee, 2019; Szucs & Ioannidis, 2020].
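How low power plays out in practice can be sketched in the same spirit (a hypothetical simulation assuming a true effect of half a standard deviation and a simple z-style test with known variance; the sample sizes are illustrative, not from any cited study):

```python
import math
import random

random.seed(7)

def study_is_significant(n, effect=0.5, crit=1.96):
    """One simulated two-group study. `effect` is the true group
    difference in standard-deviation units; with known sigma=1,
    a two-sided z test at alpha=.05 uses the critical value 1.96."""
    a = [random.gauss(0.0, 1.0) for _ in range(n)]
    b = [random.gauss(effect, 1.0) for _ in range(n)]
    diff = sum(b) / n - sum(a) / n
    se = math.sqrt(2.0 / n)  # standard error of the difference in means
    return abs(diff / se) > crit

def power(n, sims=2000):
    """Fraction of simulated studies that detect the (genuinely real) effect."""
    return sum(study_is_significant(n) for _ in range(sims)) / sims

print("estimated power, n=15 per group: ", power(15))
print("estimated power, n=100 per group:", power(100))
```

Even though the effect is genuinely there, the small-sample studies detect it well under half the time, and the significant results such designs do produce are correspondingly less trustworthy [Button et al., 2013].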

 

Current scientific policies pressure researchers to publish

The pressure to publish leads to a focus on novelty and a bias towards the publication of positive findings, which in turn can incentivize questionable practices that increase the likelihood of false positives, such as HARKing, cherry-picking, p-hacking, fishing and data-dredging, or even outright fraud [Andrade, 2021; Yong, 2012; Stroebe, 2019]. Moreover, researchers may not adhere to the most up-to-date recommendations and good analysis practices [Nieuwenhuis, Forstmann, & Wagenmakers, 2011; Vul, Harris, Winkielman, & Pashler, 2009].

However, it should be noted that poor science is not the only culprit: when exploring the unknown, some hypotheses will intrinsically be false, and our current statistical methods will by definition yield some false positives (“the base rate fallacy”) [Bernard, 2023; Bird, 2021; Hunter, 2017]. Either way, it is important to establish the methodology and environment necessary for high-quality research. Let us all contribute to a scientific community that fosters data sharing, knowledge exchange, collaboration, and mutual support.

HARKing: Hypothesizing After the Results are Known

Cherry-picking: selecting and reporting only results that support the hypotheses

P-hacking: analyzing data until a significant result is found

Fishing, data-mining, and data-dredging: testing myriad associations not based on hypotheses

[Andrade, 2021]
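The inflation of false positives from fishing through many comparisons can be made concrete with a simulation (a hypothetical sketch: twenty truly null comparisons, each tested at a nominal alpha of .05; the group sizes are arbitrary):

```python
import math
import random

random.seed(3)

def null_test(n=30, crit=1.96):
    """One comparison with NO real effect: both groups come from the
    same distribution, tested with a z-style statistic at alpha=.05."""
    a = [random.gauss(0.0, 1.0) for _ in range(n)]
    b = [random.gauss(0.0, 1.0) for _ in range(n)]
    diff = sum(b) / n - sum(a) / n
    return abs(diff / math.sqrt(2.0 / n)) > crit

def fishing_expedition(tests=20):
    """Fishing: run `tests` independent null comparisons and report
    'success' if ANY of them comes out significant."""
    return any(null_test() for _ in range(tests))

sims = 1000
rate = sum(fishing_expedition() for _ in range(sims)) / sims
print("chance of >=1 'significant' null result across 20 tests:", rate)
# Analytically: 1 - (1 - 0.05)**20, i.e. roughly 0.64
```

Each individual test keeps its nominal 5% false-positive rate, yet the chance that the expedition as a whole "finds something" is around two in three, which is why untargeted testing of myriad associations so reliably produces publishable-looking noise.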

 

Big data and open science initiatives aim to mitigate challenges

The (neuro)science community is now evolving to meet and mitigate these challenges with big data and open science initiatives. Viewed this way, the ensuing discussion and endeavors are an opportunity and leverage for better science, rather than just a crisis [Munafò et al., 2022]. To promote this movement, in this Flux Blog we provide an annotated collection of literature, guidelines, and databases on this timely topic.

If you have questions, comments, or would like to add to this collection, please contact us; we would appreciate your participation!

Please share your feedback here and/or leave us a note in the comment section!

 

Guidelines, good practice & more standardized protocols

Name

Description

Reference

Tips for good research practice
“Practical advice at the different stages of everyday research: from planning and execution to reporting of research”
Schwab, S., Janiaud, P., Dayan, M., Amrhein, V., Panczak, R., Palagi, P. M., ... & Held, L. (2022). Ten simple rules for good research practice. PLoS Computational Biology, 18(6), e1010139. https://doi.org/10.1371/journal.pcbi.1010139

Tips and examples for open and reproducible research
Workflow of the WomCogDev lab in Geneva as an example
Turoman, N., Hautekiet, C., Jeanneret, S., Valentini, B., & Langerock, N. (2022). Open and reproducible practices in developmental psychology research: The workflow of the WomCogDev lab as an example. Infant and Child Development, e2333. https://doi.org/10.1002/icd.2333

Tips for analysis scripts in neuroimaging
Examples and tips for analysis pipeline code in neuroimaging (e.g. preprocessing, statistics, visualization in MEG, MRI)
Van Vliet, M. (2020). Seven quick tips for analysis scripts in neuroimaging. PLoS Computational Biology, 16(3), e1007358. https://doi.org/10.1371/journal.pcbi.1007358

Essential neurostatistics principles
Smith, P. F. (2017). A Guerilla Guide to Common Problems in 'Neurostatistics': Essential Statistical Topics in Neuroscience. Journal of Undergraduate Neuroscience Education (JUNE), 16(1), R1–R12. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5777851/

Considering sample size
Suggestion to conduct two-step experiments: 1) exploration and 2) estimation of effect size
Editorial. (2020). Consideration of Sample Size in Neuroscience Studies. Journal of Neuroscience, 40, 4076–4077. doi: https://doi.org/10.1523/JNEUROSCI.0866-20.2020

Use of multiple statistical analyses
Wagenmakers, E. J., Sarafoglou, A., & Aczel, B. (2022). One statistical analysis must not rule them all. Nature, 605(7910), 423–425. https://www.nature.com/articles/d41586-022-01332-8

The European Code of Conduct for Research Integrity
“serves the European research community as a framework for self-regulation across all scientific and scholarly disciplines and for all research settings”
https://allea.org/code-of-conduct/

Registered Replication Reports
An introduction to registered replication reports
Simons, D. J., Holcombe, A. O., & Spellman, B. A. (2014). An Introduction to Registered Replication Reports at Perspectives on Psychological Science. Perspectives on Psychological Science, 9(5), 552–555. https://doi.org/10.1177/1745691614543974

Meetings, Networks and Clubs

Name

Description

Publisher/Developer

ReproducibiliTea Journal Club
World-wide open science journal clubs dedicated to topics related to reproducibility, statistics in data analysis, open science, research quality, and good research practices (across fields)
https://reproducibilitea.org/

International Reproducibility Networks
“national, peer-led consorti[a] of researchers that aim[] to promote and ensure rigorous research practices”
https://www.ukrn.org/international-networks/

Knowledge-Sharing and Collaboration 

Name

Description

Publisher/Developer/Links

Open Science Framework
Free and open-source project management tool that allows researchers to track their entire project lifecycle, either privately or publicly

Center for Open Science https://www.cos.io/ 

https://osf.io/ 

Cognitive Atlas
Collaborative knowledge base of cognitive tasks, experimental paradigms, and behavioral measures used in cognitive neuroscience; standardized vocabulary and ontology

Russell Poldrack (Professor of Psychology at Stanford University) 


Development of the project was supported by grant RO1MH082795 from the National Institute of Mental Health.

http://www.cognitiveatlas.org/ 

Pre-registration
With pre-registration, the researcher declares the study question, hypotheses, design, and statistical plan prior to conducting the actual study
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2021). Pre-registration: Why and how. Journal of Consumer Psychology, 31(1), 151–162. Preregistration (cos.io)
ENIGMA Consortium “brings together researchers in imaging genomics, neurology and psychiatry, to understand brain structure and function, based on MRI, DTI, fMRI, genetic data and many patient populations”

Thompson, P.M., Jahanshad, N., Ching, C.R.K. et al. ENIGMA and global neuroscience: A decade of large-scale studies of the brain in health and disease across more than 40 countries. Transl Psychiatry 10, 100 (2020). https://doi.org/10.1038/s41398-020-0705-1

https://enigma.ini.usc.edu/

Repositories/Lists of neuroscience databases

Name

Description

Reference/Publisher

Neuroscience Information Framework (NIF)
This discovery portal provides annotated links to >300 biomedical databases
https://neuinfo.org/data/search?q=*&t=registry&ff=Resource%20Type:data%20set#all

Wikipedia list of neuroscience databases
https://en.wikipedia.org/wiki/List_of_neuroscience_databases

OpenNeuro
Repository of (BIDS-compliant) MRI, MEG, EEG, iEEG, and ECoG datasets; allows filtering by age, N, diagnosis, etc.
https://openneuro.org/
NeuroVault
Facilitates the open sharing and exploring of neuroimaging data

Gorgolewski KJ, Varoquaux G, Rivera G, Schwartz Y, Sochat VV, Ghosh SS, et al. (January 2016). "NeuroVault.org: A repository for sharing unthresholded statistical maps, parcellations, and atlases of the human brain". NeuroImage. 124 (Pt B): 1242–1244. doi:10.1016/j.neuroimage.2015.04.016. PMC 4806527. PMID 25869863.

https://neurovault.org/ 

National Archive of Computerized Data on Aging (NACDA)   https://www.icpsr.umich.edu/web/pages/NACDA/index.html 

Big studies and datasets

Name

Description

Reference/Publisher

ABCD Study   https://abcdstudy.org/
Healthy Brain Network (HBN) New York Area
N=10,000  (5-21 y)
Alexander, L. M., Escalera, J., Ai, L., Andreotti, C., Febre, K., Mangone, A., ... & Milham, M. P. (2017). An open resource for transdiagnostic research in pediatric mental health and learning disorders. Scientific data, 4(1), 1-26. http://fcon_1000.projects.nitrc.org/indi/cmi_healthy_brain_network/ 
YOUth cohort
Dutch cohort; longitudinal study on brain development

Baby & Child cohort: N>2500 from pregnancy
Child & Adolescence cohort:  N~1350 children (8-10 y)

Follow-ups every three years (or more frequently for infants) for at least 6 years. Includes eye tracking, computer tasks, behavioral tasks, parent-child observations, ultrasounds and EEG (for YOUth Baby & Child), MRI (for YOUth Child & Teenager)
Buimer, E. E., Pas, P., Brouwer, R. M., Froeling, M., Hoogduin, H., Leemans, A., ... & Mandl, R. C. (2020). The YOUth cohort study: MRI protocol and test-retest reliability in adults. Developmental Cognitive Neuroscience, 45, 100816. doi.org/10.1016/j.dcn.2020.100816

Onland-Moret, N. C., Buizer-Voskamp, J. E., Albers, M. E., Brouwer, R. M., Buimer, E. E., Hessels, R. S., ... & Kemner, C. (2020). The YOUth study: Rationale, design, and study procedures. Developmental Cognitive Neuroscience, 46, 100868. https://doi.org/10.1016/j.dcn.2020.100868

https://www.uu.nl/en/research/youth-cohort-study
Cam-CAN
Cambridge Centre for Ageing Neuroscience dataset (18–90 y)

Home interviews: N~3000

Neuroimaging (EEG, MRI, MEG): N~700
https://camcan-archive.mrc-cbu.cam.ac.uk/dataaccess/ 
OASIS
Open Access Series of Imaging Studies: multiple cross-sectional and longitudinal datasets on Alzheimer's disease and dementia in (aging) adults

Daniel S. Marcus, Tracy H. Wang, Jamie Parker, John G. Csernansky, John C. Morris, Randy L. Buckner; Open Access Series of Imaging Studies (OASIS): Cross-sectional MRI Data in Young, Middle Aged, Nondemented, and Demented Older Adults. J Cogn Neurosci 2007; 19 (9): 1498–1507. doi: https://doi.org/10.1162/jocn.2007.19.9.1498 

Daniel S. Marcus, Anthony F. Fotenos, John G. Csernansky, John C. Morris, Randy L. Buckner; Open Access Series of Imaging Studies: Longitudinal MRI Data in Nondemented and Demented Older Adults. J Cogn Neurosci 2010; 22 (12): 2677–2684. doi: https://doi.org/10.1162/jocn.2009.21407

and more

https://www.oasis-brains.org/ 

Leiden consortium on individual development (CID study)
Longitudinal twin study on the development of social behavior & behavioral control

Meta-data (protocols etc.) available

Data will be made available soon
https://www.developmentmatters.nl/data-access/ 

Atlases

Name

Description

Reference/Publisher

Surface/volume atlases of the infant brain
https://zenodo.org/record/7044932
Brain Development Atlases
Human atlases at different developmental stages
http://brain-development.org/
Allen Brain Map
Allen Institute for Brain Science

https://portal.brain-map.org/

https://www.brainspan.org/

Meta-Analyses

Name

Description

Reference/Publisher

Neurosynth
Automated meta-analyses of neuroscience articles using text mining; aims to find associations between brain regions and cognitive concepts

Yarkoni, T., Poldrack, R., Nichols, T. et al. Large-scale automated synthesis of human functional neuroimaging data. Nat Methods 8, 665–670 (2011). https://doi.org/10.1038/nmeth.1635 

https://neurosynth.org 

Further Discussions, Viewpoints, Concepts, and Problems to address

Name

Description

Reference/Publisher

FAIR 

 

Article on the concept of FAIR data: Findable, Accessible, Interoperable, and Reusable
Poline, J. B., Kennedy, D. N., Sommer, F. T., et al. (2022). Is Neuroscience FAIR? A Call for Collaborative Standardisation of Neuroscience Data. Neuroinformatics, 20, 507–512. https://doi.org/10.1007/s12021-021-09557-0
BIDS (MRI)
Article on the concept of BIDS: the Brain Imaging Data Structure
Gorgolewski, K., Auer, T., Calhoun, V., et al. (2016). The brain imaging data structure, a format for organizing and describing outputs of neuroimaging experiments. Scientific Data, 3, 160044. https://doi.org/10.1038/sdata.2016.44

EEG-BIDS
Extension of BIDS to EEG data
Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., et al. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8

MEG-BIDS
Extension of BIDS to MEG data
Niso, G., Gorgolewski, K., Bock, E., et al. (2018). MEG-BIDS, the brain imaging data structure extended to magnetoencephalography. Scientific Data, 5, 180110. https://doi.org/10.1038/sdata.2018.110
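For orientation, a minimal BIDS-style directory layout can be sketched in a few lines of Python (a simplified illustration of the naming conventions; the exact rules live in the BIDS specification):

```python
from pathlib import Path
import tempfile

# A stripped-down BIDS-style layout for a single subject: metadata files
# at the root, then sub-<label>/<modality>/ folders with standardized names.
layout = [
    "dataset_description.json",
    "participants.tsv",
    "sub-01/anat/sub-01_T1w.nii.gz",
    "sub-01/func/sub-01_task-rest_bold.nii.gz",
]

root = Path(tempfile.mkdtemp())
for rel in layout:
    path = root / rel
    path.parent.mkdir(parents=True, exist_ok=True)
    path.touch()  # empty placeholder files for the sketch

# Subject folders are discoverable purely from the naming convention.
subjects = sorted(d.name for d in root.iterdir() if d.name.startswith("sub-"))
print(subjects)  # -> ['sub-01']
```

Because the structure is machine-readable, tools such as the OpenNeuro validators can check a dataset automatically; that predictability is the main point of the standard.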

Interpretation crisis

 

Article on the concept of the interpretation crisis: we need to understand our results better
Krämer, B. (2022). Why Are Most Published Research Findings Under-Theorized? In Questions of Communicative Change and Continuity (pp. 23–52). Nomos Verlagsgesellschaft mbH & Co. KG. https://doi.org/10.5771/9783748928232-23
Metascience
Article on the concept of metascience and its importance in the context of the replication crisis
Malich, L., & Rehmann-Sutter, C. (2022). Metascience Is Not Enough – A Plea for Psychological Humanities in the Wake of the Replication Crisis. Review of General Psychology, 26(2), 261–273. https://doi.org/10.1177/10892680221083876
Large-scale replication projects
Article on large-scale replication projects in contemporary psychological research
McShane, B. B., Tackett, J. L., Böckenholt, U., & Gelman, A. (2019). Large-scale replication projects in contemporary psychological research. The American Statistician, 73(sup1), 99–105.
Open access issues
Protest against extreme open-access fees
https://www.spectrumnews.org/news/imaging-journal-editors-resign-over-extreme-open-access-fees/
Preregistration concept
Review of registered reports, e.g. discussing the effectiveness of the practice, its history, policies, and developments
Chambers, C. D., & Tzavella, L. (2022). The past, present and future of Registered Reports. Nature Human Behaviour, 6, 29–42. https://doi.org/10.1038/s41562-021-01193-7

Preregistration issues
Article on the downsides of preregistration
McDermott, R. (2022). Breaking free: How preregistration hurts scholars and science. Politics and the Life Sciences, 41(1), 55–59. https://doi.org/10.1017/pls.2022.4
BWAS
Discussion around brain-wide association studies (BWAS), following the example of genome-wide association studies (GWAS)

Marek, S., Tervo-Clemmens, B., Calabro, F.J. et al. Reproducible brain-wide association studies require thousands of individuals. Nature 603, 654–660 (2022). https://doi.org/10.1038/s41586-022-04492-9

 

 

Spisak, T., Bingel, U. & Wager, T.D. Multivariate BWAS can be replicable with moderate sample sizes. Nature 615, E4–E7 (2023). https://doi.org/10.1038/s41586-023-05745-x

 

 

Tervo-Clemmens, B., Marek, S., Chauvin, R.J. et al. Reply to: Multivariate BWAS can be replicable with moderate sample sizes. Nature 615, E8–E12 (2023). https://doi.org/10.1038/s41586-023-05746-w 

Data-sharing compliance issues
Study showing that many researchers did not comply with their published data-sharing statements
Gabelica, M., Bojčić, R., & Puljak, L. (2022). Many researchers were not compliant with their published data sharing statement: a mixed-methods study. Journal of Clinical Epidemiology, 150, 33–41. https://doi.org/10.1016/j.jclinepi.2022.05.019

Examples of products and innovation 

Name

Description

Reference/Publisher

Dataflux
Service/product for handling big data
https://dataflux.eu/analytics

Prolific
Platform for conducting online experiments and surveys
https://www.prolific.co/

Examples of public outreach

Name

Description

Reference/Publisher

Leiden Psychology Blog   https://www.leidenpsychologyblog.nl/ 
Leiden Psychology Podcasts   https://www.universiteitleiden.nl/en/podcasts 

 

References:

Andrade, C. (2021). HARKing, cherry-picking, p-hacking, fishing expeditions, and data dredging and mining as questionable research practices. The Journal of Clinical Psychiatry, 82(1), 25941. doi: https://doi.org/10.4088/JCP.20f13804 

Bargh, J. A., Chen, M., & Burrows, L. (1996). Automaticity of social behavior: Direct effects of trait construct and stereotype activation on action. Journal of Personality and Social Psychology, 71(2), 230. doi: https://doi.org/10.1037/0022-3514.71.2.230

Bem, D. J. (2011). Feeling the future: experimental evidence for anomalous retroactive influences on cognition and affect. Journal of personality and social psychology, 100(3), 407. doi: https://doi.org/10.1037/a0021524 

Bernard, C. (2023). Stop Reproducing the Reproducibility Crisis. eNeuro, 10(2). doi: https://doi.org/10.1523/ENEURO.0032-23.2023

Bird, A. (2021). Understanding the replication crisis as a base rate fallacy. The British Journal for the Philosophy of Science. doi: https://doi.org/10.1093/bjps/axy051 

Blume, J. D., & Royall, R. M. (2003). Illustrating the law of large numbers (and confidence intervals). The American Statistician, 57(1), 51-57. doi: https://doi.org/10.1198/0003130031081 

Bolthausen, E., & Wüthrich, M. V. (2013). Bernoulli's law of large numbers. ASTIN Bulletin: The Journal of the IAA, 43(2), 73-79. doi: https://doi.org/10.1017/asb.2013.11 

Button, K. S., Ioannidis, J. P., Mokrysz, C., Nosek, B. A., Flint, J., Robinson, E. S., & Munafò, M. R. (2013). Power failure: why small sample size undermines the reliability of neuroscience. Nature reviews neuroscience, 14(5), 365-376. doi: https://doi.org/10.1038/nrn3475 

Carney, D. R., Cuddy, A. J., & Yap, A. J. (2010). Power posing: Brief nonverbal displays affect neuroendocrine levels and risk tolerance. Psychological science, 21(10), 1363-1368. doi: https://doi.org/10.1177/0956797610383437 

Chén, O. Y. (2019). The roles of statistics in human neuroscience. Brain sciences, 9(8), 194. doi: https://doi.org/10.3390/brainsci9080194

Doyen, S., Klein, O., Pichon, C. L., & Cleeremans, A. (2012). Behavioral priming: It's all in the mind, but whose mind? PLoS ONE, 7(1), e29081. doi: https://doi.org/10.1371/journal.pone.0029081

Hunter, P. (2017). The reproducibility “crisis”: Reaction to replication crisis should not stifle innovation. EMBO Reports, 18(9), 1493–1496. doi: https://doi.org/10.15252/embr.201744876

Klein, R. A., Ratliff, K. A., Vianello, M., Adams Jr, R. B., Bahník, Š., Bernstein, M. J., ... & Nosek, B. A. (2014). Investigating variation in replicability: A “many labs” replication project. Social Psychology, 45(3), 142–152. doi: https://doi.org/10.1027/1864-9335/a000178

National Academies of Sciences, Engineering, and Medicine (2019). Understanding Reproducibility and Replicability. Reproducibility and Replicability in Science. https://www.ncbi.nlm.nih.gov/books/NBK547546/ 

Nieuwenhuis, S., Forstmann, B. & Wagenmakers, EJ (2011). Erroneous analyses of interactions in neuroscience: a problem of significance. Nat Neurosci 14, 1105–1107. https://doi.org/10.1038/nn.2886 

Nee, D. E. (2019). fMRI replicability depends upon sufficient individual-level data. Communications biology, 2(1), 130. doi: https://doi.org/10.1038/s42003-018-0073-z 

Malich, L., & Rehmann-Sutter, C. (2022). Metascience is not enough–a plea for psychological humanities in the wake of the replication crisis. Review of General Psychology, 26(2), 261-273. doi: https://doi.org/10.1177/10892680221083876

Munafò, M. R., Chambers, C., Collins, A., Fortunato, L., & Macleod, M. (2022). The reproducibility debate is an opportunity, not a crisis. BMC Research Notes, 15(1), 1-3. doi: https://doi.org/10.1186/s13104-022-05942-3 

Peng, R. D., Dominici, F., & Zeger, S. L. (2006). Reproducible epidemiologic research. American journal of epidemiology, 163(9), 783-789. doi: https://doi.org/10.1093/aje/kwj093 

Rabeyron, T. (2020). Why most research findings about psi are false: The replicability crisis, the psi paradox and the myth of Sisyphus. Frontiers in Psychology, 11, 562992. doi: https://doi.org/10.3389/fpsyg.2020.562992 

Ritchie, S. J., Wiseman, R., & French, C. C. (2012). Replication, replication, replication. The Psychologist.

Romero, F. (2019). Philosophy of science and the replicability crisis. Philosophy Compass, 14(11), e12633. doi: https://doi.org/10.1111/phc3.12633 

Stroebe, W. (2019). What can we learn from many labs replications? Basic and Applied Social Psychology, 41(2), 91–103. doi: https://doi.org/10.1080/01973533.2019.1577736

Szucs, D., & Ioannidis, J. P. (2020). Sample size evolution in neuroimaging research: An evaluation of highly-cited studies (1990–2012) and of latest practices (2017–2018) in high-impact journals. NeuroImage, 221, 117164. doi: https://doi.org/10.1016/j.neuroimage.2020.117164 

Vul, E., Harris, C., Winkielman, P., & Pashler, H. (2009). Puzzlingly High Correlations in fMRI Studies of Emotion, Personality, and Social Cognition. Perspectives on Psychological Science, 4(3), 274–290. https://doi.org/10.1111/j.1745-6924.2009.01125.x

Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470–477. doi: https://doi.org/10.1162/jocn.2008.20040

Yong, E. (2012). Bad copy. Nature, 485(7398), 298. doi: https://doi.org/10.1038/485298a 

 


Author

Christina G. Lutz


PhD candidate in the Developmental Neuroimaging Group, Department of Child and Adolescent Psychiatry of the University of Zurich
