TY - JOUR
T1 - Efforts to enhance reproducibility in a human performance research project
AU - Drocco, Jeffrey A.
AU - Halliday, Kyle
AU - Stewart, Benjamin J.
AU - Sandholtz, Sarah H.
AU - Morrison, Michael D.
AU - Thissen, James B.
AU - Be, Nicholas A.
AU - Zwilling, Christopher E.
AU - Wilcox, Ramsey R.
AU - Culpepper, Steven A.
AU - Barbey, Aron K.
AU - Jaing, Crystal J.
N1 - Publisher Copyright:
© 2023 Drocco JA et al.
PY - 2023
Y1 - 2023
AB - Background: Ensuring the validity of results from funded programs is a critical concern for agencies that sponsor biological research. In recent years, the open science movement has sought to promote reproducibility by encouraging researchers to share not only finished manuscripts but also the data and code supporting their findings. While these innovations have aided third-party efforts to replicate calculations underlying key results in the scientific literature, fields of inquiry where privacy considerations or other sensitivities preclude the broad distribution of raw data or analyses may require a more targeted approach to promoting the quality of research output. Methods: We describe efforts toward this goal implemented in one human performance research program, Measuring Biological Aptitude, organized by the Defense Advanced Research Projects Agency's Biological Technologies Office. Our team implemented a four-pronged independent verification and validation (IV&V) strategy comprising 1) a centralized data storage and exchange platform, 2) quality assurance and quality control (QA/QC) of data collection, 3) test and evaluation of performer models, and 4) an archival software and data repository. Results: Our IV&V plan was carried out with assistance from both the funding agency and the participating research teams. QA/QC of data acquisition aided process improvement and the flagging of experimental errors. Holdout validation set tests provided an independent gauge of model performance. Conclusions: In circumstances that do not support a fully open approach to scientific criticism, standing up independent teams to cross-check and validate the results generated by primary investigators can be an important tool to promote reproducibility of results.
KW - data quality
KW - evaluation methodology
KW - reproducibility of results
KW - validation studies
UR - http://www.scopus.com/inward/record.url?scp=85204512781&partnerID=8YFLogxK
U2 - 10.12688/f1000research.140735.1
DO - 10.12688/f1000research.140735.1
M3 - Article
C2 - 39291139
AN - SCOPUS:85204512781
SN - 2046-1402
VL - 12
JO - F1000Research
JF - F1000Research
M1 - 1430
ER -