
Rigor, Reproducibility, and Transparency in Shared Research Resources: Follow-Up Survey and Recommendations for Improvements

Keywords: core, reproducibility, rigor, shared research resource, transparency, quality

Published on Aug 16, 2022


Rigor, reproducibility, and transparency (RR&T) are essential components of all scientific pursuits. Shared research resources, also known as core facilities, are on the frontlines of ensuring robust RR&T practices. The Association of Biomolecular Resource Facilities Committee on Core Rigor and Reproducibility conducted a follow-up survey 4 years after the initial 2017 survey to determine if core facilities have seen a positive impact of new RR&T initiatives (including guidance from the National Institutes of Health, new scientific journal requirements on transparency and data provenance, and educational tools from professional organizations). While there were fewer participants in the most recent survey, the respondents’ opinions on the role of core facilities and level of best practices adoption remained the same. Overall, the respondents agreed that procedures should be implemented by core facilities to ensure scientific RR&T. They also indicated that there is a strong correlation between institutions that emphasize RR&T and core customers using this expertise in grant applications and publications. The survey also assessed the impact of the COVID-19 pandemic on core operations and RR&T. The answers to these pandemic-related questions revealed that many of the strategies aimed at increasing efficiencies are also best practices related to RR&T, including the development of standard operating procedures, supply chain management, and cross training. Given the consistent and compelling awareness of the importance of RR&T expressed by core directors in 2017 and 2021 contrasted with the lack of apparent improvements over this time period, the authors recommend an adoption of RR&T statements by all core laboratories. Adhering to the RR&T guidelines will result in more efficient training, better compliance, and improved experimental approaches empowering cores to become “rigor champions.”

ADDRESS CORRESPONDENCE TO: Andrew W. Ott, Office for Research, Northwestern University, 633 Clark Street, Evanston, IL 60208, USA (Phone: 847-467-1622; E-mail: [email protected]).

ADDRESS CORRESPONDENCE TO: Christopher W. Gregory, Office of Research Technologies, 120 Mason Farm Rd, Chapel Hill, NC 27599, USA (Phone: 919-843-6367; E-mail: [email protected]).

Conflict of Interest Disclosures: The authors declare no financial support or associated conflicts of interest.



Biomedical research is a process of exploring the unknown, deconstructing the complexity of life processes and the pathology of disease, and applying new discoveries to improve and advance the lives of humans, animals, and society. As scientists, we build on existing knowledge, taking incremental steps toward understanding with the occasional leap forward provided by a major discovery or paradigm shift.

Science advances through the publication of novel results and independent replication studies upon which others in the field build new hypotheses to better elucidate biologic processes. The ability to replicate studies requires that work in the laboratory is completed using best practices to ensure that the data collected is accurate, that reduced datasets used to derive conclusions are traceable to raw data, and that all procedures are sufficiently documented to ensure that the experiment can be independently recreated.

The procedures needed to enable replication of important results have been broadly placed under the umbrella of scientific rigor, reproducibility, and transparency (RR&T). Although different groups define the terms RR&T in multiple ways, some directly conflicting, the overriding objective is to ensure that the scientific data is valid and unbiased and that the projects can be replicated by other scientists schooled in the art.[1],[2],[3]

From this point of view, the Association of Biomolecular Resource Facilities (ABRF) Committee on Core Rigor and Reproducibility (CCoRRe) was formed to explore the challenges and opportunities of integrating reproducible research practices into the day-to-day operations of shared research resources (SRRs), also known as cores or core laboratories. The overarching CCoRRe goal is to increase awareness and provide guidance to SRR leaders and their staff members as they strive to operate in a rigorous, reproducible, and transparent manner.

In 2017, one of the first actions the committee took was to launch an open survey of managers, staff, and users of core laboratories to gain information on how the National Institutes of Health (NIH) initiatives on advancing scientific rigor and reproducibility influenced current services and new technology development. In addition, the survey aimed to identify the challenges and opportunities related to the implementation of new reporting requirements and to identify new practices and resources needed to ensure rigorous research.

The survey results that were published in the Journal of Biomolecular Techniques in 2019 revealed that whether or not SRR staff were aware of the NIH guidelines, they still had a deep understanding of what tools were needed to ensure RR&T in their daily operations.[4] However, they often felt powerless to implement changes within their institutions. These results were largely reproduced in a survey of primarily European core staff in 2020 conducted by the Interdisciplinary Neurobehavioral Core at Heidelberg University, indicating that insufficient resources exist to meet expectations related to RR&T, and the communication between cores and research groups should be improved to enhance the quality of research.[5]

In 2021, the CCoRRe launched a follow-up survey to determine if, over the last 4 years, the NIH guidelines on RR&T, the requirements for more transparency in scientific journals, and the efforts of professional and scientific associations providing educational tools for scientific rigor and integrity had positively impacted project management and SRR output.

Importantly, the CCoRRe survey aimed to also determine if the rapid decline in one-on-one training and in-person mentorship triggered by the pandemic would have an effect on maintaining RR&T. An additional goal was to shed light on how the tools used to increase scientific rigor have been applied in response to the COVID-19 pandemic and how that learning may, or may not, impact future plans for core operations.

Given the extensive coverage of RR&T in mainstream media,[3],[4],[6],[7],[8],[9],[10],[11],[12],[13],[14],[15] funding agency educational programs,[2],[16],[17],[18],[19] and scientific literature[20],[21],[22],[23] both before and after the 2017 survey was completed, there were reasons to be optimistic that behaviors would improve over this time period. In analyzing results of the survey, the authors attempt to understand if the evolution of views of core scientists on RR&T is in alignment with the views of the broader scientific community, including funding agencies and publishers/editors.

This information should be used to aid the ABRF and SRR managers in defining the next steps required to ensure that research completed using core facilities meets high-quality standards and maintains public support for national-level investment in basic research.


Survey design

The 2017 CCoRRe anonymous survey contained both multiple-choice and open-ended text questions that gave respondents the freedom to explore each subject without a priori or external influences. In contrast, the 2021 survey was built upon consensus responses obtained in the first survey as a tool to better explore the influence the last 4 years had on scientific RR&T as experienced by SRR staff, customers, and administrators. The 2021 online survey was administered using REDCap.[24] Invitations were sent to the Core Administrators Network Coordinating Committee of ABRF with a request that core administrators forward the survey within their institutions. All survey participants remained anonymous.

Data analysis

The 2021 survey contained both multiple choice and Likert scale questions. In cases in which open text responses were used in the 2017 survey, responses were categorized as previously described.[4] Respondents were allowed to select multiple categories, and/or “other,” as a response. In that case, respondents were able to provide text entries. If questions in the 2017 survey asked participants to write in a response, similar language was used for prompted responses in the 2021 survey.

Categories are paraphrased in the interest of brevity. In these cases, 2017 responses are compared to 2021 responses as a percentage along with the number of respondents that provided an answer to the specific question. In cases in which users provided open responses in the 2017 survey that were used to create prompted responses in the 2021 survey, only the 2021 responses are provided. Answers in “other” were used to determine if the prompts provided were representative of most common responses. Statistical analysis was performed on Likert scale questions by assigning a point value of 1 (most important) to 5 (unimportant). Prompts for preparedness for the COVID-19 pandemic and preparation for a future pandemic were identical and could be compared to each other.
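The scoring scheme described above can be sketched in a few lines. This is an illustrative toy example, not the authors' actual analysis code; the response values are hypothetical, and only the 1 (most important) to 5 (unimportant) convention is taken from the text, so a lower mean indicates higher perceived importance.

```python
# Toy illustration of Likert-scale scoring: 1 = most important,
# 5 = unimportant, so a LOWER mean means HIGHER perceived importance.
# Response values below are hypothetical, not survey data.

from statistics import mean

responses_2017 = [1, 2, 1, 3, 2, 1, 4, 2]  # hypothetical answers to one question
responses_2021 = [2, 1, 2, 2, 3, 1, 2, 2]

def likert_summary(responses):
    """Return the mean score and the number of respondents for a question."""
    return mean(responses), len(responses)

mean_2017, n_2017 = likert_summary(responses_2017)
mean_2021, n_2021 = likert_summary(responses_2021)

print(f"2017: mean={mean_2017:.2f} (n={n_2017})")
print(f"2021: mean={mean_2021:.2f} (n={n_2021})")
```

Because the prompts for the two pandemic-related questions were identical, per-question means computed this way can be compared directly between surveys.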


Survey review

A total of 100 participants started the 2021 survey, with 65 (65%) completing all questions, in contrast to the 2017 survey, in which 198 of 243 participants (81%) completed all questions. As was the case in the previous survey, the majority of respondents are core facility directors or managers (64% in 2017 versus 69% in 2021) and work in an academic setting (67% in 2017 versus 72% in 2021). As shown in FIGURE 1, the demographics in this survey, when categorized by core technology type, remain broad and of similar distribution to the previous survey.


Technologies supported by the survey respondents. Respondents were able to select more than 1 technology type.

Respondents who identified as ABRF members increased from 46% in 2017 to 67% in 2021, likely due to how the survey was advertised within the professional organization. More than 150 individuals responded within the first 5 days of the 2017 survey, whereas the second survey remained open for over 4 months and received a total of 100 partial responses.

Awareness of RR&T best practices continues to remain high, as only 16 of the 86 respondents admitted being completely unaware of any guidelines put in place by governmental agencies, private funding agencies, or major publishers addressing research reproducibility concerns.

Based on the 2017 survey, common categories were used to discover what factors contributed to a lack of RR&T compliance. These categories were used as prompts in the 2021 survey. The prompted responses in the 2021 survey appear to adequately describe the reasons for lack of compliance, with “pressure to publish” being the only category consistently provided unprompted. Not surprisingly, categories related to education (experimental design, training, and understanding technology) dominate, with prioritization (lack of funds/time) and lack of scrutiny at the time of publication (inadequate data/documentation, inappropriate tools, and irresponsible conduct of research) contributing significantly. When asked what procedures the core has in place to support reproducible research, responses between the 2 surveys were consistent, with most guidelines concerning RR&T already implemented within the cores (FIGURE 2). The high adoption rate of these procedures cannot be generalized across all cores, as this survey was voluntary, but it does indicate that the infrastructure to deliver high-quality, reproducible research exists in the community.


Procedures in place to support reproducible research.

However, little additional adoption of best practices occurred in the last 4 years, which may be attributed to cores focusing on operations during the pandemic. When asked what additional procedures should be implemented to improve RR&T and what the barriers are to implementing the procedures, as shown in FIGURE 3, responses were consistent again, indicating that needs have not changed significantly during this time.


Additional procedures or activities that should be implemented to improve research reliability.

“Assistance requested from ABRF” categories were statistically equivalent between the 2 surveys, with “online tools and resources” prioritized as the most important method that ABRF can offer to support cores. Given the emphasis on the education of individuals performing research, communication between cores and the research groups is of paramount importance. This communication should be encouraged at the institutional level and by the cores themselves. Unfortunately, as shown in FIGURE 4 and FIGURE 5, the majority of institutions and cores are not initiating such a discussion. It is therefore not surprising that three-quarters of respondents indicated never having received a request for a statement regarding RR&T.


Percent of respondents indicating that core provides documentation addressing rigor or reproducibility for services provided.


Percent of respondents indicating their institution has requested the cores’ participation in promoting rigor and reproducibility for the investigators.

Comparing answers from the same respondents, there is a very strong correlation between cores and institutions emphasizing RR&T and core customers using core expertise in grant applications and publications (Table 1). The reliance of principal investigators (PIs) on core expertise and input for grant applications and publications increases from less than 10% in cases in which neither the core nor the institution has emphasized RR&T to 75% in cases in which both do. Developing these statements is not onerous and appears to be highly effective in stimulating discussion at the grant stage.

Table 1

Correlation between cores/institutional initiation of discussion regarding rigor and reproducibility with requests from research groups for additional information


Have you received requests for a rigor and reproducibility statement for your services?

Has your institution asked your group to participate in any way in promoting rigor and reproducibility for the investigators?

Does your group have a document or statement addressing rigor and reproducibility for the services it provides?

Yes, to support papers

Yes, to support grants

Yes, other reason

Clearly, the pandemic strained core resources, as some SRRs had to increase operations to meet COVID-related demand for services, had limited access to facilities, faced budgetary issues, and experienced challenges with supply chains. Therefore, many resources that might otherwise have been utilized for RR&T-related projects were diverted to pandemic preparedness. Survey results shown in FIGURE 6 indicate that core staff primarily focused on communicating within the core laboratory network to formulate plans to respond to the pandemic. Regular meetings initiated by ABRF helped cores develop and disseminate best practices as well as provided a sense of community at a time when many were isolated.[25]


Pandemic response tools impacting RR&T.

Prompted categories appeared to describe the breadth of actions taken by the cores. The only consistent response in the “other” category was to expand the use of online training and remote access to instrumentation. Interestingly, many of the other strategies aimed at increasing efficiencies overlap substantially with best practices related to RR&T, including the development of standard operating procedures (SOPs), supply chain management, and cross training. This would indicate that there is a strong economic driver to improve RR&T in core laboratories but that the work is not being consistently prioritized. The most common request for assistance concerned the ABRF: survey participants indicated that it is critical that the ABRF provide examples of best practices to cores regarding SOPs, supply chain management, and communication.

Review of RR&T tool development and impact to grants at NIH

Since the NIH called for reform and improvements related to RR&T, many NIH notices[26] relating to rigor and transparency have been issued, and many online training modules,[17] policies, manuals, and tools[27] have been developed.[28] Given the multifaceted nature of conducting and publishing science, Koroshetz et al. have proposed establishing a network of “rigor champions”—trainees, researchers, educators, institutional leaders, journal editors and reviewers, scientific society members, and funding organizers—who are all committed to promoting RR&T.[29] Linked to this paper, the National Institute of Neurological Disorders and Stroke has created a “Rigor Champions and Resource” website that includes a table with more than 170 RR&T-related resources.[30]

A recent survey was conducted across a range of disciplines to better understand scientific experiments and research practices for RR&T and decipher what hinders reproducibility.[31] Results from 101 researchers confirmed that the reproducibility of scientific results is an important concern and that the primary reasons for poor RR&T included lack of publicly available data for use, lack of sufficient metadata, and lack of complete information in the methods/SOPs/protocols. Similarly, the results of the current CCoRRe RR&T survey demonstrate that core directors working in SRRs understand the value of championing rigor but also continue to struggle with poor experimental design, lack of training, lack of standardized protocols, and inadequate documentation from their customers, leading to inadequate RR&T (Table 2).

Table 2

Factors contributing to lack of compliance with RR&T guidelines



Poor experimental design


Lack of training, mentorship, oversight


Lack of standardization in protocols and data analysis


Poor understanding of technologies/lack of training


Inadequate documentation (experiments or data management)


Lack of funds


Lack of time


Inappropriate experimental or analytical tools


Inadequate peer review


Irresponsible conduct of research


Pressure to publish*


*Most common unprompted reason provided by respondents.

We propose that SRRs are well suited to act as “rigor champions” and as key contributors to the scientific process, and in that vein, they should apply their expertise to study design, training, SOPs, and method validation and documentation, thereby improving RR&T practices and serving as the “front line” in enhancing the materials/methods sections of papers and data-sharing practices.

The 4 elements of the NIH policy on enhancing reproducibility through rigor and transparency are scientific rigor, scientific premise, authentication of key resources, and control for sex/other relevant biological variables.[32] Some progress has been reported related to sex as a biological variable (SABV).[33] In the 2019-2023 Trans-NIH Strategic Plan for Women’s Health Research, the emphasis is placed on end-to-end integration of sex and gender in the biomedical research enterprise, supporting a future in which every woman is offered evidence-based disease prevention and treatment. Although this is an important strategy, grant funding decisions and scientific studies and publications will demonstrate the scientific community’s embrace of SABV as an essential component of RR&T.

SRRs must also play a key role in study design and data analysis to ensure compliance with these standards. Given the proposed NIH elements for enhancing RR&T, a qualitative study of successful PIs was conducted to assess research excellence and integrity.[34] The authors of the study interviewed 52 PIs who all were required to meet the following criteria: they had to be federally funded researchers conducting high-quality, high-impact research and have reputations for professionalism and integrity.

More than 50% of the respondents cited that the following 8 practices foster rigor and compliance: holding regular team meetings, encouraging shared ownership and decision-making, providing supervision and guidance, ensuring sufficient training, fostering positive attitudes about compliance, scrutinizing data and findings, expressing value and expectations, and establishing and following SOPs. Similarly, SRRs are encouraged to observe these practices as they participate in scientific enterprises. Quantitative and qualitative measurements of RR&T practices in SRRs and laboratories are essential to empowering management and continuous improvement. At the time that this manuscript was prepared, the NIH had not published metrics reflecting the impact of RR&T guideline implementation on the reproducibility of federally funded research.

Menke et al. describe a new tool called “SciScore” that uses natural language processing and machine learning and can be used by journals and authors to drive RR&T compliance.[35] Developed with the ability to recognize 15 different rigor criteria in the materials and methods sections of papers and assign a score, this tool can be used to generate a rigor and transparency index (RTI) across journals.
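A crude flavor of criteria-based scoring can be conveyed with a toy sketch. To be clear, this is not SciScore's algorithm (which, as noted above, uses natural language processing and machine learning over 15 rigor criteria); the criterion names and keywords below are illustrative assumptions only, and real tools must handle far more linguistic variation than simple substring matching.

```python
# Toy keyword-based sketch of scoring a methods section against a rigor
# checklist. NOT SciScore's actual method; criteria and keywords here
# are illustrative assumptions.

RIGOR_CRITERIA = {
    "blinding": ["blinded", "blinding"],
    "randomization": ["randomized", "randomization"],
    "power_analysis": ["power analysis", "sample size calculation"],
    "cell_line_authentication": ["authenticated", "str profiling"],
    "antibody_validation": ["rrid", "antibody validation"],
}

def rigor_score(methods_text: str) -> float:
    """Fraction of checklist criteria mentioned in the methods text."""
    text = methods_text.lower()
    hits = sum(
        any(keyword in text for keyword in keywords)
        for keywords in RIGOR_CRITERIA.values()
    )
    return hits / len(RIGOR_CRITERIA)

methods = (
    "Mice were randomized to treatment groups and investigators were "
    "blinded to group allocation. A power analysis determined sample size. "
    "Antibodies are reported with RRID identifiers."
)
print(f"Rigor score: {rigor_score(methods):.2f}")  # 4 of 5 criteria found
```

Averaging such per-paper scores across a journal's output is, conceptually, how a journal-level index like the RTI can be constructed.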

The authors report that the average RTI score for biomedical research has more than doubled between 1997 and 2019, indicating improvement in RR&T. However, the bulk of these advancements occurred prior to 2015, indicating a recent reduction in focus on continuing RR&T improvements. Since 2010, publications that include the terms “rigor,” “reproducibility,” or “transparency” in the title have continued to increase their RTI scores annually (FIGURE 7).


Publications by year on the topic of RR&T and retractions (source: Retraction Watch). Search details in supporting information.

For reference, during this same period of time, publications in science and engineering have increased by ~23%.[36] The current CCoRRe survey does not indicate that this apparent increased focus on RR&T in published papers is necessarily reflective of practices by SRR customers. Indeed, the number of retractions because of irreproducible methods, issues with starting materials/antibodies, or misconduct in the basic life sciences, physical sciences, and health sciences during this time period is also increasing, according to data from Retraction Watch.[37] Given this phenomenon, scientists in this survey appear to be witnessing firsthand what is occurring in the literature. It is essential that SRRs play a key role in ensuring that materials and methods are reported completely and transparently, further improving the RR&T of published papers and maintaining focus on the topic.


The 2021 CCoRRe survey indicates that little has changed in the 4 years between surveys in the view of core facility directors, the primary respondents to this survey. In fact, if anything, the main observation appears to be fatigue in this area, as evidenced by the reduced response rate in the most recent survey. There is no way to determine if this fatigue is due to the pandemic, indifference, or another cause. However, it is clear that SRRs play a critical role in RR&T, as most solutions to reproducibility issues are technical in nature and technique specific.

Three recent reports from the National Academies of Sciences, Engineering, and Medicine examine this issue and recommend actions that can improve reproducibility and replicability in research.[2],[16],[19] All 3 reports emphasize that there is a strong connection between openness and transparency and more reliable and trusted science. Transparency about methods, data, computer code, and other aspects of research is crucial to efforts to reproduce and replicate research. Persistent efforts by many stakeholders—researchers, research institutions, funders, and publishers—will be needed to move the research enterprise toward greater openness and transparency in ways that support reproducibility and replicability.

The most important step SRR directors can take to improve RR&T is to initiate discussions with PIs related to transparency, quality, and replicability of work. Core leadership consistently cites the adoption of SOPs, consultation, and education as the most important activities needed to improve RR&T. Given that most respondents indicate that their core does not even have a statement addressing RR&T, there is little reason for individual researchers to view the core staff as experts on the topic. Additionally, SRRs are positioned to increase transparency and decrease incidents of misconduct by publicizing SOPs and creating data flows that preserve raw data used in publications.

What steps should SRRs take to address the RR&T challenge facing science? We propose the following:

  • set the standards by developing an RR&T statement that can be prominently displayed on SRR websites and/or in the laboratory (Table 3),

  • operate according to the adopted RR&T guidelines,

  • educate SRR customers during a mandatory and initially free consultation to explain the standards, and

  • enforce RR&T standards to ensure that all experimental protocols managed by the SRR are accurately reflected in the methods sections of papers and that all the data generated by the SRR are available and supported by metadata (when applicable).

    Table 3

    Nine steps to rigorous and reproducible experiments

    1. If using a core facility, consult with the core staff in the planning stage. Consult with a statistician if you need help developing a power analysis to assure that your results will be adequately powered.

    2. Design your experiment with sufficient and appropriate controls (rigor) and replicates (reproducibility).

    3. Assure that ALL of your reagents (antibodies, cell lines, mice) are fully validated.

    4. Have a clear and detailed protocol (SOP) and data analysis plan. Assure that the protocol is strictly followed or that any deviation is well documented.

    5. Assure that the staff or students performing the experiment are well trained and understand each step and the importance of performing them precisely.

    6. Use only well-maintained instrumentation, preferably maintained and operated in a core facility with expert staff (see #1 above).

    7. Document all steps, reagents, equipment, and data analysis methods used in the experiment.

    8. Assure that both the documentation and the original, unprocessed data are properly stored in a safe data management repository.

    9. Acknowledge all grants that support the core, the core (by name), and core staff in publications.

To assist with step 1, a template for RR&T content for SRR websites and practical RR&T guidance and checklists are being developed at many institutions. These best practices can be accessed through a single site; new sites will be added as authors are made aware of them. Core laboratories are encouraged to adopt and adapt this template for their specific core needs. Once the RR&T statement is public, SRRs are encouraged to publicize its existence to core customers and to provide brief overviews to all customers utilizing the SRRs. While the authors understand the potential limitations of this approach (willingness of SRR customers to comply, ongoing poor study design, exclusion from manuscripts, etc.), we also posit that this is an important first step in ensuring RR&T compliance and supporting better experimental approaches. As “rigor champions,” SRRs can lead the way in improving research reliability.
