The impact of this crisis on individual researchers can be profound. In a recent webinar, for which R&D World served as the media partner, Uraisha Magan, R&D Lead at New Form Foods, shared her personal experience: “In academia, I tried to replicate growing cells that were supposed to mimic an intestine. I spent 11 months trying to grow the cells, and it just didn’t work. I read numerous publications, and interestingly, many of the methodology sections were quite similar, as if it was a routine procedure. But I couldn’t replicate it.”
While the problem extends across scientific disciplines, given the “publish or perish” pressures that create perverse incentives, pundits debate its scope. A large-scale replication effort published in Science in 2015 found that roughly 60% of the psychology studies it examined failed to replicate; some researchers pushed back on the breadth of those conclusions, but the pattern is not confined to one field. In 2023, a record-breaking number of more than 10,000 scientific papers were retracted, as Nature noted. The concern itself is not new: in 2005, John Ioannidis, a professor at the Stanford School of Medicine, penned the seminal essay “Why Most Published Research Findings Are False,” published in PLOS Medicine.
Real-world financial consequences
The replication crisis isn’t just an academic concern; it has real-world consequences, especially for industries like biotech that rely heavily on scientific research. As Magan points out, “In biotech… we rely on fundamental research findings from publications. The common consequence of the replication crisis is the loss of time. In industry, the bigger loss is money.”
This reliance on potentially unreliable research can hinder innovation, especially in emerging fields. The cultivated meat sector, still quite novel, lacks a deep base of confirmed R&D documentation. “The biggest issue in this new sector is the scalability of products, and information in the public domain is really scarce,” Magan said.
Beyond the immediate financial costs, failed replications can have long-lasting economic consequences. If, for instance, a pharma company invests heavily in developing a drug based on irreproducible research, it can lead to tangible financial losses and setbacks for the entire industry. The problem could also extend to many young companies. “In a startup, cost is a big factor,” Magan said. “Having products based on science that’s not reproducible could lead to many other consequences, such as the company’s reputation going down, loss of customers, as well as investors pulling out because the science isn’t reproducible.” Additionally, a lack of trust in research findings can make investors wary of funding new ventures, potentially stifling innovation.
Failed replications can also damage morale and reputation. Researchers can be left feeling demoralized, second-guessing their skills when the problem lies with the irreproducibility of the original research. For companies, building products on shaky scientific foundations can lead to reputational damage and lost consumer trust. “Researchers in industry also feel this, blaming themselves when things aren’t working out, but it’s actually because the science isn’t reproducible,” Magan observed.
Tackling the reproducibility crisis in R&D
So, how can the scientific community and R&D-driven industries combat the replication crisis? While there is no single easy solution, the path forward is clear — prioritizing transparency, rigor, and collaboration — along with a long-term commitment to building a more robust and reliable scientific foundation. “I see a lot of progress already in making sure the science we do is reproducible and trustworthy,” said Youyou Wu, who also spoke in the webinar. “Seeing these changes, infrastructures being built, and practices being improved gives me hope that we’re on track to steering science back to what most people, myself included, originally expected of it.”
Documentation and transparency
As Magan emphasizes, clear and concise documentation is paramount. “The most important thing is clear and concise documentation of all experimental work, including failed experiments, because there’s a lot to learn from those,” she states. This documentation should go beyond just the successful outcomes, capturing the full context of the research process, including any deviations or unexpected results.
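The documentation practice Magan describes — recording every run, including failures and deviations — can be made concrete with a structured record. The sketch below is purely illustrative (the fields, IDs, and example values are invented, not drawn from New Form Foods’ actual system); it shows how failed experiments stay queryable alongside successes rather than disappearing.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ExperimentRecord:
    """One experiment's full context, kept even when it fails."""
    experiment_id: str
    protocol: str                  # reference to the exact protocol version used
    deviations: list = field(default_factory=list)
    outcome: str = "pending"       # "success", "failure", or "inconclusive"
    notes: str = ""
    run_date: date = field(default_factory=date.today)

log = []

# Hypothetical failed run, logged with the same care as a success.
log.append(ExperimentRecord(
    experiment_id="EXP-042",
    protocol="intestinal-cell-culture-v3",
    deviations=["incubation extended 2h due to equipment delay"],
    outcome="failure",
    notes="Cells did not adhere; suspect media batch variability.",
))

# Failed runs remain searchable, so the team can learn from them later.
failures = [r for r in log if r.outcome == "failure"]
```

Capturing deviations as first-class data is what makes the record useful for replication: a later reader can see not just what the protocol said, but what actually happened.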
Equally important is fostering a culture of transparency and collaboration. Magan advises, “Transparency within your team is crucial — all outputs from your experiments should be shared.” This open sharing of data, protocols, and findings allows for scrutiny, verification, and the identification of potential flaws or inconsistencies.
Rigorous methodology and statistical analysis
“In science, general practices include having independent experiments with their own replicates, writing protocols concisely, and always using statistical analysis to interpret data,” Magan said. Embracing these practices, along with taking advantage of online platforms and tools for data sharing and collaboration, can help build a more robust and reliable scientific foundation.
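To make the “independent experiments with their own replicates, interpreted with statistics” practice concrete, here is a minimal sketch using invented optical-density readings pooled from hypothetical independent runs. It computes Welch’s t statistic (chosen here because it tolerates unequal variances between groups); a real analysis would also report degrees of freedom, a p-value, and per-experiment effects.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical readings: control vs. treated, pooled from independent runs.
control = [0.52, 0.49, 0.55, 0.51, 0.50, 0.53, 0.48, 0.54, 0.52]
treated = [0.61, 0.66, 0.63, 0.65, 0.60, 0.64, 0.62, 0.67, 0.63]

def welch_t(a, b):
    """Welch's t statistic: robust to unequal variances between groups."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

t = welch_t(treated, control)
# A large |t| suggests the group difference is unlikely to be replicate
# noise alone; it is not, by itself, evidence the finding will replicate.
print(f"Welch's t = {t:.2f}")
```

The point of replicates across independent experiments is that a statistic like this is computed against genuine between-run variability, not just pipetting noise within a single run.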
Standardization and digital tools
In industry settings, Magan emphasizes the value of standard operating procedures (SOPs): “In industry specifically, we rely on standard operating procedures… We write them down, and every person is trained on them, creating redundancy in the team.” These SOPs provide clear, step-by-step instructions for every research process, minimizing variability and ensuring consistency across experiments.
Digital tools play a crucial role in modern R&D. Magan highlights, “We use Labstep ELN from STARLIMS, which is an online platform tool where all experiments, datasets, protocols, and inventories are stored in one place.” Electronic lab notebooks (ELNs) like Labstep can streamline documentation, data management, inventory tracking, and collaboration, making it easier for researchers to maintain accurate records and share information effectively.
Changing publication practices
Shifting to publication formats that prioritize methodological rigor over positive results could drastically reduce the pressures that lead to questionable practices. Youyou Wu highlights the “registered report” format, where publication is decided based on the study design, not the outcome. One of her collaborators saw his research rejection rate plummet “from 70% to zero” after adopting this approach.
Taking advantage of AI and machine learning tools
While AI is often overhyped, machine learning can be a powerful weapon in the battle against the replication crisis. Wu explained how machine learning can predict the replicability of studies before they’re even replicated: “I’m currently working on predicting replication outcomes before a replication study takes place using different methods.” This could help prioritize research efforts and allocate resources more effectively.
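A toy sketch of the underlying idea — not Wu’s actual method — is to train a classifier on simple features of the original paper. Everything below is invented for illustration: the features (original p-value, log sample size, effect size), the labels, and the data; real replication-prediction work uses far richer signals.

```python
from math import exp

# Features per study: (original p-value, log sample size, effect size).
# Label: 1 = replicated, 0 = failed to replicate. All values hypothetical.
studies = [
    ((0.001, 5.0, 0.80), 1),
    ((0.004, 4.5, 0.60), 1),
    ((0.030, 3.2, 0.25), 0),
    ((0.048, 3.0, 0.15), 0),
    ((0.010, 4.8, 0.55), 1),
    ((0.045, 2.9, 0.20), 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

# Logistic regression fit with plain stochastic gradient descent.
w = [0.0, 0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):
    for x, y in studies:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        err = p - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

def predict(x):
    """Estimated probability that a study with features x replicates."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

strong = predict((0.002, 4.9, 0.70))  # low p, large n, big effect
weak = predict((0.049, 3.0, 0.18))    # barely significant, small n
```

Even this toy version captures the practical appeal: a cheap prediction of which findings are fragile lets a lab or company decide where an expensive replication attempt is most worthwhile.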
On interdisciplinary collaboration
Collaboration between academia and industry can lead to more robust and relevant research. Magan notes, “Industry collaborations can support replication efforts by independently validating experiments from academia…Academia could also benefit from sending students as interns to industry partners for real-world experience.” Wu adds that because of different incentives, “bringing an industry partner into research projects often inherently boosts accountability for more robust, replicable research that actually works.”
The path forward
By changing the incentives, tapping technology, and fostering collaboration, the scientific community can pave the way for more reliable and trustworthy research. The diffusion of technology can be an asset here. “Everything is online — we use project management systems, Google Sheets, and Google Docs,” Magan said. “There’s a culture of transparency within New Form where everybody has access to information.”
In the webinar, Wu also underscored the role of constructive criticism in addressing the replication crisis, arguing that it’s essential for progress. “I think what we need is a paradigm or space or some infrastructure for criticism,” she said. “What comes along with the replication crisis inevitably is a lot of criticism against research done in the past and against individuals who engaged in this research. We need constructive criticism to move things forward in a more systematic way.”