By David Wang, General Manager, Informatics at PerkinElmer
Artificial intelligence (AI) is everywhere we turn—from smart cars, drones and music streaming, to social media, cell phones and banking.
AI (and machine learning) is also an innovation whose time has come in the lab. Researchers are looking for ways to more easily and effectively access, analyze and spotlight scientific data that is growing in volume and complexity and often dispersed across hard-to-access silos.
For scientists and technicians across life sciences, bio-pharmaceutical, food science and other R&D disciplines, the ability to make data-driven hypotheses and decisions is paramount. R&D labs can now harness advanced AI tools to do this, accomplishing in mere seconds or minutes what once took weeks or months.
Leveraging the unique capabilities of AI to accelerate this journey, however, starts with an understanding of the current state of scientific and operational data in the laboratory.
Here are five steps to help transition towards an AI-rich, Lab of the Future with confidence:
Liberate the data
Scientific data remains anchored to laptops, instruments, paper records, and data silos within and across today's organizations. Data has also been locked up for decades in "home grown" systems, data warehouses, and spreadsheets, with each data source in a proprietary format tied to a particular instrument, a unique analysis, or an individual.
The first major step in making laboratory data AI-friendly is to ensure that all experimental data and scientific conclusions can be easily accessed and accurately, securely shared, and that they are portable rather than tied to highly customized or proprietary systems.
Liberating data starts as simply as transforming files into standard formats—such as PDF or CSV—and ensuring that files are appropriately described (e.g., with the who, what, where, why, and how of the analysis). For example, making critical information like high content screening image data accessible beyond instrument-specific analytical software will provide access for others in the organization and foster collaboration and discovery acceleration.
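As a minimal sketch of this first step, the snippet below writes instrument readings to an open CSV file alongside a JSON "sidecar" describing the who, what, where and why of the analysis. The file names, field names and metadata keys are illustrative assumptions, not a real instrument's export API.

```python
import csv
import json
from datetime import date

# Hypothetical instrument readings to be liberated into an open format.
readings = [
    {"well": "A1", "signal": 0.82},
    {"well": "A2", "signal": 0.47},
]

# The "who, what, where, why" of the analysis, kept next to the data.
metadata = {
    "operator": "J. Smith",            # who
    "assay": "high content screen",    # what
    "site": "Lab 3",                   # where
    "purpose": "dose response",        # why
    "date": date.today().isoformat(),
}

# Write the measurements as plain CSV, readable by any analysis tool.
with open("plate_01.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["well", "signal"])
    writer.writeheader()
    writer.writerows(readings)

# Write the descriptive metadata as a JSON sidecar file.
with open("plate_01.meta.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

Once data sits in open formats like these, it can be indexed, aggregated and shared without the instrument vendor's software in the loop.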
Secure sharing technologies, such as cloud storage, also make data further accessible to a wide range of authorized collaborators.
Clearly define end goals
Even the best technologies cannot succeed if they are not thoughtfully applied to precise scientific goals and if the analytics are not clearly defined. In general, AI tools and solutions are most powerful when mapped to very specific goals and analytic targets.
For example, to identify patients who are most likely to respond to certain medical treatments, different AI tools would be employed than if doing predictive analysis on a drug's side effects in a clinical trial. Similarly, a different configuration of AI image recognition algorithms would be applied to classify tissues at risk of invasive cancer versus image recognition used to keep a self-driving car from hitting pedestrians in crosswalks.
The more clearly the end goals and key analytics are articulated at the outset (e.g., what is "in scope" and what is "out of scope"), the better the outcome will be and the more rapidly effective course corrections can be made.
Normalize data
Getting data formats analysis-ready before even asking AI to make sense of that data is critical, especially as data comes in multiple forms and from many sources, including health records, genetic data, public data, clinical trial data, cellular images and much more.
Here it is important to make the basis for analysis consistent. For example, if Patient X’s height is recorded in centimeters and Patient Y’s height is in inches, then analyses of the two without common units would result in erroneous conclusions.
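The height example above can be sketched as a small normalization step applied before any analysis. The record layout is illustrative; the point is that every value is converted to one agreed unit up front.

```python
CM_PER_INCH = 2.54  # exact conversion factor by definition

def height_in_cm(value, unit):
    """Convert a height reading to centimeters."""
    if unit == "cm":
        return value
    if unit == "in":
        return value * CM_PER_INCH
    raise ValueError(f"Unknown unit: {unit}")

# Patient X recorded in centimeters, Patient Y in inches (hypothetical data):
patients = [("X", 170.0, "cm"), ("Y", 66.9, "in")]

# Normalize everything to centimeters so the two are directly comparable.
normalized = {pid: height_in_cm(value, unit) for pid, value, unit in patients}
```

Rejecting unknown units outright, rather than guessing, is a deliberate choice: a loud failure at ingestion is far cheaper than a silent error surfacing in the analysis.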
Working towards data standards with commonly accepted descriptors, definitions and units, through the efforts of organizations like the Allotrope Foundation, is a major step in optimizing data aggregation and analysis and making results meaningful for AI. For example, ensuring that commonly accepted data standards are used when choosing AI to auto-map patient data to SDTM clinical trial data standards can greatly accelerate the power of the underlying analysis.
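At its simplest, that kind of mapping is a rename from source column names to standard variable names. The toy sketch below maps a source record to SDTM-style demographics variables; the mapping table is illustrative, and real SDTM conversion also involves controlled terminology and validation.

```python
# Illustrative mapping from hypothetical source columns to SDTM-style
# variable names (demographics "DM" domain). Not a complete SDTM converter.
SOURCE_TO_SDTM = {
    "patient_id": "USUBJID",
    "sex": "SEX",
    "age_years": "AGE",
}

def to_sdtm(record):
    """Rename mapped source fields to their SDTM variable names,
    dropping fields with no standard mapping."""
    return {SOURCE_TO_SDTM[k]: v for k, v in record.items() if k in SOURCE_TO_SDTM}

row = {"patient_id": "P-001", "sex": "F", "age_years": 54, "site_note": "n/a"}
standardized = to_sdtm(row)
```

An AI-assisted auto-mapper would propose the `SOURCE_TO_SDTM` table itself; agreeing on the target standard first is what makes those proposals checkable.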
Maximize operational and infrastructure data
An important part of the move to AI and the Lab of the Future is also optimizing operational and infrastructure data so that scientific results can be easily validated and reproduced.
To do this, it is critical to regularly analyze and apply operational and infrastructure data, such as temperature, humidity, power surges, and reagent use. Maintaining the temperature and humidity requirements of clean room facilities used for biologic drugs, for example, is key.
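A minimal sketch of that monitoring, assuming hypothetical acceptance limits and a simple readings log, is to flag any environmental excursion so the affected runs can be reviewed before their results are trusted:

```python
# Hypothetical clean-room acceptance ranges (low, high) per metric.
LIMITS = {"temperature_c": (18.0, 22.0), "humidity_pct": (35.0, 55.0)}

def find_excursions(readings):
    """Return (timestamp, metric, value) for each out-of-range reading."""
    excursions = []
    for r in readings:
        for metric, (lo, hi) in LIMITS.items():
            value = r[metric]
            if not lo <= value <= hi:
                excursions.append((r["timestamp"], metric, value))
    return excursions

# Illustrative readings log: the 09:00 temperature drifts above range.
log = [
    {"timestamp": "08:00", "temperature_c": 20.1, "humidity_pct": 45.0},
    {"timestamp": "09:00", "temperature_c": 23.4, "humidity_pct": 44.2},
]
```

Keeping this operational record alongside experimental results is what makes it possible to ask later whether an anomalous result coincided with an environmental excursion.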
Essentially, organizations can layer AI onto their lab infrastructure but if that infrastructure or foundation has high variability in instrument operational data and performance, the full benefits of the technology will not be realized.
Think solutions and services
Bringing AI into the lab is not just a software decision; it also requires end-to-end thinking, with an overall solution that can be sustained over time. User requirements; configuration plans; integration with other critical experimental workflows, software, hardware and instruments; instrument calibration/re-calibration; team training; and troubleshooting are all important aspects to consider holistically when planning.
For example, implementing a “point” AI technology without having a clear understanding of how this will affect the whole experimental ecosystem can easily lead to unexpected results.
Avoiding this requires identifying and then partnering with a strong team of internal and external players and experts to ensure that the full workflow (from scientific work, to test results, to data analyses) is taken into account.
Reaping the promise of AI
The promise that AI holds for laboratories is exciting.
Thoughtful preparation and solution readiness will pay off exponentially, multiplying the power of scientists and technicians to meet today's challenges and opportunities and push ahead to new horizons.