If you have ever tried to keep a workshop clean while you are still building something, you know the trick is not the big cleanup at the end. It is the steady habit of putting tools back as you go, so the mess never gets a chance to take over.
That is roughly the shift many labs are facing with compliance. As Kimberly Remillard, a senior regulatory affairs manager at Thermo Fisher Scientific, put it, “With new technologies, labs are able to generate more data now than ever before.” The catch is that “ensuring that this data is traceable and auditable is a growing challenge for labs as they continue to digitize operations while new regulatory requirements and standards emerge.”
In other words, the work is not slowing down, and the expectations are not either.
More data, more pressure on the plumbing
Digitization has expanded what labs can capture, store, and analyze, but it has also raised the bar for documentation and integrity. Remillard framed it as a problem of scale and complexity: labs are managing “large and complex datasets” while trying to maintain “proper documentation and data integrity,” even as “regulations are constantly shifting across industries.”
That collision is pushing many labs toward systems designed to make compliance less of a periodic scramble and more of a built-in feature. Remillard pointed to core infrastructure tools as a practical starting point: “The need to ensure traceability, auditability and data integrity across the lab presents the need to integrate technologies that enhance their workflows that can make this compliance easier, such as a laboratory information management system (LIMS) or an electronic lab notebook (ELN).”
Her argument is not that LIMS and ELNs are new, but that what labs need from them is evolving. The software can no longer just store records. It has to support scrutiny.
From scheduled checkups to daily proof
Historically, many organizations treated compliance like a calendar item: important, sometimes painful, but bounded. Remillard suggested that model is breaking under modern data volumes and faster iteration cycles.
“Leveraging these technologies has changed the day-to-day lab workflow,” she said, adding that “the constant need for regulatory compliance is an added responsibility among lab teams.”
The bigger change is when compliance has to be demonstrated. “Teams are no longer tasked with vetting their operations to ensure compliance at regularly scheduled maintenance checks,” Remillard said. “Rather, compliance must be promised in each step of the workflow every day.”
In practice, that means labs need systems that make it difficult to do the wrong thing accidentally. It also means teams need to stop thinking of audit readiness as something you prepare for later. You prepare for it continuously.
Designing for auditability
Remillard emphasized that “labs should look to work with technologies that align with regulations and rulings like the EU GMP Annex 11” and “CFR Part 11” (the FDA's 21 CFR Part 11, which, like Annex 11, governs computerized systems and electronic records), because that alignment helps make “real-time compliance” achievable.
Her framing of “real-time compliance” is less about buzzwords and more about operational features that reduce risk without adding friction. “To ensure compliance and prioritize scientific discovery, labs must work with next-generation technologies to integrate real-time regulatory compliance into their software,” she said. “Labs must look for systems that enable easy auditability and that keep pace with newly publicized and drafted guidance.”
Indirectly, the message is that compliance tooling should not sit off to the side. It should live where the work happens, in instrument workflows, data capture, and reporting.
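To make “easy auditability” concrete, one common data-integrity pattern is a hash-chained audit trail, where each entry commits to the one before it so silent edits are detectable. The sketch below is purely illustrative; the field names and functions are hypothetical and not drawn from any specific LIMS, ELN, or regulation.

```python
import hashlib
import json
from datetime import datetime, timezone

def append_entry(trail, user, action):
    """Append a tamper-evident entry; field names are illustrative."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    entry = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,
        "prev": prev_hash,
    }
    # Each entry's hash covers its predecessor's hash, so changing any
    # earlier record breaks every link after it.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    trail.append(entry)
    return entry

def verify(trail):
    """Recompute every hash and link; False means the trail was altered."""
    for i, e in enumerate(trail):
        expected_prev = trail[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in e.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if e["prev"] != expected_prev or recomputed != e["hash"]:
            return False
    return True

trail = []
append_entry(trail, "analyst1", "sample 42 result recorded")
append_entry(trail, "qa_lead", "result reviewed")
print(verify(trail))
```

The point of the pattern is that audit evidence is produced as a side effect of doing the work, not assembled afterward, which is exactly the shift Remillard describes.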
Automation as the compliance engine
Remillard drew a straight line between “real-time compliance” and automation, especially for labs dealing with large data sets. “To meet the ‘real-time compliance’ requirements, labs must be automated,” she said. “This technology is no longer a ‘nice to have’ – it is a strategic imperative for labs working with large data sets.”
She also gave concrete examples of what “real-time compliance” can look like on the ground: “automated alerts that inform staff if any steps are missed,” systems that “notify if an instrument is out of spec or calibration,” and “instant reporting and analysis of large datasets to flag trends or discrepancies.”
Read another way, she is describing compliance as a feedback loop. When the system can detect deviations early, it can steer people back to the process before minor issues turn into audit findings, or worse, bad science.
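The feedback loop she describes can be sketched in a few lines: a check that runs at the moment a measurement is captured and raises alerts for missed calibration or out-of-spec values. The thresholds, record fields, and function names below are assumptions for illustration only, not features of any particular product.

```python
from datetime import date, timedelta

# Assumed policy values for the sketch; real labs would pull these
# from instrument metadata and method specifications.
CAL_INTERVAL = timedelta(days=90)   # assumed calibration cycle
SPEC_LOW, SPEC_HIGH = 6.8, 7.2      # assumed acceptable range (e.g., pH)

def check_reading(instrument, value, today=None):
    """Return alert strings for one measurement event, or [] if clean."""
    today = today or date.today()
    alerts = []
    if today - instrument["last_calibrated"] > CAL_INTERVAL:
        alerts.append(f"{instrument['id']}: calibration overdue")
    if not (SPEC_LOW <= value <= SPEC_HIGH):
        alerts.append(f"{instrument['id']}: reading {value} out of spec")
    return alerts

meter = {"id": "PH-007", "last_calibrated": date(2024, 1, 2)}
print(check_reading(meter, 7.5, today=date(2024, 6, 1)))
```

Because the check fires at capture time rather than at a scheduled review, a drifting instrument surfaces before it contaminates a batch of results, which is the early-steering behavior described above.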
AI raises the stakes, and could help carry the load
As labs bring more AI into workflows, Remillard argued they should expect a double effect: more scrutiny and more opportunity.
“Especially as AI is further integrated into the lab, it will not only require additional regulatory compliance efforts,” she said, “but it will help further enable ‘real-time compliance’ to help labs move faster to amend any issues that appear in experimentation.”
She tied that back to outcomes labs actually care about: “more reliable and consistent results and faster discoveries.”
That is a useful way to keep the conversation grounded. AI governance can become abstract quickly, but Remillard’s point is practical: if AI and automation are woven into the workflow correctly, they can tighten traceability, speed corrective action, and reduce human error, even as they introduce new validation and documentation expectations.
Compliance as a pillar, not a patch
Remillard’s underlying theme is that compliance cannot be bolted on after the fact. “Compliance should be a central pillar amid any lab’s software integration,” she said.
For lab leaders, that shifts how software decisions get made. Instead of evaluating tools only on features that boost throughput, teams also have to ask whether systems support audit trails, data integrity controls, and documentation that holds up when regulators or internal quality teams come calling.
Or, to go back to the workshop: you can still build fast. You just need a setup where the cleanup happens as you go.