Lab equipment heavyweight Thermo Fisher Scientific is the latest to pair up with NVIDIA during this week’s J.P. Morgan Healthcare Conference in San Francisco, as large suppliers and AI platform companies jockey to define what “AI-ready labs” actually look like in practice.
Announced the same day was a Lilly-NVIDIA joint investment, potentially worth $1 billion over five years, which will include a physical co-innovation lab in the Bay Area opening by the end of March.
Thermo Fisher announced a pact with OpenAI in October to “embed advanced artificial intelligence across its clinical trials business.”
By comparison, the Thermo-NVIDIA news was lighter on details. Thermo Fisher did say the effort will combine its scientific instruments and lab software with NVIDIA’s AI platform to “progressively increase” lab automation, accuracy and speed. The alliance will emphasize connecting instruments, infrastructure and data to AI tools that can reduce manual steps in experiment design, sample prep, instrument runs and analysis.
On the NVIDIA side, the companies pointed to infrastructure and model tooling including DGX Spark, plus NeMo and BioNeMo, as core building blocks. NVIDIA executives also framed the effort as part of a move toward “lab-in-the-loop” science, where AI systems, agents, and instruments are tightly coupled so that labs can iterate faster, with fewer manual steps between hypothesis, experiment and analysis. Thermo Fisher is positioning itself as a systems integrator for modern lab operations: instruments plus informatics plus automation, with AI embedded across the stack rather than bolted onto a single workflow.
NVIDIA has recently announced a growing roster of deals with companies ranging from IQVIA to Illumina, Mayo Clinic and Arc Institute.
In related news, TetraScience announced a separate AI-focused partnership with Thermo Fisher on the same day. The Thermo-TetraScience pact will focus less on instrument-side automation and more on the persistent obstacle that slows many "AI in the lab" ambitions: scientific data trapped in incompatible formats across vendors and software systems. TetraScience says its vendor-agnostic "Scientific Data Foundry" and "Scientific Use Case Factory" help standardize experimental outputs into an "AI-native" form.