
As Anthropic rolls out Claude for Life Sciences, 10x Genomics is positioning its cloud as the analysis engine behind natural-language queries on giant single-cell datasets. “What we just launched is a meaningful first step toward that: an integration with Claude where we expose an MCP interface to the tools we have in our cloud,” said Michael Schnall-Levin, 10x Genomics’ chief technology officer. MCP, or Model Context Protocol, is an open standard that lets AI models connect to external tools and data sources through a common interface, so Claude can talk directly to 10x’s analysis environment rather than relying on ad hoc scripts or manual exports.
Those tools include 10x’s Cloud Analysis platform and its single-cell pipelines for aligning reads, generating Feature Barcode matrices, and performing clustering and other secondary analyses, all of which historically required a mix of scripting and command-line expertise or dedicated bioinformatics support. For now, the collaboration focuses on letting Claude call those existing workflows via MCP, so a scientist can ask for an analysis in plain English (or another common language) and have Claude configure and launch it without writing code. “There’s a whole additional layer of scientific analysis and insight we can build on top,” Schnall-Levin said, framing the current release as a meaningful starting point rather than a full AI co-investigator.
What the integration actually does
Under the hood, the 10x–Anthropic collaboration is essentially wiring Claude straight into 10x’s existing cloud toolchain. Through the MCP interface, Claude sees 10x’s Cloud Analysis environment as a menu of callable tools: set up a run, launch a pipeline, monitor status, pull down results and stitch them together. Instead of a bioinformatician writing and debugging shell scripts, the scientist describes the task in natural language and Claude translates that request into concrete calls against 10x’s cloud.
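To make the “menu of callable tools” idea concrete, here is a schematic sketch of how an analysis environment might be exposed to a model over a tool-calling protocol like MCP. Every tool name, parameter, and return value below is invented for illustration; 10x has not published the actual interface.

```python
import json

# Each tool is advertised to the model as a name, a description, and a
# parameter list -- the "menu" the model chooses from when it plans a run.
TOOLS = {
    "launch_pipeline": {
        "description": "Configure and start an analysis pipeline on named samples.",
        "parameters": {"pipeline": "str", "samples": "list[str]"},
    },
    "get_run_status": {
        "description": "Report whether a previously launched run has finished.",
        "parameters": {"run_id": "str"},
    },
    "fetch_results": {
        "description": "Download the output files of a completed run.",
        "parameters": {"run_id": "str"},
    },
}

def handle_tool_call(name: str, arguments: dict) -> dict:
    """Dispatch a model-issued tool call to the underlying cloud API."""
    if name not in TOOLS:
        return {"error": f"unknown tool: {name}"}
    # In a real server each branch would call the vendor's cloud API;
    # this stub just acknowledges the structured request.
    return {"tool": name, "arguments": arguments, "status": "accepted"}

# The model emits structured calls like this instead of shell scripts:
print(json.dumps(handle_tool_call(
    "launch_pipeline", {"pipeline": "scrna-seq", "samples": ["S1", "S2"]})))
```

The key design point is that the model never runs code directly: it only selects from the advertised tools, and the server validates and executes each request.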

10x describes Feature Barcode technology as “a method for adding extra layers of information to cells by running single cell gene expression in parallel with other assays.”
Those tools are the same ones 10x customers already rely on for single-cell and spatial work. Cloud Analysis can align sequencing reads, generate Feature Barcode matrices, run clustering and other secondary analyses, and aggregate results across experiments. In the traditional workflow, that meant knowing which pipeline to invoke, how to specify the right parameters and file paths, and how to route outputs into the next step. In many labs, that practical know-how lives with a single computational specialist.
The integration with Anthropic shifts that plumbing into the background. A scientist can point Claude for Life Sciences at a 10x dataset and ask for common tasks in plain English: “run the standard single-cell RNA-seq pipeline on these samples,” “cluster these cells and report the major populations,” or “combine these runs and put the key metrics in a table.” MCP handles the translation between those requests and the underlying 10x workflows, so researchers do not have to touch code or the command line at all.
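The translation step itself is the language model’s job, but its output has a definite shape: a plain-English request becomes a concrete, parameterized workflow call. The toy keyword router below only illustrates that shape; the pipeline names are made up, and the real system would rely on the model’s reasoning rather than string matching.

```python
def translate_request(request: str) -> dict:
    """Map a natural-language analysis request to a (hypothetical)
    pipeline call. A stand-in for the model's translation step."""
    text = request.lower()
    if "cluster" in text:
        return {"pipeline": "secondary-analysis", "step": "clustering"}
    if "combine" in text or "aggregate" in text:
        return {"pipeline": "aggregate-runs", "step": "merge-metrics"}
    if "rna-seq" in text or "pipeline" in text:
        return {"pipeline": "scrna-seq", "step": "full-run"}
    # Ambiguous requests fall through to a clarifying question.
    return {"pipeline": None, "step": "clarify-with-user"}

print(translate_request("cluster these cells and report the major populations"))
```

However the mapping is produced, the downstream cloud only ever sees well-formed, validated calls like these.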
For now, that is the scope: Claude can configure and launch existing 10x workflows, manage data and aggregate outputs, but it is not yet doing bespoke scientific reasoning on top of those results. Schnall-Levin described that gap as intentional.
He pointed to case–control comparisons, longitudinal analyses and tighter links to literature-mining tools as logical next steps. In other words, the current release turns Claude into a conversational front end for 10x’s analysis engine; the “AI co-investigator” layer is still to come.
Who this is for
The obvious beneficiary is the non-computational bench scientist who has been dependent on a bioinformatics colleague to get from raw reads to interpretable figures.
“I think it’s almost anybody, but probably skewed toward non-computational people,” Schnall-Levin said. “A lot of lower-level tasks, gathering data, doing basic trending, running different tools and stringing their outputs together, would previously have required a bioinformatician. Now a lab scientist can do some of that via the natural-language interface with an LLM.”
At the same time, he sees clear upside for the computational side of the house. Much of what the integration automates are the chores that soak up time but do not require deep scientific creativity.
“A lot of what we’re automating initially is the grunt work: plumbing data from the output of one step to the input of another, bringing data together, summarizing tables, taking CSV or Excel-type files, pulling out the right columns, and doing some statistics,” he said. “Those tasks are not very fun for bioinformaticians, but they can consume a decent amount of time.”
“It’s crazy how much time you can waste on things like different delimiters, odd formats, carriage returns. Or CSV files with dozens of columns,” he added.
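That grunt work is easy to picture in code. The sketch below handles exactly the annoyances Schnall-Levin lists: sniffing an unknown delimiter, coping with carriage returns, pulling out the right columns, and computing a basic statistic. The column names and file contents are invented for the example.

```python
import csv
import io
import statistics

# A small "CSV" that arrives with semicolons and Windows line endings.
raw = "sample;reads;genes\r\nS1;5000;1800\r\nS2;7200;2100\r\nS3;6100;1950\r\n"

# Detect the delimiter instead of hard-coding a comma.
dialect = csv.Sniffer().sniff(raw.splitlines()[0])
rows = list(csv.DictReader(io.StringIO(raw), dialect=dialect))

# Pull out one column and summarize it.
genes = [int(r["genes"]) for r in rows]
print(f"median genes detected: {statistics.median(genes)}")
```

Trivial on its own, but multiplied across dozens of files with slightly different formats, this is the kind of plumbing the integration is meant to absorb.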
In that sense, the integration is less about replacing bioinformaticians and more about letting both groups, coders and non-coders, spend more time on the scientific questions they actually care about.