
Image generated in the FLUX playground from Black Forest Labs
OpenAI CEO Sam Altman is rumored to be staffing up Merge Labs, an unannounced brain-computer interface venture that points away from skull-opening implants and toward ultrasound-based sensing. Reporting from The Verge says Altman tapped Caltech engineer Mikhail Shapiro, known for ultrasound neurotech and gene-encoded “acoustic reporter” cells, suggesting a non-implanted, “read-only” path to decode intent without electrode arrays. At a press dinner in August, Altman put it simply: “I would like to be able to think something and have ChatGPT respond to it… Maybe I want read-only.” The Financial Times reports Merge aims to raise about $250 million at an $850 million valuation, potentially from OpenAI itself. Details on product and timelines remain scant, but the technical direction is clear: interface by sensing, not surgery.
To date, the only “Merge Labs” with a public site is a crypto venture unrelated to Altman’s effort. The rumors are moving faster than the company’s web presence, but if they hold up, Merge’s focus would fit an emerging pattern with roots in post-humanist writing from Ray Kurzweil, whose books The Singularity Is Near (2005) and The Singularity Is Nearer (2024) project a human-machine merge by 2045. Kurzweil told The Guardian he expects intelligence to “expand a millionfold” by 2045. Altman is in a similar camp. In his 2017 blog post on “the merge,” he wrote: “I believe the merge has already started… Our phones control us and tell us what to do when; social media feeds determine how we feel; search engines decide what we think.”
Soft merge vs. hard merge
In his 2017 essay, Altman pegged the merge between humanity and machines at “between 2025 and 2075.” The ultrasound play would be the softer bet on that timeline: an interface by sensing, not surgery. On the other side of the split screen is OpenAI co-founder turned rival Elon Musk, whose Neuralink explicitly frames implants as a way to keep pace with AI. Musk’s line has barely changed since 2019, when he described a “scalable high-bandwidth brain-machine interface system,” adding that humans would “have the option of merging with AI.” In essence, Neuralink is a bet on the hard merge: invasive hardware directly integrated into the nervous system.
For now, the dreams remain lofty. Neuralink’s system requires a small craniotomy: a sewing-machine-style robot inserts flexible polymer threads carrying about 1,024 electrodes into the motor cortex. The approach optimizes signal quality, but the trade-offs are surgery and device longevity. In early 2024, the first human participant experienced thread retraction, which cut the number of usable channels. Neuralink later restored functionality through software adjustments, and a second participant, implanted later, avoided thread retraction, according to the company.
If Merge succeeds with the ultrasound path, it would sense brain activity through the intact skull, potentially aided by acoustic reporter genes that make targeted neurons visible to ultrasound. That trades scalpels for physics and biology: through-skull imaging loses resolution, and making neurons acoustically responsive implies gene delivery with its own safety and regulatory hurdles.
Not instant kung fu
While the nearer-term promise of brain-computer interfaces is therapeutic, both Altman and Musk are ultimately chasing capabilities closer to the 1999 science-fiction film The Matrix, in which characters downloaded martial-arts skills directly into their brains through neural ports. Taken at face value, both men gesture at Matrix-style upgrades: Musk talks about “high-bandwidth” symbiosis to keep pace with AI, while Altman muses about thinking a thought and getting a ChatGPT response, a read-only link that lowers the friction of using an AI agent. But writing durable skills into the brain is a different problem. Today’s best read systems decode motor intent or speech with implanted electrodes, while write systems can evoke simple perceptions or nudge learning, not install black-belt reflexes.
Visual-cortex stimulation studies using devices like Second Sight’s Orion system have shown that dynamic electrical stimulation can trace letter shapes that blind and sighted people can perceive and recognize. Hippocampal “memory prosthesis” research at USC and Wake Forest demonstrated modest improvements in recall for specific memory tasks, about 37% above baseline in epilepsy patients, but these remain narrow, clinical prototypes. Ultrasound neuromodulation studies show promise for enhancing motor skill acquisition and modulating neural plasticity in controlled laboratory settings, yet through-skull sensing and standardized safety guidance for consumer applications remain active areas of research.
BCIs are already helping people today
While the long-term mass-market possibilities remain hypothetical, BCIs are seeing growing traction in medicine. Neuralink implanted its third patient in early 2025 (participants so far have quadriplegia or ALS); the company’s first recipient, Noland Arbaugh, now uses the device about 10 hours daily to control his computer, play chess, browse the web, and text—tasks that previously took him minutes per message. Synchron demonstrated an ALS participant controlling an iPad in August 2025 using Apple’s new BCI Human Interface Device protocol, treating neural signals as native input alongside touch and voice. Academic teams have pushed speech BCIs to 62–78 words per minute in lab settings using implanted electrodes. And Blackrock Neurotech, the longest-running player with human implants since 2004, has supported over 30 patients in research trials; one patient has used their Utah Array for more than nine years.



