Welcome to our weekly briefing on R&D headlines shaping technology, energy, manufacturing, and more. In this edition, we continue to explore how the Chinese startup DeepSeek is potentially rewriting AI training assumptions, the potential ripple effects on global energy demand, new tariffs upending automotive supply chains, plus highlights from the Idaho National Laboratory, Procter &…
This week in AI research: Latest Insilico Medicine drug enters the clinic, the $0.55/M-token model R1 rivals OpenAI’s $60 flagship, and more
While OpenAI charges $60 per million tokens for its flagship reasoning model, a Chinese startup just open-sourced an alternative that matches its performance—at 95% less cost. Meet DeepSeek-R1, the RL-trained model that’s not just competing with Silicon Valley’s AI giants but, in some configurations, running on consumer laptops rather than in data…
Sensor data, reimagined: When 90% less data can fuel 100x gains in efficiency in AI projects
For decades, the Nyquist-Shannon theorem—a foundational principle of signal processing—dictated that sampling a signal at a rate at or above twice its highest frequency was essential to capture its information without loss. Now, a Pennsylvania startup called Lightscline suggests we may be entering a “post-Nyquist era.” According to a recent Nature Scientific Reports paper, the company’s neural-network-based software, inspired…
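To see why the Nyquist-Shannon criterion has been treated as a hard floor, here is a minimal sketch (with an illustrative 100 Hz signal, not drawn from the paper) of what goes wrong when you sample below it: the undersampled signal becomes indistinguishable from a lower-frequency alias.

```python
import math

# Nyquist-Shannon: to reconstruct a band-limited signal whose highest
# frequency component is f_max, sample at a rate fs > 2 * f_max.
f_max = 100.0             # hypothetical highest frequency (Hz)
nyquist_rate = 2 * f_max  # minimum rate for lossless capture: 200 Hz

# Undersampling aliases: a 100 Hz sine sampled at fs = 120 Hz produces
# exactly the same samples as a 20 Hz sine folded about fs/2
# (up to a sign flip), so the two cannot be told apart after sampling.
fs = 120.0
samples_100hz = [math.sin(2 * math.pi * 100 * n / fs) for n in range(8)]
samples_20hz = [-math.sin(2 * math.pi * 20 * n / fs) for n in range(8)]
assert all(abs(a - b) < 1e-9 for a, b in zip(samples_100hz, samples_20hz))
```

Lightscline’s claim, in effect, is that for many AI tasks the useful information can be recovered from far fewer samples than this classical bound requires.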
How generative AI gave climate modeling a 25x speed boost
A team of researchers at the Allen Institute for AI and UC San Diego has introduced a climate modeling approach boasting 25× speed-ups, and corresponding energy savings, over the physics-based FV3GFS model it emulates. Specifically, the new method takes about 2 hours and 56 minutes to run a 10-year simulation; the physics-based FV3GFS model takes about 78 hours…
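The reported runtimes are enough to check the headline figure ourselves; the arithmetic below uses only the two numbers quoted above.

```python
# Runtimes for a 10-year simulation, as reported in the article:
emulator_hours = 2 + 56 / 60  # generative emulator: ~2 h 56 min
fv3gfs_hours = 78             # physics-based FV3GFS: ~78 h

speedup = fv3gfs_hours / emulator_hours
print(round(speedup, 1))  # 26.6 -- consistent with the reported ~25x
```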