In a repurposed Electrolux factory in Memphis, xAI trained Grok 3 on a supercomputer dubbed “Colossus,” which the company says includes 200,000 Nvidia H100 GPUs. Elon Musk and senior xAI employees described Grok 3 as “the smartest AI on Earth” in a recent livestream, citing each GPU’s peak throughput of up to 4 PFLOPS (four quadrillion…
25 landmark R&D-heavy tech funding rounds of 2024
[Updated December 18, 2024 with new details on Databricks] In 2024, AI-focused startups continued to dominate the funding landscape, with OpenAI ($6.6B), xAI ($6B), and Anthropic ($4B) leading the pack. Adding to this trend, Databricks, the company behind MLflow and the data lakehouse, recently secured a $10 billion funding round that was primarily equity-based, at…
Amazon, Apple and NVIDIA compete for GPU talent with salaries of $300k or more
Tech giants are assembling AI supercomputers of unprecedented scale. Microsoft, Meta, Amazon, and Elon Musk’s xAI are building clusters boasting 100,000 or more H100 GPUs, with xAI planning to double its Colossus system to 200,000 GPUs — including the upcoming H200 model. To put this in perspective, OpenAI reportedly trained GPT-4 using roughly 25,000 A100…