
While the current surge in AI interest is likely to cool eventually, it would be wrong to suggest that an AI winter is coming any time soon.
Consider that Avnet’s latest survey of 1,200+ electronics engineers found that eight out of 10 are actively engaged in AI, with 42% already shipping AI-based products and another 40% currently integrating AI into upcoming designs. And it’s not just about products themselves—fully 95–96% of respondents see AI as “somewhat” to “extremely likely” to reshape multiple aspects of the product-design process.
Oh, and AI tools are still hot too. Almost two-thirds of respondents, 64%, were using ChatGPT when looking for information during design and development. And Google Gemini, Microsoft Copilot, and Meta AI weren’t far behind. Only 6% said that they did not use any such AI tools.

[From the ‘Embracing AI’ Avnet survey]
Morgan Stanley analyst Brian Nowak has forecast that 2025 capex for the hyperscalers will surpass $300 billion — almost double 2023 levels. The Economist noted that AI data center spending could eclipse $1.4 trillion by 2027. The technology is certainly diffusing into applications across industries. “We really do believe in the pervasiveness [of AI],” said Alex Iuorio, SVP, Global Supplier Development at Avnet.

Alex Iuorio
While the AI momentum continues, Iuorio acknowledges the cyclical nature of technology. “To think it’s going to last forever at these kinds of rates would be a little silly.” Yet he cautions against premature predictions of a slowdown, noting, “forecasting the slowdown when there don’t seem to be any signs of it…that’s an interesting position.”
Investors have conflicting views on the AI market. Goldman Sachs, for one, said in September that AI stocks weren’t in a bubble. There are, however, some potential warning signs. Nvidia’s price-to-sales ratio of around 30 and Palantir’s near 68, according to recent valuations, evoke memories of the dot-com era’s inflated multiples. Concerns about diminishing returns from larger AI models, along with regulatory hurdles in the EU in particular, add to the unease. Yet Iuorio sees continued demand growth, especially as AI expands to the industrial edge.
The rapid expansion and investment in AI technologies also recall the rapid dissemination of IoT years ago, which led to widespread security vulnerabilities that continue to plague networks today. Understandably, security and privacy worries top the list of AI project concerns, cited by 37.5% of survey respondents, followed by data quality at 30.6%.

[From the ‘Embracing AI’ Avnet survey]
The shift back to hardware
In the 80s, hardware design was differentiation. … Today, hardware’s commoditized—differentiation shifted to software. Now AI flips it back.
For decades, the tech industry placed a premium on software-centric innovation. In 2011, Marc Andreessen declared that “Software Is Eating the World.” But in the 1980s, hardware was cool. In the early years of the decade, Chicago was a thriving center of coin-operated gaming. Major companies like Midway (Pac-Man), Williams Electronics, Atari, and Rock-Ola were all based there or had large presences in the area, driving substantial demand for electronic components.
“It was an incredible time to be in hardware,” recalled Iuorio, who was then working in Greater Chicago. Engineers back then would run out of buildings—sometimes literally into the parking lot—to show off new logic boards that had no jumpers, a feat that improved manufacturability and speed. The sheer excitement around such design breakthroughs highlights how much hardware mattered. “You’d package up an enclosure, a hard drive, a display, and then the software made it an ATM. But early on, it was hardware that gave you an edge,” he said.
Organizations ranging from the Financial Times to Deloitte agree that the pendulum is swinging back. Today’s AI workloads—especially in generative models—require significant computational horsepower, specialized memory architectures, and advanced cooling and power management. As a result, hardware is once again the linchpin for differentiation.
Meanwhile, hyperscalers are buying GPUs in record quantities, snapping up advanced chipsets for data-center AI training. In January 2025, OpenAI—together with SoftBank, Oracle, and MGX—launched the Stargate Project, a commitment of up to $500 billion over four years to build AI data centers across the U.S. The first facility in Abilene, Texas, covers 875 acres (rivaling Central Park in size). Earlier, Elon Musk’s xAI venture announced the Colossus supercomputer—already boasting 100,000 NVIDIA H100 GPUs in Memphis—with plans to scale to one million GPUs by 2027.
The divide between hyperscalers and industrial AI adoption
“The market is going crazy if you’re associated with AI and the chips that support it, but from an industrial-market perspective, it’s not as exciting—yet.”
While consumer-facing AI dominates headlines—fueled by hyperscalers snapping up massive GPU clusters—many industrial players have been slower to climb aboard. The reasons vary, from cost constraints to complexities around retrofitting legacy systems. Yet Avnet’s survey points to a growing wave of industrial-edge applications. Nearly half of respondents (42%) see AI as key to process automation, with predictive maintenance (28%) and fault detection (27%) close behind—use cases that promise to boost efficiency without relying exclusively on public clouds. Meanwhile, industrial giants from Honeywell to Siemens have sizable AI initiatives and partnerships.
In the consumer-tech world, AI has morphed from novelty to necessity in record time. But in industrial settings—factories, energy grids, or even hospital equipment—adoption remains more conservative.
In addition to heightened regulatory requirements, many industrial companies maintain equipment that can be decades old, complicating the integration of AI models and edge-capable hardware. Iuorio notes that it’s “very timely” to run this survey and see how AI might apply to the design process. Interest in medical applications of AI is clearly building in everything from drug discovery to surgical robotics.
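To make the edge angle concrete, here is a minimal, illustrative sketch of the kind of lightweight anomaly check a predictive-maintenance use case might run locally on a factory gateway rather than in a public cloud. The sensor, window size, and threshold below are assumptions made for the example only—they are not details from the Avnet survey, and a real deployment would typically rely on trained models and vendor tooling.

```python
# Illustrative sketch only: rolling z-score anomaly detection for a vibration
# sensor, the sort of logic an edge gateway might run for predictive maintenance.
# WINDOW, Z_THRESHOLD, and the simulated sensor values are assumptions, not
# figures from the Avnet survey.
from collections import deque
from statistics import mean, stdev

WINDOW = 120        # keep the last 120 samples (e.g., ~2 minutes at 1 Hz) -- assumed
Z_THRESHOLD = 3.0   # flag readings more than 3 standard deviations from the mean -- assumed


class VibrationMonitor:
    """Holds a rolling window of readings and flags outliers on the device,
    so only alerts (not raw telemetry) need to leave the factory floor."""

    def __init__(self, window: int = WINDOW, threshold: float = Z_THRESHOLD):
        self.readings = deque(maxlen=window)
        self.threshold = threshold

    def update(self, value: float) -> bool:
        """Add a reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.readings) >= 30:  # wait for enough history before judging
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.readings.append(value)
        return anomalous


if __name__ == "__main__":
    import random

    monitor = VibrationMonitor()
    for step in range(300):
        reading = random.gauss(0.5, 0.05)  # simulated normal bearing vibration
        if step == 250:
            reading += 1.0                 # inject a fault-like spike
        if monitor.update(reading):
            print(f"step {step}: anomalous reading {reading:.3f} -- schedule inspection")
```

The design point echoes what respondents hint at: by keeping the check on the device, raw telemetry stays local and only exceptions travel upstream, which is why these use cases don’t have to depend exclusively on public clouds.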
New tech demands new skills
The Avnet report shows a broad distribution of required skills, with AI model optimization (16.5%), data analysis (16.2%), and problem-solving (15.8%) being nearly equally important. These skill sets aren’t randomly distributed, Iuorio explains, but rather represent “categorical” elements that are “going to be needed to successfully design and deploy” AI technologies.
Ultimately, while generative AI garners many of the headlines, Iuorio sees a far more profound impact once these algorithms step out of the data center and into the real world. As he puts it: “When it gets out to the edge, it gets exciting.”