Goodbye Moore's Law, Hello AI Acceleration

By Carl Ford, Partner, Crossfire Media

A guest piece on EETimes caught my eye, by Karthee Madasamy, Managing Partner at MFV Partners, titled “As Tech Reaches Compute Limits, Quantum Computing Must Work.”

Madasamy believes quantum computing is entering a phase in 2025 that mirrors where AI was roughly five years ago. Back then, AI seemed ages away, but “eventually, the hardware and computing power caught up with that progress, and everything aligned for takeoff. That same alignment is occurring now in quantum computing.”

He points out that Nvidia CEO Jensen Huang said in January that “practical quantum computing” was still 15 to 30 years away, though he has since admitted that his timeline was wrong.

PsiQuantum, in which Madasamy was an early investor, is already making millions of quantum computing chips. Based on his knowledge of the company, he believes we are two years out.

Besides PsiQuantum, Google announced a breakthrough last December with its Willow quantum chip, and in February both Microsoft and Amazon highlighted quantum chip developments of their own. All of these announcements point toward (eventually) upgrading their cloud services.

The truth is that we are close to the limits of Moore’s law, and advances in AI are going to tax the existing cloud infrastructure. To continue the momentum, we may need a law that is better than Moore’s law. I am not sure the paradigm should be based only on the physics of computing; it may be a combination of quantum and AI, perhaps “the law of acceleration.”

Virtually all our computing for the better part of a century has operated on a binary digital system, where everything exists in one of two states: either a zero or a one. From the first microcontrollers to the latest GPUs, this fundamental limitation remains.

Quantum computing breaks the binary digital system by allowing qubits to exist in multiple states simultaneously, to be both zero and one until measured. This approach exponentially increases information storage and computational capacity, creating a fundamentally different system rather than an incremental improvement.
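
To make that contrast concrete, here is a minimal sketch (mine, not from the article) that simulates a single qubit as a NumPy state vector: a classical bit holds exactly one of two values, while a qubit carries amplitudes for both at once, and describing n qubits classically takes 2^n amplitudes.

```python
import numpy as np

# A qubit is a 2-component complex state vector: amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ ket0  # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(superposed) ** 2)  # [0.5 0.5] -- both outcomes until measured

# Simulating n qubits classically takes 2**n amplitudes, which is the
# exponential capacity the paragraph above refers to.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2**n:,} amplitudes")
```

The last loop is the point: a 50-qubit state already needs about 10^15 amplitudes to describe classically, which is why a quantum processor is a different kind of machine rather than a faster version of the same one.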

Madasamy explains that the Centre for the Governance of AI has found that, over the past 13 years, the amount of compute used to train leading AI systems has increased by a factor of 350 million. As quantum computing arrives, he does not believe it will take over AI training entirely, but quantum processors will undoubtedly be able to take on some of the heavy computing load by accelerating bottlenecks, such as finding optimal weights in large neural networks. It will also unlock entirely new AI architectures. While many dismiss quantum computing’s impact on AI because they cannot immediately envision quantum-powered consumer applications, it will enable more advanced reasoning capabilities in AI systems, which will then translate to consumer applications we can’t yet imagine.
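
As a quick sanity check on that figure (my arithmetic, not Madasamy’s): a 350-million-fold increase over 13 years implies about 28 doublings, or one doubling roughly every five and a half months, far faster than the roughly two-year doubling associated with Moore’s law.

```python
import math

growth_factor = 350_000_000  # compute growth reported by the Centre for the Governance of AI
years = 13

doublings = math.log2(growth_factor)          # ~28.4 doublings
months_per_doubling = years * 12 / doublings  # ~5.5 months per doubling

print(f"{doublings:.1f} doublings, one every {months_per_doubling:.1f} months")
```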

Consider a parallel: when we were building 3G modems 25 years ago, sending data at 2 Mbps to phones with tiny 1.4-inch screens, nobody could anticipate services like Uber or live streaming on smartphones. First comes the capability; then come the applications. Quantum will initially address AI’s computational bottlenecks, but consumer applications, particularly those built on more sophisticated AI reasoning, will follow quickly.

The companies that succeed in moving quantum from physics to engineering will define the next computing era, much as Intel shaped the microprocessor age. We are witnessing the birth of an industry that will fundamentally reshape computing capabilities across every sector.

As with previous technological revolutions, those who recognize this inflection point early will help shape the quantum future rather than merely adapt to it. Quantum’s arrival is a question of when, not if; the transition is already underway, and its answers to our current computing, AI, and technological challenges will be profound.

I was impressed with Karthee Madasamy’s analysis and suggest reading the entire article.

I will add one more component to this mix: networks. AI is already taking advantage of edge computing, and applications exist that navigate the best place to process a workload, whether at the endpoint, at the edge, or in the cloud. As chips and neural networks advance, I can see AI becoming far more distributed, enabling even more human-like thought processing from the AI persona. What that law will look like, I have no idea. For all I know, it will be AI that creates it.
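
To illustrate the kind of navigation I mean, here is a toy placement heuristic (my sketch, with invented tiers and thresholds, not a description of any shipping product): pick the nearest tier that has enough compute and still fits the task’s latency budget.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    round_trip_ms: float  # typical network round trip to reach this tier
    compute_tops: float   # rough compute available, in TOPS (illustrative)

# Hypothetical tiers, ordered nearest-first; the numbers are invented.
TIERS = [
    Tier("endpoint", round_trip_ms=0.0, compute_tops=2.0),
    Tier("edge", round_trip_ms=10.0, compute_tops=50.0),
    Tier("cloud", round_trip_ms=80.0, compute_tops=1000.0),
]

def place(task_tops: float, latency_budget_ms: float) -> str:
    """Choose the nearest tier with enough compute that meets the latency budget."""
    for tier in TIERS:
        if tier.compute_tops >= task_tops and tier.round_trip_ms <= latency_budget_ms:
            return tier.name
    return "cloud"  # fall back to the largest compute pool

print(place(task_tops=1.0, latency_budget_ms=5.0))     # endpoint
print(place(task_tops=30.0, latency_budget_ms=20.0))   # edge
print(place(task_tops=300.0, latency_budget_ms=200.0)) # cloud
```

A real scheduler would weigh cost, privacy, and current load as well, but the shape of the decision, latency versus compute across tiers, is the same.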




Edited by Erik Linask