The next challenge in Generative AI is hardware efficiency

One of the challenges faced by companies offering AI services is covering the huge cost of running them. Enrique Dans, for example, notes that running ChatGPT costs OpenAI around $700,000 a day, which works out to roughly $21 million a month. Spending at that level won't be easy to offset even with massive adoption of the $20-per-month paid tier.

One possibility is that AI development and adoption slow down. The other is that we see very interesting advances in the efficiency of the hardware running AI models. While Microsoft is already known to be developing its own chips, it is really NVIDIA that has the advantage: years of knowledge and experience in chip design to draw on for significant improvements, and every incentive to keep its crown as the leading AI chip provider.

