Is OpenAI's ChatGPT Too Expensive to Run?

Published On Sat May 13 2023

ChatGPT's Potential Cost Per Day to Operate

ChatGPT, OpenAI's chatbot, has been making headlines over what it costs to operate. According to Dylan Patel, chief analyst at the semiconductor research firm SemiAnalysis, the chatbot could cost up to $700,000 a day to run, largely because of the expensive servers its AI models require. And while training ChatGPT's large language models already runs into the tens of millions of dollars, operational expenses far surpass training costs once the model is deployed at scale. In fact, Patel and a second SemiAnalysis researcher, Afzal Ahmad, say ChatGPT's inference costs exceed its training costs on a weekly basis.
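To put that figure in perspective, here is a minimal back-of-envelope sketch. It assumes the $700,000-per-day estimate holds and that usage stays flat; the numbers are illustrative only, not OpenAI's actual accounting.

```python
# Rough, illustrative arithmetic based on SemiAnalysis's public estimate.
# Assumption: a flat $700,000/day operating cost; real costs vary with traffic.
DAILY_COST_USD = 700_000

weekly_cost = DAILY_COST_USD * 7    # ~$4.9 million per week
annual_cost = DAILY_COST_USD * 365  # ~$255.5 million per year

print(f"Weekly inference cost: ${weekly_cost:,}")
print(f"Annual inference cost: ${annual_cost:,}")

# At this run rate, a one-time training bill in the tens of millions of
# dollars is overtaken by ongoing inference spending within weeks.
```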

Companies that build on OpenAI's language models have been paying a high price for years. Nick Walton, CEO of Latitude, the developer of the AI Dungeon game, revealed that running the model, together with the Amazon Web Services servers needed to answer millions of user queries, cost the company $200,000 a month in 2021. To reduce expenses, Walton switched to language software backed by AI21 Labs, which cut his company's AI costs in half, to $100,000 a month. A rough per-query estimate of what that means is sketched below.
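For a sense of scale, the sketch below divides those reported monthly bills by a hypothetical query volume. The 10-million-queries-per-month figure is an assumption chosen purely for illustration; Latitude has not published its exact traffic.

```python
# Hypothetical per-query cost estimate for an AI-backed game such as AI Dungeon.
# MONTHLY_QUERIES is an assumed figure for illustration only; the monthly cost
# figures are the ones reported in the article.
MONTHLY_QUERIES = 10_000_000

cost_before = 200_000  # $/month on OpenAI's model plus AWS in 2021 (reported)
cost_after = 100_000   # $/month after switching to an AI21 Labs-backed provider

print(f"Before: ${cost_before / MONTHLY_QUERIES:.3f} per query")  # ~$0.020
print(f"After:  ${cost_after / MONTHLY_QUERIES:.3f} per query")   # ~$0.010
```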

In an effort to bring down the cost of running generative AI models, Microsoft has reportedly been developing an AI chip since 2019. According to sources familiar with the matter, the chip, called Athena, could be made available for internal use by Microsoft and OpenAI as early as next year. Microsoft has acknowledged that it fell behind Google and Amazon in building its own in-house chips and has been looking for cheaper alternatives to the Nvidia chips its AI models currently run on. More than 300 Microsoft employees are now reportedly working on the chip.

In conclusion, OpenAI's ChatGPT is one of the most powerful technologies available today, and its popularity is growing rapidly. While the expensive servers required to run the AI make it costly to operate, it helps people find answers to their questions. Moving to Microsoft's own chips, however, carries real risk, given how far the company has fallen behind its competitors in chip development. Companies that want to take advantage of ChatGPT's power should therefore weigh their options carefully.