An investigation by SemiAnalysis revealed that OpenAI spends as much as $694,444 per day to keep the chatbot running. According to the firm's estimates, the system runs on 3,617 HGX A100 servers with a total of 28,936 GPUs, which works out to roughly 0.36 cents per query.
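For context, the reported figures hang together on simple arithmetic. The sketch below is purely illustrative: the daily cost, per-query cost, and hardware counts come from the SemiAnalysis estimates above, while the implied query volume is derived here rather than reported.

```python
# Back-of-envelope check of the reported figures (illustrative only).
daily_cost_usd = 694_444        # estimated daily compute spend
cost_per_query_usd = 0.0036     # "roughly 0.36 cents" per query
servers = 3_617                 # HGX A100 servers
gpus = 28_936                   # total A100 GPUs

# ~8 GPUs per server, consistent with an HGX A100 chassis
gpus_per_server = gpus / servers

# Derived, not reported: ~193 million queries per day at these rates
implied_queries_per_day = daily_cost_usd / cost_per_query_usd

print(f"GPUs per server: {gpus_per_server:.1f}")
print(f"Implied queries per day: {implied_queries_per_day:,.0f}")
```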
The current cost of running the programme could be considerably higher, according to Dylan Patel, Chief Analyst at SemiAnalysis, because his estimates were based on the older GPT-3 model. GPT-4, which OpenAI has already made available to paying customers, is probably even more expensive to run.
According to OpenAI, the newer model gives more accurate answers and offers stronger safeguards against the off-course responses that were a problem with GPT-3/3.5. The system's power-hungry specialized chips are one of the main drivers of the high costs.
As a solution, Microsoft, one of OpenAI's largest shareholders, is rumored to be developing its own artificial intelligence (AI) processor, code-named Athena, which could replace the NVIDIA GPUs entirely and significantly lower ChatGPT's operating expenses.
ChatGPT can also create working code from scratch, sparking concerns that it might eventually overtake programmers. However, new research by computer scientists Raphael Khoury, Anderson Avila, Jacob Brunelle, and Baba Mamadou Camara suggests that the code the chatbot produces may not be very secure.