OpenAI’s GPT-5 development faces significant delays and cost issues

OpenAI’s next major project, GPT-5 (code-named Orion), is experiencing substantial setbacks and escalating costs, according to a Wall Street Journal report by Deepa Seetharaman. The project, which has been in development for over 18 months, has encountered multiple challenges during training runs, each costing approximately half a billion dollars in computing expenses alone.

The company’s attempts to create a more advanced successor to GPT-4 have been hampered by data limitations and technical difficulties. Microsoft, OpenAI’s largest investor, had initially expected to see the new model by mid-2024. However, at least two large-scale training runs have failed to meet researchers’ expectations.

A key challenge is the scarcity of high-quality training data. The public internet no longer provides sufficient diverse, high-quality content needed for significant AI improvements. In response, OpenAI has begun creating custom training data, hiring experts to write code and solve complex problems while explaining their reasoning process.

The project’s costs are unprecedented, with each six-month training run requiring massive computing resources. The company is utilizing thousands of Nvidia chips in data centers, pushing expenses far beyond the $100 million spent on training GPT-4.

Internal turbulence has further complicated development: the company lost more than two dozen key executives and researchers in 2024, including Chief Scientist Ilya Sutskever and, more recently, veteran researcher Alec Radford.

OpenAI is now exploring new approaches, including reasoning capabilities and synthetic data generation, to overcome these challenges. CEO Sam Altman recently announced plans for the company's new reasoning model, "o3," but has not provided a specific timeline for GPT-5's release.
