According to a report by The Wall Street Journal, OpenAI’s development of its next-generation language model, GPT-5, is experiencing delays, with results thus far failing to justify the substantial investments made.
This echoes earlier reporting suggesting that OpenAI is looking to explore new approaches. While it covers the same ground as that reporting, the WSJ story adds new details about the 18-month development of GPT-5, code-named Orion.
OpenAI has reportedly completed at least two large training runs, each aimed at improving the model by feeding it enormous quantities of data. An initial training run went more slowly than expected, a sign that a larger run would be both time-consuming and costly. And while GPT-5 can reportedly perform better than its predecessors, it has yet to improve enough to justify the substantial cost of keeping the model running.
The Wall Street Journal also reports that, rather than relying solely on public data and licensing agreements, OpenAI has hired people to create fresh data by writing code and solving math problems. It is also using synthetic data produced by another of its models, o1.
OpenAI did not immediately respond to a request for comment. The company previously addressed the matter last year.