OpenAI Unleashes "Revolutionary" AI Large Models as Open Source: Will the AI Landscape Undergo a Paradigm Shift?

08/14 2025

Produced by | Dali Finance

In the age of AI, OpenAI stands as a pillar in the field, with its advancements consistently sparking a global AI frenzy.

In August 2025, OpenAI made a groundbreaking announcement: the release of two "revolutionary" models, gpt-oss-20b and gpt-oss-120b, as open source.

This marks the first time since the launch of GPT-2 in 2019 that OpenAI has opened up the weights of its core language models to the global developer community, signaling a strategic shift in the AI landscape.

Both models employ the Transformer architecture and integrate "Mixture-of-Experts (MoE)" technology, capable of handling context lengths of up to 128,000 tokens.

gpt-oss-120b boasts 117 billion parameters, with approximately 5.1 billion parameters active per token, making it ideal for high-inference-demand scenarios and runnable on a single GPU with 80GB of VRAM.

Conversely, gpt-oss-20b features 21 billion parameters, with around 3.6 billion parameters active per token, suited for low-latency applications and edge devices, and runnable on devices with just 16GB of memory.
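A back-of-the-envelope calculation shows why these hardware figures are plausible. The sketch below assumes, as reported for the gpt-oss release, roughly 4.25-bit MXFP4 quantization of the MoE weights; the resulting byte counts are illustrative estimates, not official numbers.

```python
# Rough weight-storage estimate for the two gpt-oss models,
# assuming ~4.25 bits per parameter (MXFP4 quantization of MoE weights).
# These are back-of-the-envelope figures, not official specifications.

def approx_weight_gb(total_params_billion: float, bits_per_param: float = 4.25) -> float:
    """Approximate weight storage in GB for a model of the given size."""
    total_bytes = total_params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 1e9

# gpt-oss-120b: ~117B total parameters -> roughly 62 GB of weights,
# which comfortably fits a single 80GB GPU.
print(f"gpt-oss-120b: ~{approx_weight_gb(117):.0f} GB")

# gpt-oss-20b: ~21B total parameters -> roughly 11 GB of weights,
# which fits devices with 16GB of memory.
print(f"gpt-oss-20b:  ~{approx_weight_gb(21):.0f} GB")
```

The MoE design matters here at inference time rather than storage time: all parameters must be held in memory, but only the active subset (~5.1B or ~3.6B per token) participates in each forward pass, which is what keeps latency and compute cost low.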

These open-source models have excelled in numerous benchmark tests. In challenges such as Codeforces programming competitions, MMLU (Massive Multitask Language Understanding), and HealthBench, gpt-oss-120b has performed on par with OpenAI's o4-mini model and even surpassed it in health-related queries. Meanwhile, gpt-oss-20b has outperformed OpenAI's o3-mini model in math competitions (like AIME 2024 and 2025) and health-related tasks.

Moreover, the open-source models support Chain-of-Thought (CoT) reasoning, allowing developers to adjust the reasoning effort level (low, medium, high) based on task complexity. The reasoning chains generated by these models are publicly available, facilitating auditing and analysis by developers.
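As a concrete illustration of the adjustable reasoning effort, the sketch below builds a request for a gpt-oss model served behind an OpenAI-compatible chat endpoint. In the gpt-oss prompt format, the effort level is conveyed via the system prompt (e.g. "Reasoning: high"); the model name and payload shape here are illustrative assumptions, not a definitive client implementation.

```python
# Sketch: selecting a reasoning-effort level for a gpt-oss model behind an
# OpenAI-compatible chat-completions endpoint. The model name is a placeholder;
# the "Reasoning: <level>" system-prompt convention follows the gpt-oss format.

def build_request(question: str, effort: str = "medium") -> dict:
    """Build a chat-completions payload with the requested reasoning effort."""
    if effort not in {"low", "medium", "high"}:
        raise ValueError(f"unknown effort level: {effort!r}")
    return {
        "model": "gpt-oss-20b",  # placeholder model identifier
        "messages": [
            # gpt-oss reads its effort level from the system prompt.
            {"role": "system", "content": f"Reasoning: {effort}"},
            {"role": "user", "content": question},
        ],
    }

payload = build_request("Prove that the sum of two even numbers is even.", "high")
print(payload["messages"][0]["content"])  # → Reasoning: high
```

A low effort level trades reasoning depth for latency, which suits the edge-device scenarios described above; high effort spends more tokens on the chain of thought for harder tasks.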

OpenAI's release of the gpt-oss series marks a strategic transformation, taking a significant step towards greater openness and transparency in AI.

01 From "Closed" to "Open": The Rationale Behind OpenAI's Turnaround

OpenAI's shift represents a strategic adjustment amidst intense competition.

In recent years, from Meta's Llama series to Mistral AI, and China's DeepSeek sparking a fresh wave of open-source models, these high-performance open-source models have been steadily eroding OpenAI's market share and winning over numerous developers and SMEs.

Faced with this scenario, if OpenAI continued to adhere to its closed "walled garden" approach, it risked becoming increasingly isolated in the open-source tide.

Therefore, selectively and strategically embracing open source has become crucial for maintaining industry leadership and balancing the competitive landscape.

By open-sourcing, OpenAI is declaring to the global developer and enterprise community that it remains their best choice, whether for closed-source top models or open-source high-performance models.

Furthermore, open-sourcing enables global developers to participate in model optimization and innovation, accelerating the cutting-edge exploration of AI technology.

02 The Potency of This Open-Source "Nuclear Bomb"

The two models OpenAI has released this time are a genuine offering, far from being table scraps.

The larger gpt-oss-120b outperforms OpenAI's own closed-source mini model o3-mini in solving general problems and programming competitions, reaching a level comparable to the more powerful o4-mini.

The lighter gpt-oss-20b can also match o3-mini in many routine tests and even surpass it in specialized fields like mathematics and health.

The 120b model runs efficiently on a single mainstream GPU with 80GB of VRAM, while the 20b model fits on high-end laptops and other consumer devices with just 16GB of memory.

Powerful AI capabilities are no longer the exclusive domain of a few giants with expensive computing clusters.

Behind these impressive performances lies OpenAI's profound technical expertise.

Whether it's borrowing reinforcement learning techniques from its cutting-edge internal models or adopting a Mixture-of-Experts (MoE) architecture to reduce operating costs, OpenAI showcases its leadership in model development.

Dali Finance believes that OpenAI's open-sourcing this time is not aimed at top competitors like Google and Anthropic that follow a closed-source approach but rather at "new forces" attempting to challenge its position through open source.

By entering the open-source arena with "revolutionary" products, OpenAI will sharply squeeze the room left for other open-source models to compete.

It's conveying a clear message to all developers and enterprises: there's no need to struggle with various open-source models of varying quality anymore; here is the official, most potent "free lunch".

By open-sourcing, OpenAI further solidifies its closed-source commercial empire. In the future, AI enterprises will compete not only on technology but also on the completeness and openness of their ecosystems.

As global developers start building their applications and products based on OpenAI's open-source models, they become more deeply integrated into OpenAI's technology system and ecosystem.

In the future, when they require more powerful, enterprise-level AI capabilities, upgrading to OpenAI's paid closed-source models will be the most natural and logical choice.

03 Industry Shockwaves: A New Competitive Landscape

OpenAI's move is a boon for developers and SMEs.

Powerful, free underlying technical tools will significantly lower the entry barrier and stimulate unprecedented innovation.

For the entire industry's innovation ecosystem, open source will foster broader industry-academia-research collaboration and spark deeper discussions on AI safety and ethics.

For OpenAI itself, open sourcing also means assuming greater responsibility.

With OpenAI's entry, the open-source AI battle has officially escalated into a "clash of the titans".

For Chinese AI enterprises, including major players like ByteDance and Alibaba, "national team" members like Jieyue Xingchen and Zhipu AI, and "special forces" like DeepSeek, OpenAI's open sourcing means greater international competitive pressure.

On one hand, they need to continuously enhance model performance to compete with OpenAI's open-source models; on the other hand, they must strengthen ecosystem construction to boost their attractiveness and competitiveness.

A new era of AI that is more open, inclusive, and "intensely competitive" has dawned!

Will OpenAI's open sourcing put an end to the chaos of the "hundred-model war", or ignite an even fiercer clash of the titans? Share your thoughts in the comments section.

Disclaimer: the copyright of this article belongs to the original author. It is reprinted solely to share information more widely. If any author information is marked incorrectly, please contact us promptly so we can correct or remove it. Thank you.