OpenAI announced two new open-weight AI language models — GPT-OSS-120B and GPT-OSS-20B — on Tuesday, August 5. These new text-only models mark the company’s first open-weight release since GPT-2 in 2019, as reported by TechCrunch.
gpt-oss-120b matches OpenAI o4-mini on core benchmarks and exceeds it in narrow domains like competitive math or health-related questions, all while fitting on a single 80GB GPU (or high-end laptop).
gpt-oss-20b fits on devices as small as 16GB, while matching or exceeding… pic.twitter.com/Zn2wDiWcNb
— OpenAI (@OpenAI) August 5, 2025
According to the company, the models are designed as lower-cost options that developers and researchers can easily run and customize.
Lower-Cost Alternatives to Proprietary Models
OpenAI’s move follows a broader industry trend, with Meta, Microsoft-backed Mistral AI, and Chinese startup DeepSeek releasing open-weight models over the past few years.
Per CNBC:
“An AI model is considered open weight when all its parameters — used during training to improve prediction and output — are publicly available.”
These models differ from fully open-source software, where the complete source code is released. Open-weight releases publish the trained parameters but typically not the training data or code, giving developers a degree of transparency and control without the full open-source approach.
Greg Brockman on Open Ecosystem Development
OpenAI President Greg Brockman highlighted the significance of the announcement:
“…we are excited to contribute to that and really push the frontier and then see what happens from there,” he told TechCrunch.
Features and Capabilities of the New Models
The two newly released models offer scalable performance:
- GPT-OSS-120B: A large model that can run on a single 80GB Nvidia GPU
- GPT-OSS-20B: A lighter model that runs on consumer laptops with 16GB of memory
Both models are now freely available for download on Hugging Face.
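For developers who want to try the checkpoints, the usual route is the Hugging Face transformers library. The sketch below is a minimal example of that workflow; the repo ID "openai/gpt-oss-20b" and the generation settings are assumptions based on the announcement naming, so check the model card on Hugging Face before running it.

```python
# Minimal sketch: loading the smaller open-weight model with Hugging Face transformers.
# The repo ID "openai/gpt-oss-20b" is assumed from the announcement naming; verify the
# exact identifier on the Hugging Face hub before running.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # assumed repo ID
    device_map="auto",           # place weights on a GPU if one is available
)

result = generator(
    "Summarize the difference between open-weight and open-source models.",
    max_new_tokens=128,
)
print(result[0]["generated_text"])
```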
Collaboration With Chip Makers
OpenAI partnered with leading chip manufacturers for this release, including:
- Nvidia
- AMD
- Cerebras
- Groq
Nvidia CEO Jensen Huang praised the collaboration, stating:
“OpenAI has shown what could be built on Nvidia AI — and now they’re advancing innovation in open-source software.”
Fallback to Cloud-Based Closed Models
During a press briefing, OpenAI emphasized that the open models can send complex queries to cloud-based AI models when needed, allowing developers to bridge the gap between open and closed models.
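OpenAI did not detail how that handoff works, so the sketch below is only a hypothetical illustration of the general pattern: answer routine queries on-device with the open-weight model and escalate harder ones to a hosted closed model through the OpenAI API. The complexity heuristic, model names, and thresholds are all assumptions made for illustration.

```python
# Hypothetical routing sketch (not OpenAI's actual mechanism): answer simple queries
# locally with an open-weight model and escalate complex ones to a hosted closed model.
from openai import OpenAI
from transformers import pipeline

local_model = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # assumed repo ID for the local open-weight model
    device_map="auto",
)
cloud_client = OpenAI()  # reads OPENAI_API_KEY from the environment

def is_complex(query: str) -> bool:
    # Placeholder heuristic: treat long or multi-part questions as "complex".
    return len(query) > 500 or query.count("?") > 2

def answer(query: str) -> str:
    if is_complex(query):
        # Escalate to a cloud-hosted closed model.
        response = cloud_client.chat.completions.create(
            model="gpt-4o",  # assumed hosted model for the example
            messages=[{"role": "user", "content": query}],
        )
        return response.choices[0].message.content
    # Otherwise answer on-device with the open-weight model.
    output = local_model(query, max_new_tokens=256)
    return output[0]["generated_text"]

print(answer("What is an open-weight model?"))
```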
Model Performance Results
The models were benchmarked on Codeforces, a competitive programming test, with the following ratings:
- GPT-OSS-120B: 2622
- GPT-OSS-20B: 2516
Both models outperformed DeepSeek's R1 on this benchmark, underscoring their competitive coding capabilities.
Launch Delays Due to Safety Testing
The release came after multiple delays. Last month, OpenAI CEO Sam Altman explained on X that the company needed more time to run additional safety tests and review high-risk areas.