Rumor

Codex 5.3 to be the first model trained on Blackwell architecture

Tuesday, February 10, 2026 at 03:59 PM

Codex 5.3, scheduled for release in late February, is reported to be the first model trained on Nvidia's Blackwell architecture, which launched in early 2025. Separately, DeepSeek is noted for having built an independent reinforcement learning and data pipeline over a ten-month period.

Context

The reported release of Codex 5.3 in late February 2026 would mark a pivotal moment for the semiconductor and AI sectors: the first frontier model trained entirely on Nvidia's Blackwell architecture. Although the Blackwell hardware platform launched in early 2025, the roughly one-year lag underscores the time required for global data center buildouts to reach the scale necessary for pre-training flagship models. The development would validate the infrastructure-heavy roadmap for Nvidia's GB200 NVL72 systems, which provide the specialized compute and memory throughput required for the next generation of agentic AI. The model is also notable as a reportedly "self-improving" system, said to have assisted in debugging its own training runs and managing its own GPU cluster deployment. This level of vertical integration between hardware and software creates a widening moat for incumbents: currently, only one other lab, DeepSeek, has built a truly independent reinforcement learning and data pipeline, a feat that took roughly ten months. For investors, the Codex 5.3 launch would signal that while Blackwell hardware has been available, the software maturity needed to fully utilize it is only now arriving.

Related Companies

Nvidia
NVDA
US