OpenAI’s Assistants API is dead. The company announced its deprecation in August 2025 and set a hard sunset date of August 26, 2026. The replacement stack, the Responses API and Conversations API, ships with native support for the Model Context Protocol (MCP), the open standard Anthropic created in November 2024. When the company that built the most prominent proprietary agent API abandons it in favor of a competitor’s open protocol, the standard war is over. MCP won.
This is not just a deprecation notice. It is the clearest signal yet that the agent ecosystem has consolidated around a single interoperability layer: MCP for tool integration, with the A2A (Agent2Agent) protocol handling agent-to-agent communication. If you are building agents today, the strategic uncertainty about which protocol to bet on has evaporated.
What Exactly Is Going Away
The Assistants API launched in November 2023 as OpenAI’s answer to stateful, multi-turn agent workflows. It bundled threads, messages, runs, and built-in tools (code interpreter, file search, function calling) into a single managed service. You created an assistant, gave it instructions, and OpenAI handled the conversation state server-side.
The problem was architectural. The Assistants API was:
- Proprietary and locked in. Your assistant’s state lived on OpenAI’s servers. Switching to another model provider meant rewriting your entire agent layer.
- Session-based and stateful. Every interaction required maintaining a thread object. That made the API harder to scale and more expensive to operate than stateless alternatives.
- Isolated from the ecosystem. Assistants could call OpenAI’s built-in tools but had no standard way to connect to the thousands of tools and services the broader community was building.
The Responses API, positioned as the successor to the older Chat Completions endpoint for agent workloads, fixes all three problems. It is stateless by default (with optional state via the Conversations API), supports remote MCP servers natively, and includes built-in tools like web search, file search, code interpreter, and computer use, all accessible through a single API call.
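As a sketch of what a Responses API request looks like, the payload below mixes a remote MCP server with a built-in tool in one call. The field names mirror OpenAI's published examples, but treat them as illustrative; the model name, server label, and server URL are hypothetical, and you should check the current API reference before relying on the exact tool-type strings.

```python
# Illustrative Responses API request, built as a plain dict so the shape
# is easy to inspect. With the official SDK this would be sent via
# client.responses.create(**request); note there are no thread or run objects.
request = {
    "model": "gpt-4.1",  # hypothetical model name for illustration
    "input": "Summarize the open issues in our tracker.",
    "tools": [
        # Built-in tool: nothing to host, OpenAI executes it server-side.
        {"type": "web_search"},
        # Remote MCP server: any service speaking the protocol plugs in here.
        {
            "type": "mcp",
            "server_label": "issue_tracker",               # hypothetical label
            "server_url": "https://mcp.example.com/sse",   # hypothetical URL
            "require_approval": "never",
        },
    ],
}

print([tool["type"] for tool in request["tools"]])
```

The point of the shape is that built-in tools and third-party MCP servers sit side by side in the same `tools` array; the calling code does not distinguish between them.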
The Migration Timeline
| Date | Event |
|---|---|
| November 2023 | Assistants API launches |
| March 2025 | Responses API launches; OpenAI adopts MCP |
| August 2025 | Assistants API officially deprecated |
| August 26, 2026 | Hard sunset: all Assistants API endpoints stop working |
Developers have roughly five months left. OpenAI published a migration guide covering the key differences, but the migration is not trivial. Thread-based state management, run polling, and assistant-level configuration all work differently in the Responses model.
Why OpenAI Abandoned Its Own Approach
OpenAI did not deprecate the Assistants API because it was broken. It deprecated the API because MCP made it irrelevant.
By the time OpenAI adopted MCP in March 2025, the protocol had already reached escape velocity. Sam Altman acknowledged it directly: “People love MCP and we are excited to add support across our products.” That is not corporate politeness. That is a concession.
The numbers explain why:
- 10,000+ public MCP servers listed in the registry, covering everything from developer tools to Fortune 500 enterprise systems
- 97 million monthly SDK downloads across Python and TypeScript
- Universal adoption: ChatGPT, Cursor, Gemini, Microsoft Copilot, VS Code, and dozens of other tools all support MCP natively
Network effects killed the Assistants API more than any technical shortcoming. Once every major AI product supported MCP, building a proprietary alternative meant asking developers to maintain two integration layers. Nobody wanted to do that.
The AAIF Factor
The creation of the Agentic AI Foundation (AAIF) in December 2025 removed the last credible objection to MCP adoption. With Anthropic donating MCP to a neutral body under the Linux Foundation, and OpenAI itself joining as a co-founder alongside Block, the “controlled by a competitor” argument disappeared.
The AAIF now governs MCP, AGENTS.md (OpenAI’s own contribution), and Goose (Block’s agent framework) with 49 member organizations. A steering committee with multi-company representation guides the technical roadmap. No single vendor controls MCP’s future, which is exactly what enterprise buyers needed to hear before standardizing on it.
What This Means for Agent Builders
The practical implications depend on where you are in the stack.
If You Are on the Assistants API
You have until August 26, 2026. The migration path goes to the Responses API (for single-turn and multi-step agent tasks) and the Conversations API (for multi-turn stateful interactions). Key differences:
- State management: Assistants used server-side threads. Responses is stateless; you manage context yourself or use Conversations for persistence.
- Tool calling: Assistants had built-in tools with proprietary interfaces. Responses supports MCP servers directly, plus built-in tools (web search, code interpreter, file search, computer use).
- Multi-tool execution: The Responses API can call multiple tools within a single request, something Assistants handled through sequential run steps.
OpenAI’s migration guide covers the API mapping. The biggest conceptual shift is from the thread/run/message model to a simpler input/output model with optional state.
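A minimal sketch of that conceptual shift, assuming you manage context yourself rather than using the Conversations API: the Assistants-era flow (create thread, add message, create run, poll) collapses into building an input list, sending it, and appending the output. The helper names here are hypothetical, not part of any SDK.

```python
# The client-side history list replaces the server-side thread object.
history: list[dict] = []

def build_input(user_message: str) -> list[dict]:
    """Prior turns plus the new user message become the request input."""
    return history + [{"role": "user", "content": user_message}]

def record_turn(user_message: str, model_output: str) -> None:
    """After each (hypothetical) responses.create call, persist both sides."""
    history.append({"role": "user", "content": user_message})
    history.append({"role": "assistant", "content": model_output})

payload = build_input("What changed in the migration?")
record_turn("What changed in the migration?",
            "Threads are gone; you pass context in yourself.")
print(len(history))  # 2: one user turn, one assistant turn
```

There is no run object to poll and no thread to clean up; statelessness means the cost of a conversation is exactly the context you choose to resend.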
If You Are Building New
Start with MCP. There is no reason to evaluate alternatives.
Every major framework supports it: the OpenAI Agents SDK, LangChain, CrewAI, AutoGen, and Mastra all have MCP integrations. Your agent connects to any MCP server, whether it wraps a GitHub repo, a Postgres database, a Slack workspace, or a custom internal API, through a single, standardized interface.
The practical workflow:
- Pick your agent framework
- Find or build MCP servers for your tools (the MCP server registry has 10,000+ options)
- Wire them together using your framework’s MCP client
- Your agent works with any LLM that supports MCP, not just OpenAI
If You Are an Enterprise Architect
This is the green light to standardize. The “wait and see which protocol wins” holding pattern that many enterprise teams adopted in 2025 has a clear answer now. MCP is the tool integration standard. A2A is the agent-to-agent standard. Both are under neutral governance at the Linux Foundation.
Build your internal MCP servers for proprietary systems. Expose your internal APIs through MCP. Train your teams on MCP server development. The protocol is not going to be superseded, because the entity that would benefit most from replacing it (OpenAI) just threw its own alternative away.
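The official SDKs give you higher-level helpers for this (typically decorator-based tool registration), but the stdlib-only sketch below illustrates what an internal MCP server fundamentally does: advertise its tools, then dispatch `tools/call` requests to handlers. Every name here (the registry, the decorator, the example tool) is hypothetical.

```python
import json

# Registry mapping tool names to handlers. The official SDKs wrap this
# in nicer machinery; a dict is enough to show the idea.
TOOLS: dict = {}

def tool(name: str, description: str):
    """Register a function as an MCP tool (hypothetical helper)."""
    def register(fn):
        TOOLS[name] = {"description": description, "handler": fn}
        return fn
    return register

@tool("lookup_employee", "Look up an employee record by id (hypothetical)")
def lookup_employee(arguments: dict) -> dict:
    return {"id": arguments["id"], "name": "Ada Example"}

def handle(request: dict) -> dict:
    """Dispatch the two MCP methods a minimal internal server needs."""
    if request["method"] == "tools/list":
        result = {"tools": [{"name": n, "description": t["description"]}
                            for n, t in TOOLS.items()]}
    elif request["method"] == "tools/call":
        params = request["params"]
        result = TOOLS[params["name"]]["handler"](params["arguments"])
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

resp = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/call",
               "params": {"name": "lookup_employee", "arguments": {"id": 7}}})
print(resp["result"]["name"])  # Ada Example
```

Wrap a legacy system once behind an interface like this and every MCP-speaking agent in the company, regardless of framework or model vendor, can use it.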
What Comes Next for MCP
The 2026 MCP roadmap outlines several areas where the protocol is still maturing:
- Asynchronous operations: Long-running tasks currently require workarounds. The roadmap includes native async support.
- Statelessness improvements: Better support for purely stateless server deployments, important for serverless and edge environments.
- Additional SDKs: Official support expanding beyond Python and TypeScript to Java, Go, Rust, and C#.
- Security hardening: OAuth 2.1 integration, scoped permissions, and audit logging for enterprise deployments.
The deprecation of the Assistants API also accelerates a secondary trend: the convergence of agent frameworks around a common tool layer. When every framework speaks MCP, the differentiation shifts from “which tools can I connect to” toward orchestration logic, reliability, and developer experience. That is a healthier competitive dynamic for the ecosystem.
The interoperability debate consumed enormous energy in 2024 and 2025. Teams delayed projects waiting for a winner. Vendors hedged by supporting multiple protocols. Conference talks debated which standard would prevail. All of that is done now. The standard is MCP, the governance is neutral, and the biggest holdout just signed the surrender.
Build accordingly.
Frequently Asked Questions
When does the OpenAI Assistants API shut down?
The OpenAI Assistants API has a hard sunset date of August 26, 2026. After that date, all Assistants API endpoints will stop working. OpenAI deprecated the API in August 2025 and recommends migrating to the Responses API and Conversations API.
What replaces the OpenAI Assistants API?
The Responses API replaces the Assistants API for agent workloads. It supports native MCP (Model Context Protocol) servers, built-in tools like web search and code interpreter, and multi-tool execution in a single request. For stateful multi-turn conversations, the Conversations API provides optional persistence.
Why did OpenAI adopt MCP instead of keeping its own API?
MCP achieved overwhelming ecosystem adoption with 10,000+ public servers and 97 million monthly SDK downloads. Every major AI product, including ChatGPT, Cursor, Gemini, and Microsoft Copilot, supports MCP. Maintaining a proprietary alternative meant asking developers to build two integration layers, which nobody wanted to do.
Is MCP controlled by Anthropic?
No. Anthropic donated MCP to the Agentic AI Foundation (AAIF) under the Linux Foundation in December 2025. The AAIF is co-founded by Anthropic, Block, and OpenAI, with 49 member organizations including AWS, Google, Microsoft, and SAP. A multi-company steering committee governs the protocol’s technical roadmap.
Should I start new projects with MCP or wait for something better?
Start with MCP now. The protocol war is over. Every major agent framework (OpenAI Agents SDK, LangChain, CrewAI, AutoGen, Mastra) supports MCP. The ecosystem has 10,000+ servers, neutral governance, and the company most motivated to replace it just abandoned its own alternative.
