Mastra, the TypeScript AI agent framework built by the team behind Gatsby, reached 300,000 weekly npm downloads and 22,000 GitHub stars before its stable 1.0 release. It raised $13 million in seed funding led by Y Combinator, making it the third-fastest-growing JavaScript framework ever measured by npm data. If you write TypeScript and have been watching the AI agent space from the sidelines, the sidelines just moved.
Python dominated the first generation of agent frameworks: LangGraph, CrewAI, AutoGen. That made sense when the core task was chaining LLM calls in notebooks. But agents in 2026 are shipping as products, not prototypes. They need real-time UIs, streaming responses, authentication flows, and deployment pipelines. That is web infrastructure, and web infrastructure speaks JavaScript.
The TypeScript AI Agent Ecosystem in March 2026
Three frameworks define the TypeScript agent space right now, each solving a different problem.
Mastra is the full-stack framework. It gives you agents, workflows, RAG, memory, evals, and a local development playground called Mastra Studio, all in one package. Think of it as the Next.js of AI agents: opinionated defaults, batteries included, designed for developers who want to ship fast without stitching together five libraries.
Vercel AI SDK is the integration layer. With 2.8 million weekly npm downloads, it dominates the React/Next.js ecosystem. Its strength is streaming UI components, structured generation, and multi-provider support (25+ LLM providers through one interface). If you are building an AI feature inside an existing Next.js app, this is the default choice.
LangGraph.js brings LangChain’s graph-based orchestration to TypeScript. You define agent logic as nodes and edges, which makes complex workflows with branching, loops, and conditional logic explicit and auditable. It has 529,000 weekly npm downloads and full feature parity with its Python sibling.
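The node-and-edge model is easy to make concrete without any framework: each node is a function over shared state, and each edge (possibly conditional) picks the next node. The sketch below is an illustrative toy, not LangGraph.js's actual API:

```typescript
// Toy graph executor: nodes transform shared state, edges choose the next node.
type State = { draft: string; approved: boolean };
type GraphNode = (s: State) => State;

const nodes: Record<string, GraphNode> = {
  write: (s) => ({ ...s, draft: s.draft + " revised" }),
  review: (s) => ({ ...s, approved: s.draft.includes("revised") }),
};

// Conditional edges: return the next node's name, or null to stop.
const edges: Record<string, (s: State) => string | null> = {
  write: () => "review",
  review: (s) => (s.approved ? null : "write"), // loop back until approved
};

function runGraph(start: string, state: State): State {
  let current: string | null = start;
  while (current !== null) {
    state = nodes[current](state);
    current = edges[current](state);
  }
  return state;
}

const finalState = runGraph("write", { draft: "v1", approved: false });
// finalState.approved === true after one write → review pass
```

The payoff of this style is that branching and loops are data (the `edges` table), so the control flow of an agent can be inspected and audited rather than buried in imperative code.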
Two other frameworks worth tracking: AgentKit from Inngest, which focuses on durable, background-running agent workflows, and the OpenAI Agents SDK, which recently added TypeScript support alongside its Python core.
Why Not Just Use Python?
Fair question. Python still has the larger ecosystem for ML and data science. But the argument for TypeScript agents is not about replacing Python for model training. It is about where agents actually run.
A customer support agent that handles 50,000 conversations per day needs WebSocket connections, streaming responses, and session management. A sales agent that qualifies leads needs to integrate with CRM APIs, send emails, and update databases in real time. A coding agent embedded in VS Code runs in a Node.js process.
These are all JavaScript-native environments. Running a Python subprocess, managing a virtual environment, and serializing data across the language boundary adds friction that TypeScript agents eliminate entirely.
Mastra Deep Dive: What Makes It Different
Mastra is not just another wrapper around LLM APIs. It is an opinionated framework that handles the infrastructure AI agents need in production.
Agents and Tool Calling
A Mastra agent is a TypeScript class that connects an LLM to a set of tools. You define tools as typed functions:
```typescript
import { Agent, createTool } from '@mastra/core';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const weatherTool = createTool({
  id: 'get-weather',
  description: 'Get current weather for a city',
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ context }) => {
    const res = await fetch(`https://api.weather.gov/...`);
    return res.json();
  },
});

const agent = new Agent({
  name: 'weather-agent',
  model: openai('gpt-4o'),
  tools: { weatherTool },
  instructions: 'You help users check weather conditions.',
});
```
The agent decides which tools to call, in which order, and when to stop. This is standard agentic behavior, but Mastra handles it with full TypeScript type safety, which means your IDE catches mismatched schemas before runtime.
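That decision process is a loop: on each turn the model either requests a tool call or emits a final answer, and the runtime executes requested tools and feeds results back. A framework-free sketch with a scripted stand-in model (the mock model and tool here are illustrations, not Mastra internals):

```typescript
// Minimal agentic loop with a mocked model: the model either asks for a tool
// call or returns a final answer; the runtime executes tools until it stops.
type ToolCall = { tool: string; args: Record<string, string> };
type ModelTurn = { toolCall?: ToolCall; answer?: string };

const toolRegistry: Record<string, (args: Record<string, string>) => string> = {
  "get-weather": (args) => `Sunny in ${args.city}`,
};

// Scripted model: first turn requests the tool, second turn answers.
function mockModel(history: string[]): ModelTurn {
  if (!history.some((m) => m.startsWith("tool:"))) {
    return { toolCall: { tool: "get-weather", args: { city: "Paris" } } };
  }
  return { answer: history[history.length - 1].replace("tool:", "") };
}

function runAgent(prompt: string): string {
  const history = [prompt];
  for (let step = 0; step < 5; step++) { // cap iterations to avoid runaway loops
    const turn = mockModel(history);
    if (turn.answer) return turn.answer;
    const result = toolRegistry[turn.toolCall!.tool](turn.toolCall!.args);
    history.push(`tool:${result}`); // feed the tool's output back to the model
  }
  return "max steps reached";
}

const reply = runAgent("What's the weather in Paris?");
// reply === "Sunny in Paris"
```

A real framework adds schema validation on `args` (which is where Mastra's zod-typed tools come in) and a real LLM in place of `mockModel`, but the control flow is the same.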
Workflows for Complex Pipelines
Where Mastra pulls ahead of simpler agent wrappers is its workflow engine. Workflows let you define multi-step processes as directed graphs with branching, parallel execution, and human-in-the-loop suspension:
```typescript
const onboardingWorkflow = new Workflow({ name: 'customer-onboarding' })
  .step(validateInput)
  .then(enrichFromCRM)
  .branch([
    [isEnterprise, assignDedicatedRep],
    [isStartup, addToSelfServe],
  ])
  .then(sendWelcomeEmail)
  .commit();
```
Workflows can pause, wait for human approval, and resume hours later without losing state. For teams building agents that handle real business processes (not just chat), this is the feature that matters.
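The mechanics behind pause-and-resume are worth making concrete: the workflow's position and data are serialized at the suspension point, and resuming re-hydrates that state instead of re-running earlier steps. A framework-free sketch (not Mastra's actual persistence API):

```typescript
// Durable workflow sketch: state is snapshotted at a suspension point so the
// run can resume later (even in a new process) without redoing earlier steps.
type WorkflowState = { step: number; data: Record<string, unknown> };

const steps: Array<(d: Record<string, unknown>) => Record<string, unknown> | "suspend"> = [
  (d) => ({ ...d, validated: true }),
  (d) => (d.approved ? d : "suspend"), // human-in-the-loop approval gate
  (d) => ({ ...d, emailSent: true }),
];

function runWorkflow(state: WorkflowState): { state: WorkflowState; suspended: boolean } {
  while (state.step < steps.length) {
    const out = steps[state.step](state.data);
    if (out === "suspend") return { state, suspended: true }; // snapshot here
    state = { step: state.step + 1, data: out };
  }
  return { state, suspended: false };
}

const first = runWorkflow({ step: 0, data: {} });
// first.suspended === true; persist JSON.stringify(first.state) somewhere durable

const saved = JSON.parse(JSON.stringify(first.state)) as WorkflowState;
saved.data.approved = true; // hours later, a human approves
const second = runWorkflow(saved);
// second.suspended === false; second.state.data.emailSent === true
```

Because the snapshot is plain data, nothing about the resume requires the original process to still be alive, which is what makes multi-day, approval-gated workflows practical.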
Memory and RAG
Mastra supports both short-term memory (within a conversation thread) and long-term memory (across sessions). It also includes built-in RAG: you can sync data from APIs, databases, or web scraping into a vector store, then let agents query it at runtime.
This matters because most production agents need context beyond the conversation. A support agent needs to know your account history. A research agent needs to search your internal docs. Mastra handles the indexing, chunking, and retrieval pipeline without requiring a separate library.
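The retrieval side of that pipeline is simple to sketch: chunk documents, embed each chunk, and at query time return the nearest chunks by cosine similarity. The toy bag-of-words "embedding" below stands in for a real embedding model and vector store:

```typescript
// Toy RAG retrieval: embed (bag-of-words stand-in) → cosine similarity search.
const docs = [
  "Your account was upgraded to the enterprise plan in January.",
  "Refunds are processed within five business days.",
  "The API rate limit is 100 requests per minute.",
];

const vocab = Array.from(new Set(docs.join(" ").toLowerCase().split(/\W+/).filter(Boolean)));

function embed(text: string): number[] {
  const words = text.toLowerCase().split(/\W+/);
  return vocab.map((v) => words.filter((w) => w === v).length);
}

function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const na = Math.sqrt(a.reduce((s, x) => s + x * x, 0));
  const nb = Math.sqrt(b.reduce((s, x) => s + x * x, 0));
  return na && nb ? dot / (na * nb) : 0;
}

const index = docs.map((text) => ({ text, vector: embed(text) }));

function retrieve(query: string, k = 1): string[] {
  const q = embed(query);
  return index
    .map((e) => ({ text: e.text, score: cosine(q, e.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((e) => e.text);
}

const hits = retrieve("how fast are refunds processed");
// hits[0] is the refunds document
```

Swap `embed` for a real embedding model and `index` for a vector database and this is the shape of the pipeline Mastra packages up, with chunking and syncing layered on top.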
Mastra Studio
The local dev playground, Mastra Studio, is genuinely useful. It visualizes agent workflows, lets you test tool calls interactively, and shows execution traces step by step. If you have used LangSmith for Python agents, think of Studio as the TypeScript equivalent that runs locally without needing a cloud service.
Choosing Between Mastra, Vercel AI SDK, and LangGraph.js
The decision depends on what you are building, not which framework has more stars.
Choose Mastra when you are building a standalone AI product from scratch. You need agents, workflows, RAG, and memory in one cohesive package. You want an opinionated framework that makes architectural decisions for you. Your team is comfortable with a newer framework backed by Y Combinator and the Gatsby team. Companies like PayPal, Adobe, and Docker are already running Mastra in production.
Choose Vercel AI SDK when you are adding AI capabilities to an existing Next.js or React application. You need streaming chat UIs, structured outputs, or multi-provider model routing. You do not need a full agent framework, just a solid integration layer. The Vercel AI SDK recently released version 6 with improved agent primitives, but its sweet spot remains UI-centric AI features.
Choose LangGraph.js when you need fine-grained control over complex, multi-step agent workflows. Your team already uses LangChain in Python and wants feature parity in TypeScript. You need enterprise-grade observability through LangSmith. Compliance requirements demand that every agent decision is logged and auditable. This is the same architecture that Klarna, Replit, and Elastic use for their production agents.
| Feature | Mastra | Vercel AI SDK | LangGraph.js |
|---|---|---|---|
| Agent framework | Full | Basic (v6) | Full |
| Workflow engine | Built-in | No | Graph-based |
| RAG pipeline | Built-in | No | Via LangChain |
| Memory system | Short + long term | No | Checkpointing |
| Dev playground | Mastra Studio | No | LangSmith (cloud) |
| npm weekly downloads | 300K+ | 2.8M+ | 529K+ |
| Best for | Standalone AI products | AI features in Next.js | Complex agent orchestration |
Mastra vs LangChain: The Python-to-TypeScript Question
The “Mastra vs LangChain” comparison is the most common question TypeScript developers ask when entering the agent space, and the answer depends on which LangChain product you mean.
LangChain (the Python library) has the largest agent ecosystem by every measure: 100+ LLM provider integrations, 50+ vector store connectors, and hundreds of prebuilt tools. Around 400 companies run LangGraph Platform in production, including Cisco, Uber, LinkedIn, and JPMorgan. If your team is Python-first, LangChain’s ecosystem depth is hard to match.
LangGraph.js (the TypeScript port) brings full feature parity with LangGraph’s Python version. But LangGraph.js is a runtime for defining agent logic as graphs, not a full application framework. You still need to wire up your own RAG pipeline, memory system, and dev tooling.
Mastra is the framework you pick when you want one cohesive package. Replit used Mastra in Agent 3 and reported that task success rates improved from 80% to 96%. Marsh McLennan deployed a Mastra-based search tool to 75,000 employees. The trade-off: Mastra’s ecosystem is newer and smaller, with fewer third-party integrations.
The practical heuristic: if you need to connect to 30 data sources and want battle-tested production observability through LangSmith, use LangGraph.js. If you want the fastest path from zero to a working TypeScript agent product, use Mastra.
What This Means for the Agent Ecosystem
The TypeScript agent wave is not a language war. It is a sign that AI agents have crossed from research tooling into production software engineering.
When Gatsby’s core team pivots to building AI infrastructure, when Y Combinator backs a TypeScript agent framework at seed stage, when npm download numbers rival established web frameworks, the signal is clear: agents are becoming a standard part of the web stack.
For Python-first teams, this does not mean switching languages. LangGraph maintains full parity across both. But for the millions of JavaScript developers who have been building web applications for years, the barrier to entry for AI agents just dropped to zero. You do not need to learn Python, set up Conda, or figure out pip conflicts. You run `npm install @mastra/core` and start building.
The MCP protocol makes this even more interesting. Mastra already supports MCP, which means your TypeScript agents can connect to the same tool ecosystem that Python agents use. The language boundary for agent capabilities is dissolving.
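Concretely, MCP is JSON-RPC under the hood, which is why the same tool servers work from either language. A client discovers a server's tools with a `tools/list` request and invokes one with `tools/call`; the message shapes below follow the MCP specification (the `get-weather` tool name is a placeholder):

```typescript
// MCP wire format sketch: JSON-RPC 2.0 messages a client sends to a tool server.
const listRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list", // ask the server what tools it exposes
};

const callRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call", // invoke one of the advertised tools
  params: {
    name: "get-weather",          // placeholder tool name
    arguments: { city: "Paris" }, // must match the tool's declared input schema
  },
};
// A Python agent and a TypeScript agent emit these same bytes,
// which is why the tool ecosystem is shared across languages.
```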
Frequently Asked Questions
What is Mastra and who built it?
Mastra is a TypeScript framework for building AI agents and applications. It was created by the team behind Gatsby (the React static site generator), raised $13 million in seed funding from Y Combinator, and reached 300,000 weekly npm downloads and 22,000 GitHub stars before its stable 1.0 release.
Should I use TypeScript or Python for building AI agents?
It depends on your deployment environment. Python remains stronger for ML research and data science workflows. TypeScript is better suited for agents that need web integration: real-time UIs, streaming responses, WebSocket connections, and tight integration with existing web applications. Many teams use both, with LangGraph offering full feature parity across Python and TypeScript.
How does Mastra compare to Vercel AI SDK?
Mastra is a full agent framework with built-in workflows, RAG, memory, and a dev playground. Vercel AI SDK is an integration layer optimized for adding AI features to React/Next.js apps, with strengths in streaming UI and multi-provider model routing. Choose Mastra for standalone AI products; choose Vercel AI SDK for adding AI to existing web applications.
Does Mastra support MCP (Model Context Protocol)?
Yes. Mastra has built-in MCP support, allowing TypeScript agents to connect to the broader MCP tool ecosystem. This means agents built with Mastra can use the same tools and integrations available to Python-based agent frameworks.
How does Mastra compare to LangChain for AI agents?
LangChain has the larger ecosystem with 100+ LLM integrations, 50+ vector stores, and LangSmith for production observability. Mastra is TypeScript-first with built-in workflows, RAG, memory, and dev tooling in one package. LangGraph.js (LangChain’s TypeScript port) provides graph-based orchestration with full Python feature parity but requires more assembly. Mastra is faster for new TypeScript-native projects; LangChain is stronger when you need deep ecosystem integrations or Python interop.
What companies use Mastra in production?
Companies including PayPal, Adobe, Elastic, Docker, Replit, and WorkOS have either deployed Mastra in production or published case studies about their implementations. The framework’s Y Combinator backing and rapid adoption curve have made it a common choice for teams building TypeScript-native AI products.
