The current Model Context Protocol (MCP) spec is shifting developers toward lightweight, stateless servers that act as tool providers for LLM agents. These MCP servers communicate over HTTP, with OAuth handled client-side. Vercel's infrastructure makes it easy to iterate quickly and ship agentic AI tools without overhead.
Example of Lightweight MCP Server Design
At This Dot Labs, we built an MCP server that leverages the DocuSign Navigator API. The tools, like `get_agreements`, make a request to the DocuSign API to fetch data and then respond in an LLM-friendly way.
```typescript
// Get agreements tool that requires authentication
server.tool(
  'get_agreements',
  'Retrieve DocuSign Navigator agreements. Returns a list of all agreements available in the system with metadata like title, type, status, and parties.',
  {}, // No input parameters needed
  getAgreementsHandler
);

// Get agreement by ID tool that requires authentication
server.tool(
  'get_agreement_by_id',
  'Retrieve detailed information about a specific DocuSign Navigator agreement by its ID. Returns comprehensive details including title, type, status, summary, parties, provisions, metadata, and custom attributes. REQUIRED: agreementId parameter must be provided.',
  { agreementId: z.string().min(1, 'Agreement ID is required') },
  getAgreementByIdHandler
);
```
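To show what "respond in an LLM-friendly way" can look like in practice, here is a hypothetical sketch of a handler like `getAgreementsHandler`. The endpoint URL, response shape, and field names are assumptions for illustration, not the actual This Dot Labs implementation:

```typescript
type Agreement = { title: string; type: string; status: string };
type ToolResult = { content: Array<{ type: 'text'; text: string }> };

// Pure formatting step: turn raw agreement records into a short,
// readable summary instead of dumping raw JSON at the model
function formatAgreements(agreements: Agreement[]): ToolResult {
  const lines = agreements.map(a => `- ${a.title} (${a.type}, ${a.status})`);
  return {
    content: [{ type: 'text', text: lines.join('\n') || 'No agreements found.' }],
  };
}

// Hypothetical handler: fetches from a placeholder Navigator endpoint
// using the caller's OAuth token, then formats the result for the LLM
async function getAgreementsHandler(accessToken: string): Promise<ToolResult> {
  const res = await fetch('https://api.docusign.com/v1/agreements', {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  const body: any = await res.json();
  return formatAgreements(body.agreements ?? []);
}
```

The key design point is the separation: the fetch step is thin, and the formatting step decides what the model actually sees.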
Before the MCP server can serve any tool requests, it needs to guide the client on how to kick off OAuth. This means exposing the metadata endpoints defined by the MCP spec, which tell the client where to obtain authorization tokens and which resources those tokens grant access to. With these details, the client can initiate the OAuth process on its own, ensuring secure and efficient data access.
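As a concrete sketch, the protected resource metadata (per RFC 9728, which the MCP authorization spec builds on) is just a small JSON document. The URLs and scope below are placeholders, not the real deployment values:

```typescript
interface ProtectedResourceMetadata {
  resource: string;
  authorization_servers: string[];
  scopes_supported: string[];
  bearer_methods_supported: string[];
}

// Builds the document served at /.well-known/oauth-protected-resource;
// baseUrl is a placeholder for the deployed server's origin
function buildResourceMetadata(baseUrl: string): ProtectedResourceMetadata {
  return {
    // The MCP endpoint this metadata protects
    resource: `${baseUrl}/mcp`,
    // Where the client should look for the authorization server
    authorization_servers: [baseUrl],
    scopes_supported: ['signature'],
    // Tokens are passed in the Authorization header
    bearer_methods_supported: ['header'],
  };
}
```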
The OAuth flow begins when the user's LLM client makes a request without a valid auth token. In that case it gets a 401 response from our server with a WWW-Authenticate header, and the client uses the metadata we exposed to discover the authorization server. Next, the OAuth flow kicks off directly with DocuSign, as directed by the metadata. Once the client has a token, it passes it in the Authorization header on tool requests to the API.
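The 401 challenge is simple to sketch: the WWW-Authenticate header points the client at the discovery document so it can find the authorization server on its own. This is an illustrative example (the URL is a placeholder), not the server's actual response code:

```typescript
// Reject an unauthenticated request in a way MCP clients can recover from:
// the WWW-Authenticate header carries a pointer to the protected resource
// metadata, which the client uses to discover the authorization server
function buildUnauthorizedResponse(baseUrl: string): Response {
  return new Response(JSON.stringify({ error: 'unauthorized' }), {
    status: 401,
    headers: {
      'Content-Type': 'application/json',
      'WWW-Authenticate': `Bearer resource_metadata="${baseUrl}/.well-known/oauth-protected-resource"`,
    },
  });
}
```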
API Routes
```
├── Health & Monitoring
│   └── GET /health
│
├── OAuth 2.0 Discovery (.well-known)
│   ├── GET /.well-known/oauth-authorization-server
│   └── GET /.well-known/oauth-protected-resource
│
├── OAuth 2.0 Flow
│   ├── GET/POST /register
│   ├── GET /authorize
│   ├── POST /token
│   └── GET /auth/callback
│
└── MCP (Model Context Protocol)
    └── POST /mcp    (main endpoint)
```
This minimal set of API routes lets me fetch DocuSign Navigator data using natural language in my agent chat interface.
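To make the `/mcp` endpoint concrete: MCP over HTTP speaks JSON-RPC 2.0, so a tool call is just a POST with a JSON-RPC body (plus the Authorization header from the OAuth flow above). This helper is a sketch; the `id` and argument values are arbitrary examples:

```typescript
// Build a JSON-RPC 2.0 tools/call request body for the /mcp endpoint
function buildToolCallRequest(
  tool: string,
  args: Record<string, unknown>,
  id = 1
) {
  return {
    jsonrpc: '2.0' as const,
    id,
    method: 'tools/call',
    params: { name: tool, arguments: args },
  };
}

// Example body: POST this to /mcp with an Authorization: Bearer header
const body = buildToolCallRequest('get_agreement_by_id', { agreementId: 'abc-123' });
```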
Deployment Options
I deployed this MCP server two different ways: first as a Fastify backend, then as Vercel functions. Seeing how simple my Fastify MCP server was, and not yet having a deployment plan, I was eager to rewrite it for Vercel.
The case for Vercel:
- My own familiarity with Next.js API deployment
- A natural fit for this stateless, per-request architecture
- The extremely simple deployment process
- Deploy previews (the eternal Vercel customer conversion feature, IMO)
Previews of unfamiliar territory
Did you know that the MCP spec doesn't "just work" for use as ChatGPT tooling? Neither did I, and I had to experiment to prove out requirements I was unfamiliar with. Part of moving fast was deploying Vercel previews right from the CLI so I could test my API as a Connector in ChatGPT. This was a great workflow for me, and invaluable for the team in code review.
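One of those requirements, as I understand it: ChatGPT's connector integration expects the MCP server to expose `search` and `fetch` tools with particular result shapes (which is why the server code imports a `searchHandler` and `fetchHandler`). Below is a minimal sketch of mapping agreements to a search-result shape; the field names and URL pattern are assumptions for illustration, not an official contract:

```typescript
type SearchResult = { id: string; title: string; url: string };
type AgreementRecord = { id: string; title: string };

// Map Navigator agreements to the kind of result list a `search` tool
// returns; the URL pattern is a placeholder
function toSearchResults(
  agreements: AgreementRecord[],
  baseUrl: string
): SearchResult[] {
  return agreements.map(a => ({
    id: a.id,
    title: a.title,
    url: `${baseUrl}/agreements/${a.id}`,
  }));
}
```

A matching `fetch` tool would then take one of these `id`s and return the full agreement content. Preview deploys made it cheap to iterate on shapes like this against the real Connector.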
Stuff I'm Not Worried About
Vercel's mcp-handler package made setup effortless by abstracting away much of the complexity of implementing an MCP server. It gives you a drop-in way to define tools, set up streamable HTTP transport, and handle OAuth. By building on Vercel's ecosystem, I can focus entirely on shipping my product without worrying about deployment, scaling, or server management. Everything just works.
```typescript
import { createMcpHandler, withMcpAuth } from 'mcp-handler';
import { z } from 'zod';
import {
  authStatusHandler,
  getAgreementsHandler,
  getAgreementByIdHandler,
  searchHandler,
  fetchHandler,
} from '../lib/mcp/handlers/index.js';
import { createTokenVerifier } from '../lib/mcp/auth.js';

// Create the base MCP handler with both authenticated and non-authenticated tools
const handler = createMcpHandler(server => {
  // Get agreements tool that requires authentication
  server.tool(
    'get_agreements',
    'Retrieve DocuSign Navigator agreements. Returns a list of all agreements available in the system with metadata like title, type, status, and parties.',
    {}, // No input parameters needed
    getAgreementsHandler
  );
});

// Wrap the handler with authentication - all tools require valid authentication
const authHandler = withMcpAuth(handler, createTokenVerifier(), {
  required: true, // All tools require authentication - this triggers 401 responses
  requiredScopes: ['signature'], // Require at least the signature scope
  resourceMetadataPath: '/.well-known/oauth-protected-resource', // Custom metadata path
});

export { authHandler as GET, authHandler as POST };
```
A Brief Case for MCP on Next.js
Building an API without Next.js on Vercel is straightforward. Still, I'd be happy deploying this as a Next.js app, with the frontend serving as the documentation, or with the tools becoming part of your website's agentic capabilities. Overall, this lowers the barrier to building any MCP server you want for yourself, and I think that's cool.
Conclusion
I'll avoid quoting Vercel documentation in this post. AI tooling is a critical component of this natural language UI, and we just want to ship. Vercel is excellent for stateless MCP servers served over HTTP.