How to Turn Your API into an MCP Server

Transform your API into an MCP server using Stainless and OpenAPI specs. This guide covers setup, customization, and testing to enable AI-driven interactions with your API, making it accessible to Claude, Cursor, and more.

Ashley Goolam

25 July 2025

Ever wished your API could chat with AI agents like Claude or Cursor, turning your endpoints into smart, conversational tools? Well, buckle up, because we’re diving into how to turn your API into an MCP server using Stainless and an OpenAPI spec. This conversational guide will walk you through the process, from setup to deployment, with a test to prove it works. We’ll use the Model Context Protocol (MCP) to make your API AI-friendly, all in a fun, approachable way. Let’s get started!

💡
Want a great API Testing tool that generates beautiful API Documentation?

Want an integrated, All-in-One platform for your Developer Team to work together with maximum productivity?

Apidog delivers all your demands, and replaces Postman at a much more affordable price!

What’s an MCP Server, and Why Should You Care?

The Model Context Protocol (MCP) is like a universal handshake for AI systems. It’s a JSON-RPC-based standard that lets AI clients (like Claude Desktop, Cursor, or VS Code Copilot) interact with your API using natural language or programmable prompts. An MCP server acts as a bridge, translating your API’s endpoints into tools that AI agents can understand and use.
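
To make that concrete, here is a rough sketch of the kind of JSON-RPC request an MCP client sends when it invokes a tool. The tool name create_user and its arguments are illustrative placeholders, not part of any particular API:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "create_user",
    "arguments": { "name": "Alex", "email": "alex@example.com" }
  }
}

The MCP server's job is to map that request onto the matching HTTP call against your API and hand the result back to the client.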

Why turn your API into an MCP server? Because it lets AI agents discover your endpoints as callable tools and invoke them on a user's behalf, without custom integration code for every client. Whether you're building a payment platform, a content API, or a custom service, turning your API into an MCP server makes it smarter and more accessible.

How Does Stainless Fit In?

Stainless is a developer’s best friend for creating SDKs and now MCP servers from OpenAPI specs. Its experimental MCP server generation feature takes your OpenAPI definition and spits out a TypeScript subpackage that’s ready to roll as an MCP server. This means your API’s endpoints become AI-accessible tools without you breaking a sweat. Let’s see how to make it happen!

stainless official website

Turning Your API into an MCP Server with Stainless

Prerequisites

Before we dive in, ensure you have an OpenAPI spec for your API, a Stainless account, Node.js and npm installed, and an MCP client such as Claude Desktop or Cursor for testing.

Step 1: Test Your OpenAPI Spec with Apidog

It's worth validating your OpenAPI spec before (or even after) you turn it into an MCP server, and that's where Apidog comes in handy. Apidog's intuitive platform lets you import and test your OpenAPI spec to ensure your API's endpoints are ready for MCP integration. Here's how to do it:

  1. Visit Apidog and sign up or sign in.

2. Create a New Project and Import Your OpenAPI Spec:

upload file

3. Configure API Settings:

successful import

4. Add Endpoints and Test:

build your api

Testing with Apidog ensures your OpenAPI spec is solid, making the Stainless MCP generation process smoother and your MCP server more reliable.
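
If you want to follow along with the same example used later in this guide, here is a minimal, illustrative OpenAPI excerpt describing the POST /users endpoint that the test in Step 8 relies on. The exact fields and schema are assumptions made for the sake of the example:

openapi: 3.0.3
info:
  title: Example Users API
  version: 1.0.0
paths:
  /users:
    post:
      operationId: createUser
      summary: Create a new user profile
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [name, email]
              properties:
                name:
                  type: string
                email:
                  type: string
      responses:
        "201":
          description: User created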

Step 2: Set Up a Stainless Project with TypeScript

Create a Stainless Project:

create a new project

Enable MCP Server Generation:

add mcp sdk

Step 3: Configure MCP Server Generation

In your Stainless project settings, configure the MCP server options. Create or edit a configuration file (e.g., stainless.yaml) with:

targets:
  typescript:
    package_name: my-org-name
    production_repo: null
    publish:
      npm: false
    options:
      mcp_server:
        package_name: my-org-name-mcp
        enable_all_resources: true

This tells Stainless to generate an MCP server subpackage that implements your API’s endpoints as AI-accessible tools.
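
For context, an MCP client sees each generated tool as a name, a description, and a JSON Schema for its inputs. Here is a hedged sketch of what a create-user tool might look like in the server's tool listing; the exact naming and schema Stainless generates may differ:

{
  "name": "create_user",
  "description": "Creates a new user profile with name and email.",
  "inputSchema": {
    "type": "object",
    "required": ["name", "email"],
    "properties": {
      "name": { "type": "string" },
      "email": { "type": "string" }
    }
  }
}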

Step 4: Customize Endpoint Exposure and Tool Descriptions

By default, all endpoints in your OpenAPI spec become MCP tools. To customize:

  1. Select Specific Endpoints:
resources:
  users:
    mcp: true
    methods:
      create:
        mcp: true
  orders:
    methods:
      create:
        mcp: true
        endpoint: post /v1/orders

2. Fine-Tune Tool Metadata:

resources:
  users:
    methods:
      create:
        mcp:
          tool_name: create_user
          description: Creates a new user profile with name and email.

This ensures your MCP server exposes only the endpoints you want, with clear, AI-friendly descriptions.

Step 5: Handle Large APIs with Tool Filtering and Dynamic Tools

For APIs with many endpoints (>50), exposing each as a separate tool can overwhelm an AI’s context window. Use these strategies:

  1. Tool Filtering:
npx -y my-org-name-mcp --resource=users

2. Dynamic Tools Mode:

npx -y my-org-name-mcp --tools=dynamic

Dynamic tools let the AI discover and call endpoints dynamically, reducing context overload.
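
Either flag can also be baked into your MCP client configuration instead of being typed on the command line. Here is a hedged sketch for Claude Desktop, reusing the config format shown later in Step 7 together with the --tools=dynamic flag from above (the API key value is a placeholder):

{
  "mcpServers": {
    "my_org_api": {
      "command": "npx",
      "args": ["-y", "my-org-name-mcp", "--tools=dynamic"],
      "env": {
        "MY_API_KEY": "your-api-key"
      }
    }
  }
}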

Step 6: Build and Publish Your MCP Server

Build the MCP Server:
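
The exact commands depend on how Stainless lays out the generated subpackage; assuming it follows standard npm and TypeScript conventions (an assumption, so check the README that ships with the generated code), the build step looks roughly like this:

# install dependencies and compile the TypeScript sources (directory and script names are assumptions)
cd my-org-name-mcp
npm install
npm run build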

Publish to npm:

npm publish
publish

Step 7: Install and Configure for MCP Clients

After publishing, install your MCP server package locally or remotely for use with AI clients. For Claude Desktop:

  1. Install the Package:
npm install my-org-name-mcp

2. Configure Claude Desktop by adding your server to its claude_desktop_config.json file:

edit claude configuration
{
  "mcpServers": {
    "my_org_api": {
      "command": "npx",
      "args": ["-y", "my-org-mcp"],
      "env": {
        "MY_API_KEY": "123e4567-e89b-12d3-a456-426614174000"
      }
    }
  }
}

3. Other Clients: Cursor, VS Code Copilot, and other MCP clients accept a very similar configuration; see the sketch below.

cursor tools and integrations
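
For Cursor, the configuration format is essentially the same. Here is a hedged sketch for a project-level .cursor/mcp.json (check Cursor's current documentation for the exact file location and name; the API key value is a placeholder):

{
  "mcpServers": {
    "my_org_api": {
      "command": "npx",
      "args": ["-y", "my-org-name-mcp"],
      "env": {
        "MY_API_KEY": "your-api-key"
      }
    }
  }
}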

Step 8: Test Your MCP Server

Let’s test your MCP server! In Claude Desktop (or another MCP client), try this prompt:

Using the MCP server, create a new user with name "Alex" and email "alex@example.com"

If your API has a POST /users endpoint (as defined in your OpenAPI spec), the MCP server will translate this prompt into an API call, creating a user and returning a response like:

User created: { "name": "Alex", "email": "alex@example.com", "id": "123" }

This confirms your MCP server is working and ready for AI-driven interactions.
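
Under the hood, the MCP server translates that tool call into an ordinary HTTP request against your API, roughly equivalent to the following (the base URL and auth header are placeholders for whatever your OpenAPI spec actually defines):

# placeholder base URL and auth scheme; the real values come from your spec and SDK settings
curl -X POST https://api.example.com/users \
  -H "Authorization: Bearer $MY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "Alex", "email": "alex@example.com"}'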

Troubleshooting Tips

If the AI client doesn't list any tools, double-check that mcp_server is enabled in your Stainless config and that the package built and published cleanly. If tool calls fail with authentication errors, confirm the API key environment variable (MY_API_KEY in the example above) is set in the client configuration. If the client seems overwhelmed by too many tools, revisit Step 5 and narrow the list with resource filtering or dynamic tools. And if generated tools have the wrong parameters, go back to Step 1 and re-validate your OpenAPI spec in Apidog.

Best Practices for MCP Servers

Expose only the endpoints an AI agent genuinely needs rather than your entire API surface, and give each tool a clear tool_name and description so the model knows when to use it. Keep your OpenAPI spec accurate and tested, since the generated tools are only as good as the spec behind them, and reach for tool filtering or dynamic tools as your API grows.

Conclusion

And that’s a wrap! You’ve just learned how to turn your API into an MCP server using Stainless, transforming your OpenAPI spec into an AI-ready powerhouse. From configuring endpoints to testing with a user creation prompt, this guide makes it easy to bridge your API with AI agents like Claude or Cursor. Whether you’re enhancing a small project or scaling a production API, the MCP server is your ticket to smarter, conversational integrations.

Ready to try it? Grab your OpenAPI spec, fire up Stainless, and let your API shine in the AI world.

💡
Want a great API Testing tool that generates beautiful API Documentation?

Want an integrated, All-in-One platform for your Developer Team to work together with maximum productivity?

Apidog delivers all your demands, and replaces Postman at a much more affordable price!
