Using OpenAI Response API for MCP Integration
This guide demonstrates how to use the OpenAI Response API to integrate with the Model Context Protocol (MCP) for seamless, AI-driven API interactions. You will learn how to generate a REST API specification with Postman's AI Agent, deploy it as an MCP server using HAPI Server, and connect it through OpenAI's Response API for testing.
The goal of this demo is to show how OpenAI's new Tools and Connectors for MCP feature simplifies integration between large language models and your APIs, making it possible to interact with services such as a "Taco Ordering System" directly from an LLM client like ChatGPT or a Bun-based application.
By the end of this guide, you will have a working MCP server derived from a Swagger specification, exposed via ngrok, and connected to the OpenAI Response API for end-to-end interaction testing.
Prerequisites
Before you begin:

- Install the following tools:
  - Bun – A fast JavaScript runtime for modern apps. (Node.js also works if you prefer.)
  - ngrok – To expose your local server to the internet.
  - HAPI Server – To host your MCP server.
  - OpenAI SDK – For client testing.
- Obtain or install the Postman Desktop Agent to generate APIs using AI.
  - Alternatively, you can ask ChatGPT to generate a Swagger (OpenAPI v3) specification.
The example below uses a local environment and ngrok for simplicity, but you can deploy your server on any cloud instance or containerized environment.
Step 1: Generate an API Using Postman Agent
You can use Postman's AI Agent to quickly generate an API for any use case. In this example, you will create a simple Taco Ordering System.

- Open Postman's AI Agent and prompt:

  ```
  Generate an API for a Taco ordering system with endpoints to view tacos and place an order.
  ```

- The agent returns a Swagger (OpenAPI) specification for your taco API.
- Review the generated specification to confirm it includes paths for:
  - `GET /tacos`
  - `POST /order`
- Save the file in your `HAPI_HOME/specs` directory as `tacos.yaml` for use in the next step. (JSON format is also acceptable.)
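The generated specification will vary from run to run, but a minimal OpenAPI 3 document for this use case might look roughly like the following. This is an illustrative sketch only; the titles, field names, and schemas here are assumptions, not the agent's exact output.

```yaml
openapi: "3.0.3"
info:
  title: Taco Ordering System   # illustrative title
  version: "1.0.0"
paths:
  /tacos:
    get:
      summary: List available tacos
      responses:
        "200":
          description: A list of tacos
  /order:
    post:
      summary: Place a taco order
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                taco: { type: string }       # assumed field name
                quantity: { type: integer }  # assumed field name
      responses:
        "201":
          description: Order confirmation
```

Whatever the agent produces, confirm the two paths exist before moving on; HAPI Server derives the MCP tools from these operations.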
Step 2: Start an MCP Server Using HAPI Server
Use the HAPI Server to host your Swagger API as an MCP server. This allows LLM-based clients such as ChatGPT to call your API directly.
- Connect to your instance and run the following command to start the MCP server:

  ```shell
  bunx @la-rebelion/hapimcp serve tacos --headless --chatgpt
  ```

  Alternatively, you can use the native CLI (download it from the GitHub releases) and run:

  ```shell
  hapi serve tacos --headless --chatgpt
  ```

- Once running, expose the local server to the internet using ngrok:

  ```shell
  ngrok http 8000
  ```

- Copy the public URL generated by ngrok (e.g., `https://abcd1234.ngrok.io`).
Step 3: Connect ChatGPT or Another Client to the MCP Server
- Open ChatGPT's settings and add a new connector with the ngrok URL you copied earlier.
- Enable the connector. If the feature is not visible, note that MCP connectors in ChatGPT may still be in beta.
- Alternatively, you can connect directly from an MCP-compatible client using the OpenAI SDK.
Step 4: Test the MCP Server With OpenAI SDK
You can test the endpoint by sending a prompt that interacts with your Taco API.
- Initialize a new Bun project:

  ```shell
  mkdir taco-client && cd taco-client
  bun init
  ```

- Install dependencies:

  ```shell
  bun install openai
  ```

- Create an `index.ts` file and configure your endpoint:

  ```typescript
  import OpenAI from "openai";

  const client = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY,
    baseURL: "https://abcd1234.ngrok.io/v1"
  });

  const response = await client.chat.completions.create({
    model: "gpt-4.1",
    messages: [{ role: "user", content: "What tacos do you have on the menu?" }]
  });

  console.log(response.choices[0].message);
  ```

- Run the client:

  ```shell
  bun run index.ts
  ```
You should see a response such as:
We have "carne asada" and "al pastor" tacos available.
Step 5: Place an Order Through the API
Now that your MCP server is connected, you can test the ordering functionality.
Update the `messages` array in `index.ts` with a new prompt:

```typescript
messages: [
  { role: "user", content: "Place an order for 2 tacos al pastor and 1 carne asada." }
]
```

Then run the client again:

```shell
bun run index.ts
```
The client confirms your order:
Order placed successfully!
You ordered 2 tacos al pastor and 1 carne asada.
Demo Video
Conclusion
You successfully:
- Generated a Swagger API with Postman's AI Agent.
- Deployed it as an MCP server using HAPI Server.
- Exposed the service securely with ngrok.
- Connected and interacted with it using ChatGPT and Bun.
This setup allows you to prototype API-first AI integrations quickly. You can expand this demo to connect more endpoints or integrate it into a cluster for scalability.
To continue exploring, deploy your HAPI MCP server on cloud instances or integrate it into your existing CI/CD pipeline for real-world workloads.
Be HAPI, and go Rebels! ✊🏽
