QUICK INFO
| Difficulty | Beginner to Intermediate |
| Time Required | 60-90 minutes for basic app; 2-3 hours with auth + monetization |
| Prerequisites | Node.js 18+, basic JavaScript/TypeScript, ChatGPT account (Free plan works) |
| Tools Needed | Code editor, ngrok or similar tunnel tool, terminal access |
What You'll Learn:
- Set up an MCP server that ChatGPT can communicate with
- Build interactive widgets that render inside ChatGPT conversations
- Add OAuth authentication for user-specific features
- Implement checkout flows for monetizing your app
This guide walks through the entire process of building a ChatGPT app using OpenAI's Apps SDK. By the end you'll have a working app that ChatGPT can invoke, that renders a custom UI widget, and that (optionally) handles user authentication and payments. The Apps SDK is in preview as of late 2025, with public app submissions expected to open later.
You should have some familiarity with JavaScript or TypeScript. If you've built a basic Express server or worked with React, you'll be fine. Python is also supported, but this guide focuses on Node.js.
What You're Actually Building
The Apps SDK sits on top of the Model Context Protocol (MCP), which is just a standardized way for ChatGPT to discover and call external tools. Your app has two parts: an MCP server that defines what tools are available and handles the logic, and a web component (widget) that renders UI inside ChatGPT's iframe sandbox.
When a user says something like "show me my tasks" to ChatGPT, the model decides your tool is relevant, calls your MCP server with structured arguments, and your server returns data that ChatGPT renders using your widget. The widget communicates back through window.openai for subsequent interactions.
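Under the hood this is ordinary JSON-RPC over HTTP. A tool-call exchange looks roughly like the sketch below; the field values are illustrative, but the `tools/call` method name, `content`, and `structuredContent` shapes match what the MCP SDK produces:

```javascript
// Illustrative MCP tools/call exchange (values are made up for this sketch).
// ChatGPT sends a JSON-RPC request to your /mcp endpoint:
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "say_hello",
    arguments: { name: "Ada" },
  },
};

// Your server replies with text content for the model, plus optional
// structuredContent that the widget later reads via window.openai.toolOutput:
const response = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    content: [{ type: "text", text: "Hello, Ada!" }],
    structuredContent: { greeting: "Hello, Ada!" },
  },
};
```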
Getting Started
Enable Developer Mode in ChatGPT
Before writing any code, you need developer mode enabled. Go to Settings → Apps & Connectors → Advanced settings and toggle Developer Mode on. This lets you create connectors to your own MCP servers for testing. Enterprise and Business accounts may need admin approval.
Project Setup
Create a new directory and initialize it:
mkdir my-chatgpt-app
cd my-chatgpt-app
npm init -y
Install the MCP SDK and Zod for schema validation:
npm install @modelcontextprotocol/sdk zod
Add "type": "module" to your package.json since we're using ES modules.
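After these steps your package.json should look roughly like this (the version numbers are illustrative; use whatever npm installed):

```json
{
  "name": "my-chatgpt-app",
  "version": "0.1.0",
  "type": "module",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "@modelcontextprotocol/sdk": "^1.0.0",
    "zod": "^3.23.0"
  }
}
```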
Create Your First MCP Server
Create server.js:
import { createServer } from "node:http";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";
import { z } from "zod";

const port = process.env.PORT ?? 8787;

function createAppServer() {
  const server = new McpServer({ name: "my-app", version: "0.1.0" });

  // Register a simple tool
  server.registerTool(
    "say_hello",
    {
      title: "Say Hello",
      description: "Returns a greeting message.",
      inputSchema: { name: z.string().optional() },
    },
    async (args) => ({
      content: [{ type: "text", text: `Hello, ${args?.name || "world"}!` }],
    })
  );

  return server;
}

const httpServer = createServer(async (req, res) => {
  const url = new URL(req.url, `http://${req.headers.host}`);

  // CORS preflight
  if (req.method === "OPTIONS" && url.pathname === "/mcp") {
    res.writeHead(204, {
      "Access-Control-Allow-Origin": "*",
      "Access-Control-Allow-Methods": "POST, GET, DELETE, OPTIONS",
      "Access-Control-Allow-Headers": "content-type, mcp-session-id",
      "Access-Control-Expose-Headers": "Mcp-Session-Id",
    });
    res.end();
    return;
  }

  // Health check
  if (req.method === "GET" && url.pathname === "/") {
    res.writeHead(200).end("OK");
    return;
  }

  // MCP endpoint
  if (url.pathname === "/mcp") {
    res.setHeader("Access-Control-Allow-Origin", "*");
    res.setHeader("Access-Control-Expose-Headers", "Mcp-Session-Id");

    const server = createAppServer();
    const transport = new StreamableHTTPServerTransport({
      sessionIdGenerator: undefined,
      enableJsonResponse: true,
    });

    res.on("close", () => {
      transport.close();
      server.close();
    });

    await server.connect(transport);
    await transport.handleRequest(req, res);
    return;
  }

  res.writeHead(404).end("Not Found");
});

httpServer.listen(port, () => {
  console.log(`MCP server running at http://localhost:${port}/mcp`);
});
Run it with node server.js. You should see the server start on port 8787.
Expose Your Server to the Internet
ChatGPT needs to reach your server over HTTPS. During development, ngrok is the standard approach:
ngrok http 8787
You'll get a URL like https://abc123.ngrok.app. The MCP endpoint is at https://abc123.ngrok.app/mcp.
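Before wiring the connector into ChatGPT, you can smoke-test the endpoint yourself. The payload below is a standard JSON-RPC `initialize` request; the `protocolVersion` string is an assumption here, so use whatever version your SDK reports:

```javascript
// Sketch of a manual smoke test for your /mcp endpoint. The protocolVersion
// value is an assumption for illustration, not a guaranteed current version.
const initializeRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2025-03-26",
    capabilities: {},
    clientInfo: { name: "smoke-test", version: "0.0.1" },
  },
};

// To actually send it against the running server from above:
//   await fetch("http://localhost:8787/mcp", {
//     method: "POST",
//     headers: {
//       "content-type": "application/json",
//       accept: "application/json, text/event-stream",
//     },
//     body: JSON.stringify(initializeRequest),
//   });
```

A non-error JSON response with the server's name and capabilities tells you the transport is wired up before ChatGPT ever touches it.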
Connect to ChatGPT
Go to Settings → Apps & Connectors and click Create. Enter your ngrok URL with /mcp appended, give it a name, select "No authentication" for now, and check "I trust this application."
Open a new chat, click the + button, select More, and add your connector. Then type something like "say hello to me" and ChatGPT should invoke your tool.
Building Interactive Widgets
A text response is fine, but the real power is custom UI. Let's build a widget.
Create the Widget HTML
Create public/widget.html:
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>My Widget</title>
    <style>
      body {
        font-family: system-ui, sans-serif;
        margin: 0;
        padding: 16px;
        background: #f6f8fb;
      }
      .card {
        background: white;
        border-radius: 12px;
        padding: 20px;
        max-width: 400px;
        margin: 0 auto;
        box-shadow: 0 4px 12px rgba(0,0,0,0.08);
      }
      button {
        background: #111bf5;
        color: white;
        border: none;
        border-radius: 8px;
        padding: 10px 20px;
        cursor: pointer;
        font-weight: 600;
      }
    </style>
  </head>
  <body>
    <div class="card">
      <h2 id="greeting">Loading...</h2>
      <p id="message"></p>
      <button id="refresh-btn">Refresh</button>
    </div>
    <script type="module">
      const greetingEl = document.getElementById('greeting');
      const messageEl = document.getElementById('message');
      const refreshBtn = document.getElementById('refresh-btn');

      function updateFromOutput() {
        const output = window.openai?.toolOutput;
        if (output) {
          greetingEl.textContent = output.greeting || 'Hello!';
          messageEl.textContent = output.message || '';
        }
      }

      // Initial render from tool output
      updateFromOutput();

      // Listen for subsequent tool responses
      window.addEventListener('openai:set_globals', () => {
        updateFromOutput();
      });

      // Call another tool from the widget
      refreshBtn.addEventListener('click', async () => {
        if (window.openai?.callTool) {
          const response = await window.openai.callTool('get_status', {});
          if (response?.structuredContent) {
            greetingEl.textContent = response.structuredContent.greeting;
            messageEl.textContent = response.structuredContent.message;
          }
        }
      });
    </script>
  </body>
</html>
Update the Server to Serve the Widget
Update server.js to register the widget as a resource and wire it to your tools:
import { createServer } from "node:http";
import { readFileSync } from "node:fs";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";
import { z } from "zod";

const widgetHtml = readFileSync("public/widget.html", "utf8");
const port = process.env.PORT ?? 8787;

function createAppServer() {
  const server = new McpServer({ name: "my-app", version: "0.1.0" });

  // Register the widget as a resource
  server.registerResource(
    "widget",
    "ui://widget/main.html",
    {},
    async () => ({
      contents: [{
        uri: "ui://widget/main.html",
        mimeType: "text/html+skybridge",
        text: widgetHtml,
      }],
    })
  );

  // Tool that renders the widget
  server.registerTool(
    "show_dashboard",
    {
      title: "Show Dashboard",
      description: "Displays the interactive dashboard widget.",
      inputSchema: {},
      _meta: {
        "openai/outputTemplate": "ui://widget/main.html",
      },
    },
    async () => ({
      content: [{ type: "text", text: "Here's your dashboard." }],
      structuredContent: {
        greeting: "Welcome back!",
        message: "Everything is running smoothly.",
      },
    })
  );

  // Tool the widget can call for updates
  server.registerTool(
    "get_status",
    {
      title: "Get Status",
      description: "Returns current status.",
      inputSchema: {},
      _meta: {
        "openai/outputTemplate": "ui://widget/main.html",
      },
    },
    async () => ({
      content: [],
      structuredContent: {
        greeting: "Status Update",
        message: `Last checked: ${new Date().toLocaleTimeString()}`,
      },
    })
  );

  return server;
}
// ... rest of the HTTP server code remains the same
The _meta["openai/outputTemplate"] tells ChatGPT to render your widget when this tool returns. The structuredContent object becomes available as window.openai.toolOutput in your widget.
Restart your server and refresh your connector in ChatGPT settings (click into your connector and hit Refresh). Ask ChatGPT to "show my dashboard" and you should see the widget render inline.
Adding User Authentication
If your app needs to access user-specific data or perform actions on their behalf, you'll need authentication. The Apps SDK uses OAuth 2.1 with PKCE.
The Authentication Flow
When ChatGPT tries to call a protected tool, it initiates OAuth: the user gets redirected to your authorization endpoint, logs in, grants consent, and ChatGPT receives an access token. From then on, ChatGPT includes that token in the Authorization header of all MCP requests.
Your server must validate the token on every request. ChatGPT doesn't validate tokens for you; it just passes them along.
What You Need
Setting up OAuth requires an authorization server. You can use Auth0, Okta, your own implementation, or any provider that supports OAuth 2.1 with PKCE and Dynamic Client Registration (DCR). I won't walk through the full identity provider setup here since it varies by provider, but here's what ChatGPT expects:
Your server needs to expose a /.well-known/oauth-protected-resource endpoint that points to your authorization server's metadata. ChatGPT reads this to discover your auth endpoints.
{
  "resource": "https://your-mcp-server.com",
  "authorization_servers": ["https://your-auth-server.com"]
}
Your authorization server needs /.well-known/oauth-authorization-server with authorization_endpoint, token_endpoint, and registration_endpoint (for DCR).
Add https://chatgpt.com/connector_platform_oauth_redirect to your allowed redirect URIs.
Token Validation
In your MCP handler, extract and validate the token:
function validateToken(authHeader) {
  if (!authHeader?.startsWith("Bearer ")) {
    return null;
  }
  const token = authHeader.slice(7);
  // Verify the JWT signature against your auth server's JWKS
  // Check iss, aud, exp, and scopes
  // Return the decoded token, or null if anything fails
}

// In your request handler:
const token = validateToken(req.headers.authorization);
if (!token && toolRequiresAuth) {
  res.writeHead(401, { "WWW-Authenticate": "Bearer" });
  res.end();
  return;
}
I'm glossing over the JWT validation details because they depend heavily on your auth provider. The point is that your server bears full responsibility for rejecting bad tokens.
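As a concrete sketch of just the claims checks, here's one way to decode a JWT payload and validate `iss`, `aud`, and `exp`. The issuer and audience values are assumptions for illustration, and this deliberately skips signature verification, which you must still do against your provider's JWKS (for example with the `jose` library):

```javascript
// Sketch: decode a JWT payload and check basic claims. This does NOT verify
// the signature -- in production, verify against your auth server's JWKS
// before trusting any claim. Issuer/audience values below are placeholders.
const EXPECTED_ISSUER = "https://your-auth-server.com";
const EXPECTED_AUDIENCE = "https://your-mcp-server.com";

function checkClaims(token, now = Math.floor(Date.now() / 1000)) {
  const parts = token.split(".");
  if (parts.length !== 3) return null;
  let claims;
  try {
    // The payload is the base64url-encoded middle segment of the JWT.
    claims = JSON.parse(Buffer.from(parts[1], "base64url").toString("utf8"));
  } catch {
    return null;
  }
  if (claims.iss !== EXPECTED_ISSUER) return null;
  if (claims.aud !== EXPECTED_AUDIENCE) return null;
  if (typeof claims.exp !== "number" || claims.exp <= now) return null;
  return claims;
}
```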
Monetizing Your App
Here's where things get interesting (and also a bit limited, at least for now).
Current State of Monetization
OpenAI recommends external checkout as the primary monetization approach. This means directing users out of ChatGPT to your own website to complete purchases. You handle pricing, payments, taxes, and fulfillment on your domain.
There's also Instant Checkout, which lets users pay without leaving ChatGPT, but it's currently in private beta and limited to select marketplace partners. Physical goods are supported first, with broader commerce use cases to follow.
External Checkout Implementation
The external checkout flow is straightforward: your widget displays purchasable items, and when the user clicks buy, you redirect them to your checkout page.
// In your widget
const buyBtn = document.getElementById('buy-btn');

buyBtn.addEventListener('click', () => {
  if (window.openai?.openExternal) {
    // openExternal takes an object with an href property
    window.openai.openExternal({ href: 'https://your-site.com/checkout?item=premium' });
  } else {
    window.open('https://your-site.com/checkout?item=premium', '_blank');
  }
});
After purchase, the user returns to ChatGPT with their new entitlements. Your server checks their subscription status on subsequent tool calls.
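That server-side entitlement check can be as simple as a lookup keyed on the authenticated user. Everything in this sketch (the store shape, the plan names, the checkout URL) is illustrative; a real app would query your database or billing provider:

```javascript
// Illustrative entitlement gate for a tool handler. In production the lookup
// would hit your database or billing provider, keyed by the user ID taken
// from the validated OAuth token.
const entitlements = new Map([
  ["user_123", { plan: "premium" }],
]);

function gatePremiumFeature(userId) {
  const record = entitlements.get(userId);
  if (record?.plan === "premium") {
    return { allowed: true };
  }
  // Point free users at external checkout instead of failing silently.
  return {
    allowed: false,
    upgradeUrl: "https://your-site.com/checkout?item=premium",
  };
}
```

Returning the upgrade URL lets the tool respond with a useful message ("upgrade here") rather than a bare error when a free user hits a paid feature.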
Instant Checkout (Beta)
If you're building for physical goods and want to explore Instant Checkout, here's how it works. Your widget prepares a checkout session and calls window.openai.requestCheckout():
async function handleCheckout() {
  const session = {
    id: "session_" + Date.now(),
    payment_provider: {
      provider: "stripe", // or "adyen"
      merchant_id: "your_merchant_id",
      supported_payment_methods: ["card", "apple_pay", "google_pay"],
    },
    status: "ready_for_payment",
    currency: "USD",
    totals: [
      { type: "subtotal", display_text: "Subtotal", amount: 2500 },
      { type: "tax", display_text: "Tax", amount: 200 },
      { type: "total", display_text: "Total", amount: 2700 },
    ],
    links: [
      { type: "terms_of_use", url: "https://your-site.com/terms" },
      { type: "privacy_policy", url: "https://your-site.com/privacy" },
    ],
    payment_mode: "live", // use "test" for development
  };

  const order = await window.openai.requestCheckout(session);
  return order;
}
ChatGPT opens a native checkout UI, collects payment info, and sends a complete_checkout tool call to your MCP server with payment data. Your server processes the payment through your PSP (Stripe, Adyen, etc.) and returns order confirmation.
Your MCP server needs a complete_checkout tool (shown here in Python decorator style rather than the Node API used earlier):
@tool(description="Completes a checkout session")
async def complete_checkout(
    self,
    checkout_session_id: str,
    buyer: Buyer,
    payment_data: PaymentData,
):
    # Process the payment with your PSP using the payment_data token
    # Create the order in your system
    return {
        "structuredContent": {
            "id": checkout_session_id,
            "status": "completed",
            "order": {
                "id": "order_123",
                "checkout_session_id": checkout_session_id,
            },
        }
    }
For testing, set payment_mode: "test" to use test cards without moving real money.
Payment Provider Integration
Stripe and Adyen have specific guides for integrating with ChatGPT's Instant Checkout:
- Stripe: docs.stripe.com/agentic-commerce/apps
- Adyen: docs.adyen.com/online-payments/agentic-commerce
You'll need to set up webhooks, configure your merchant ID, and handle the token-based payment flow these providers expect from ChatGPT.
Deploying for Production
During development, ngrok works fine. For production, you need stable hosting with proper HTTPS.
Hosting Options
Managed container platforms (Fly.io, Render, Railway) handle TLS automatically and work well with the Apps SDK. Cloud serverless options (Cloud Run, Azure Container Apps) also work but watch out for cold start times that can interrupt streaming responses.
Whatever you choose, make sure /mcp stays responsive and supports streaming HTTP responses. Add a health check endpoint at / for monitoring.
Before Going Live
Store secrets in environment variables. Rotate them periodically. Add structured logging with correlation IDs but redact any PII before writing to logs. Implement rate limiting on your end since ChatGPT will send traffic patterns you can't fully predict.
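For the rate limiting, a minimal in-memory token bucket is enough to start. The parameters here are arbitrary, and a production deployment behind multiple instances would need a shared store such as Redis instead:

```javascript
// Minimal per-client token bucket. capacity = burst size, refillPerSec =
// sustained request rate. In-memory only: state resets on restart and is
// not shared across instances.
function createRateLimiter({ capacity = 10, refillPerSec = 2 } = {}) {
  const buckets = new Map();
  return function allow(clientId, now = Date.now()) {
    const b = buckets.get(clientId) ?? { tokens: capacity, last: now };
    // Refill proportionally to elapsed time, capped at capacity.
    b.tokens = Math.min(capacity, b.tokens + ((now - b.last) / 1000) * refillPerSec);
    b.last = now;
    const ok = b.tokens >= 1;
    if (ok) b.tokens -= 1;
    buckets.set(clientId, b);
    return ok;
  };
}
```

In the HTTP handler you'd derive `clientId` from the validated token's subject (or the client IP for unauthenticated traffic) and return a 429 when `allow()` comes back false.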
For OAuth apps, use short-lived access tokens with refresh flows. Never log tokens.
Troubleshooting
Widget doesn't update after tool calls: Check that you're listening for openai:set_globals events. Also verify your server is returning data in structuredContent that matches what your widget expects.
502 errors when connecting: Your MCP endpoint isn't responding correctly. Check that /mcp handles POST, GET, and DELETE methods with proper CORS headers. The health check at / should return 200.
OAuth redirect fails: The most common issue is redirect URI mismatch. Make sure https://chatgpt.com/connector_platform_oauth_redirect is in your allowed redirects. Clock skew between servers can also cause token validation failures.
Tool not being invoked: ChatGPT decides when to call your tools based on the conversation. Make your tool descriptions clear and specific. Vague descriptions like "does stuff" won't help the model understand when to use your tool.
Changes not reflected after updating server: After modifying tools or metadata, go to Settings → Connectors, select your connector, and click Refresh. ChatGPT caches the tool definitions.
What's Next
This gets you from zero to a working ChatGPT app with optional auth and payment integration. The official examples repository at github.com/openai/openai-apps-sdk-examples has more complex patterns including React-based widgets and state management.
OpenAI's documentation at developers.openai.com/apps-sdk covers UI guidelines and security requirements you'll need to meet before submission. App submissions are expected to open later in 2025, with a dedicated app directory where users can discover apps.
PRO TIPS
Keep tool descriptions under 200 characters but make them specific enough that ChatGPT knows when to invoke them. "Searches for pizza restaurants nearby" beats "searches things."
Use _meta["openai/toolInvocation/invoking"] and _meta["openai/toolInvocation/invoked"] to customize the loading text ChatGPT shows while your tool runs.
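Applied to the show_dashboard tool from earlier, that _meta block would look like this (the status strings are yours to choose):

```javascript
// _meta block for a tool registration, customizing the loading text ChatGPT
// shows while the tool runs. The outputTemplate URI must match a registered
// ui:// resource.
const meta = {
  "openai/outputTemplate": "ui://widget/main.html",
  "openai/toolInvocation/invoking": "Loading your dashboard...",
  "openai/toolInvocation/invoked": "Dashboard ready",
};
```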
For widgets, version your asset filenames (like widget-v2.js) to bust caches during development. Otherwise you'll wonder why your changes aren't appearing.
Test on mobile early. The ChatGPT mobile apps support connectors, and widget layouts that work on desktop might break on smaller screens.
FAQ
Q: Can I use Python instead of Node.js for the MCP server? A: Yes. The official MCP SDK has Python support. Check OpenAI's examples repo for Python server implementations.
Q: Do I need a paid ChatGPT plan to develop apps? A: No. Developer mode works on Free, Plus, Go, and Pro plans. Business and Enterprise accounts need admin approval to enable it.
Q: When will public app submissions open? A: OpenAI says "later this year" (2025). Apps that meet their developer guidelines will be eligible for listing in a dedicated directory.
Q: How does ChatGPT decide when to suggest my app? A: ChatGPT considers conversation context and user intent. If someone asks about tasks and your app handles task management, it might suggest your app. Clear tool descriptions and good metadata help.
Q: Can my widget access localStorage?
A: No. The sandbox restricts browser storage APIs. Use window.openai.setWidgetState() to persist data across sessions instead.
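A small sketch of that pattern, with an in-memory fallback so the same code also runs outside the sandbox during local testing (the shape of the state object is entirely up to you):

```javascript
// Persist widget state via window.openai.setWidgetState when available,
// falling back to an in-memory object elsewhere (e.g. unit tests).
const memoryStore = { state: null };

async function saveState(state) {
  const api = globalThis.openai ?? globalThis.window?.openai;
  if (api?.setWidgetState) {
    await api.setWidgetState(state);
  } else {
    memoryStore.state = state;
  }
  return state;
}

function loadState() {
  const api = globalThis.openai ?? globalThis.window?.openai;
  return api?.widgetState ?? memoryStore.state;
}
```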
RESOURCES
- Apps SDK Documentation: Official docs covering all concepts, from MCP servers to UI guidelines
- Apps SDK Examples Repository: Working code examples including React widgets and various server patterns
- MCP Inspector: Test your MCP server locally before connecting to ChatGPT
- App Developer Guidelines: Requirements for app submission and listing