The Rise of AI-Readable Web Services
When AI assistants like ChatGPT gained the ability to call external services, a new challenge emerged: how does an AI agent know what a web service can do, how to call it, and what legal or usage constraints apply? The answer was the AI plugin manifest — a machine-readable JSON file that describes a service's capabilities to AI systems.
While the original ChatGPT plugin ecosystem has evolved, the underlying pattern — publishing a structured capability declaration at a well-known path — is becoming a foundational convention for the agentic web.
What Is an AI Plugin Manifest?
An AI plugin manifest is a JSON file, typically served at /.well-known/ai-plugin.json, that tells an AI system:
- What the service is and what it does (in natural language)
- Where to find the API specification (usually an OpenAPI/Swagger document)
- Authentication requirements
- Legal and contact information
- Logos and display metadata for the AI's UI
The AI agent reads this file first, then uses the linked API spec to understand the exact endpoints, parameters, and response shapes it can work with.
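That two-step discovery flow can be sketched in Python. This is a minimal sketch: the urllib-based fetch and the function names are illustrative, not part of any manifest specification.

```python
import json
import urllib.request
from urllib.parse import urljoin

WELL_KNOWN_PATH = "/.well-known/ai-plugin.json"

def manifest_url(base_url: str) -> str:
    """Build the conventional well-known manifest URL for a service."""
    return urljoin(base_url, WELL_KNOWN_PATH)

def discover(base_url: str) -> tuple[dict, str]:
    """Fetch the manifest, then return it along with the OpenAPI spec URL
    that the agent would read next to learn endpoints and parameters."""
    with urllib.request.urlopen(manifest_url(base_url)) as resp:
        manifest = json.load(resp)
    return manifest, manifest["api"]["url"]
```

In practice an agent caches the manifest and only re-fetches the spec when the manifest's api.url changes.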
Anatomy of an ai-plugin.json File
Here is a representative example of the manifest structure:
```json
{
  "schema_version": "v1",
  "name_for_human": "Weather Lookup",
  "name_for_model": "weather_lookup",
  "description_for_human": "Get current weather and forecasts for any city.",
  "description_for_model": "Use this plugin to retrieve weather data. Call get_weather with a city name to get temperature, humidity, and a 5-day forecast.",
  "auth": {
    "type": "none"
  },
  "api": {
    "type": "openapi",
    "url": "https://api.example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/terms"
}
```
Key Fields Explained
| Field | Purpose |
|---|---|
| name_for_model | The programmatic name the AI uses to reference the plugin internally |
| description_for_model | A prompt-style description that guides the AI on when and how to use the plugin |
| auth.type | Authentication method: none, service_http, oauth, or user_http |
| api.url | URL of the OpenAPI specification the AI reads to learn all available endpoints |
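Before publishing, it is worth checking that a manifest carries every field the example above uses. A small validator along these lines can catch omissions early — note the required-field list here mirrors the example manifest, not an authoritative schema.

```python
# Top-level fields taken from the example manifest in this article;
# not an official schema definition.
REQUIRED_FIELDS = {
    "schema_version", "name_for_human", "name_for_model",
    "description_for_human", "description_for_model",
    "auth", "api", "logo_url", "contact_email", "legal_info_url",
}

def missing_fields(manifest: dict) -> set[str]:
    """Return the set of expected top-level fields absent from a manifest."""
    return REQUIRED_FIELDS - manifest.keys()

# Example: a partial manifest missing contact and legal info
partial = {
    "schema_version": "v1",
    "name_for_model": "weather_lookup",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://api.example.com/openapi.yaml"},
}
```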
Authentication Options
The manifest supports several authentication patterns, suited to different service types:
- None: Open APIs with no authentication required.
- Service HTTP: A single bearer token, issued by the service to the AI platform and shared across all end-users — suitable when per-user identity is not needed, such as services where the AI platform manages access or billing.
- User HTTP: Each end-user provides their own API key.
- OAuth: Full OAuth 2.0 flow, allowing the plugin to act on behalf of authenticated users.
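As a sketch of how these options look in the manifest — the field names below follow the original OpenAI plugin schema, and the URLs and token value are placeholders:

```json
{ "auth": { "type": "none" } }

{ "auth": { "type": "user_http", "authorization_type": "bearer" } }

{ "auth": {
    "type": "service_http",
    "authorization_type": "bearer",
    "verification_tokens": { "openai": "<token issued during registration>" }
} }

{ "auth": {
    "type": "oauth",
    "client_url": "https://example.com/oauth/authorize",
    "authorization_url": "https://example.com/oauth/token",
    "scope": "read",
    "authorization_content_type": "application/json"
} }
```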
The description_for_model Field: Why It Matters
This field is, arguably, the most important in the entire manifest. Unlike description_for_human, which is marketing copy, description_for_model is injected directly into the AI's context window. It acts as a system prompt that shapes how the AI decides to use your service. Best practices include:
- Clearly state what problems the plugin solves and when to use it.
- List any important constraints (rate limits, required parameters, geographic restrictions).
- Use imperative, instruction-style language: "Use this plugin when the user asks about X."
- Keep it concise — every token in this field costs inference budget.
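Applied to the weather example, a description following these guidelines might read as follows (illustrative wording, not from any real plugin):

```json
"description_for_model": "Use this plugin whenever the user asks about current weather, temperature, humidity, or forecasts for a specific city. Always pass the city name to get_weather. Do not use it for historical climate data. Limited to 60 requests per minute."
```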
Beyond ChatGPT: The Broader Agent Ecosystem
While ai-plugin.json originated with OpenAI's plugin system, the concept of capability manifests at well-known paths is spreading. Emerging standards like the Model Context Protocol (MCP) and various agent framework specifications are building on this pattern. If you operate a public API today, publishing a manifest — even a simple one — positions your service for discoverability in an increasingly AI-driven web.
Getting Started
To expose your service to AI agents, you need: a valid ai-plugin.json at /.well-known/ai-plugin.json, a publicly accessible OpenAPI spec, and CORS headers set to allow cross-origin reads. Start simple, then iterate based on how AI systems interact with your endpoints.
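All three requirements can be met with a few lines of the Python standard library. This is a sketch for local experimentation: the manifest content, the port, and the wildcard CORS policy are illustrative assumptions, and a production service would serve the file from its existing web stack.

```python
import http.server
import json

# Illustrative manifest content; real values would come from your service.
MANIFEST = {
    "schema_version": "v1",
    "name_for_model": "weather_lookup",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://api.example.com/openapi.yaml"},
}

def response_headers() -> dict:
    # CORS header so AI platforms can read the manifest cross-origin
    return {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*",
    }

class ManifestHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/.well-known/ai-plugin.json":
            self.send_error(404)
            return
        body = json.dumps(MANIFEST).encode()
        self.send_response(200)
        for name, value in response_headers().items():
            self.send_header(name, value)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve locally:
# http.server.HTTPServer(("", 8080), ManifestHandler).serve_forever()
```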