What Is ai-plugin.json?
As AI agents become more capable of browsing the web and interacting with services autonomously, a new standard has emerged for describing how those agents should interact with a website: the ai-plugin.json manifest. Originally popularized by the ChatGPT plugin ecosystem, this specification defines a machine-readable file that tells AI systems what a service does, how to authenticate with it, and where to find its API definition.
It typically lives at /.well-known/ai-plugin.json on a domain, following the .well-known pattern for discoverability.
Anatomy of an ai-plugin.json File
A minimal but complete ai-plugin.json looks like this:
```json
{
  "schema_version": "v1",
  "name_for_human": "My Service",
  "name_for_model": "my_service",
  "description_for_human": "Access and manage data from My Service.",
  "description_for_model": "Use this plugin to query and update records in My Service. Supports filtering, pagination, and bulk operations.",
  "auth": {
    "type": "none"
  },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```
Key Fields Explained
| Field | Purpose | Notes |
|---|---|---|
| `name_for_human` | Display name shown to users | Keep it short and recognizable |
| `name_for_model` | Identifier used in AI prompts | Lowercase, no spaces — used in function calls |
| `description_for_model` | Guides the AI on when/how to use the plugin | Most critical field — be specific |
| `auth` | Authentication method | Options: `none`, `user_http`, `service_http`, `oauth` |
| `api.url` | Points to the OpenAPI spec | Must be a valid, accessible OpenAPI 3.x document |
Authentication Options
The auth field supports several configurations depending on your security model:
- None — Public API, no authentication required
- Service HTTP — The AI platform uses a shared API key you provide
- User HTTP — Each end user provides their own API key
- OAuth — Full OAuth 2.0 flow, allowing per-user authorization with scoped tokens
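As an illustration, an OAuth configuration might look like the following. The field names here follow the original ChatGPT plugin manifest format, and the URLs and token value are placeholders, not a definitive schema:

```json
{
  "auth": {
    "type": "oauth",
    "client_url": "https://example.com/oauth/authorize",
    "scope": "read write",
    "authorization_url": "https://example.com/oauth/token",
    "authorization_content_type": "application/json",
    "verification_tokens": {
      "openai": "REPLACE_WITH_VERIFICATION_TOKEN"
    }
  }
}
```

The `service_http` and `user_http` variants are simpler: they declare an `authorization_type` (typically bearer) and, for `service_http`, rely on a key held by the platform rather than a per-user flow.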
For production services handling user data, OAuth is strongly recommended. It gives users visibility and control over what the AI agent can access on their behalf.
Writing Effective Model Descriptions
The description_for_model field is where most developers underinvest. This text is injected directly into the AI's context window and heavily influences whether the agent uses your plugin correctly. Good practices include:
- Be explicit about use cases — "Use this when the user asks about X or Y"
- Describe limitations — "Does not support real-time data; results may be up to 1 hour old"
- Clarify data formats — "Dates must be in ISO 8601 format"
- Explain side effects — "The DELETE endpoint permanently removes records"
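Putting those practices together, a `description_for_model` for the hypothetical My Service manifest above might read:

```json
{
  "description_for_model": "Use this plugin when the user asks to look up, filter, or update records in My Service. Does not support real-time data; results may be up to 1 hour old. All dates must be ISO 8601 strings. The delete operation permanently removes records, so confirm with the user before calling it."
}
```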
The OpenAPI Connection
Your ai-plugin.json points to an OpenAPI specification, which is where the actual API endpoints are defined. The AI uses this schema to understand what parameters each endpoint accepts, what it returns, and how to construct requests. Well-documented OpenAPI specs with clear summary and description fields on each operation translate directly into better AI behavior.
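For example, a single operation in the openapi.yaml referenced by the manifest above might be documented like this. This is a sketch: the `/records` path and `filter` parameter are invented for illustration, and the point is the `summary` and `description` fields the AI will read:

```yaml
openapi: 3.0.3
info:
  title: My Service API
  version: "1.0"
paths:
  /records:
    get:
      operationId: listRecords
      summary: List records
      description: >-
        Returns records matching the optional filter. Results are
        paginated; pass the cursor from a previous response to
        fetch the next page.
      parameters:
        - name: filter
          in: query
          required: false
          schema:
            type: string
      responses:
        "200":
          description: A page of matching records
```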
Beyond ChatGPT: The Broader Ecosystem
While the ai-plugin.json format was popularized by one platform, the underlying concept — a machine-readable service manifest at a well-known location — is gaining traction across the AI agent ecosystem. Frameworks like LangChain, AutoGPT, and various emerging agent runtimes look for similar discovery patterns. Investing in a well-structured plugin manifest positions your service to be accessible to the broadening landscape of AI agents as standards continue to evolve.
Getting Started
- Create your OpenAPI 3.x specification first — it's the foundation
- Write the ai-plugin.json manifest referencing your spec
- Host both at accessible HTTPS URLs
- Place ai-plugin.json at /.well-known/ai-plugin.json
- Test with an AI platform that supports plugin discovery
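Before testing against a live platform, it can help to sanity-check the manifest locally. The sketch below (a minimal example, not an official validator; the required-field list simply mirrors the example manifest above) parses a manifest and reports obvious problems:

```python
import json

# Fields the example manifest above treats as essential.
REQUIRED_FIELDS = [
    "schema_version",
    "name_for_human",
    "name_for_model",
    "description_for_human",
    "description_for_model",
    "auth",
    "api",
]


def validate_manifest(raw: str) -> list[str]:
    """Return a list of problems found in the manifest; empty means it passed."""
    problems = []
    manifest = json.loads(raw)
    for field in REQUIRED_FIELDS:
        if field not in manifest:
            problems.append(f"missing field: {field}")
    api = manifest.get("api", {})
    if api.get("type") == "openapi" and not api.get("url", "").startswith("https://"):
        problems.append("api.url should be an accessible HTTPS URL")
    return problems


if __name__ == "__main__":
    sample = json.dumps({
        "schema_version": "v1",
        "name_for_human": "My Service",
        "name_for_model": "my_service",
        "description_for_human": "Access and manage data from My Service.",
        "description_for_model": "Query and update records in My Service.",
        "auth": {"type": "none"},
        "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
    })
    print(validate_manifest(sample))  # an empty list means the manifest passed
```

Running the script against the example manifest prints an empty list; dropping a required field or pointing `api.url` at a plain-HTTP URL surfaces a corresponding message.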