MCP (Model Context Protocol) is an open protocol that enables AI models to securely interact with local and remote resources.
Simply add a unified configuration in MCP-supported clients such as Cline, Cursor, and Claude. OpenAI announced support for the MCP protocol in March 2025, and MCP is now available across most mainstream AI platforms.
EdgeOne Pages Deploy MCP
EdgeOne Pages Deploy MCP is a dedicated service that quickly deploys static web content to EdgeOne Pages and generates a public access link. This allows you to immediately preview and share AI-generated web content or project build artifacts.
Deploy a single HTML file
Deploy a folder or ZIP package
Configuration Method
In any MCP-supported client, you can use either of the following two JSON configurations to quickly integrate the Pages Deploy MCP Server.
Based on Stdio (Standard Input/Output)
This is how most MCP Servers are currently implemented. Add a few lines of simple configuration in an MCP-supported application to start a fully functional web deployment service, allowing the AI to publish finished web code to edge nodes and return an accessible URL.
Note: When the AI calls the Pages Deploy MCP Server to deploy a single HTML file, only a temporary link is generated. To associate the deployment with a Pages project, you must tell the AI to deploy a folder or ZIP package instead.
For how to obtain EDGEONE_PAGES_API_TOKEN, see the API Token documentation.
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "command": "npx",
      "args": ["edgeone-pages-mcp"],
      "env": {
        // Optional. API Token required when deploying a folder or ZIP package
        "EDGEONE_PAGES_API_TOKEN": "",
        // Optional. Leave blank to create a new Pages project, or fill in an existing project name to update that project
        "EDGEONE_PAGES_PROJECT_NAME": ""
      }
    }
  }
}
Supported clients: Cursor, VSCode, Windsurf, ChatWise, Cherry Studio, and others.
Based on Streamable HTTP
Stdio is convenient, but because it depends on the local client environment, it also carries potential security risks. Remote calls via Streamable HTTP are therefore the direction MCP Servers are heading.
As shown below, you only need to specify the remote endpoint.
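A minimal client configuration for the remote mode might look like the following; the endpoint URL here is a placeholder, so substitute the actual Streamable HTTP address of the service you are connecting to:
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      // Placeholder. Replace with the actual Streamable HTTP endpoint address
      "url": "https://your-mcp-endpoint.example.com/mcp-server"
    }
  }
}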
Note: The Streamable HTTP method does not support deployment of folders or ZIP packages.
Deploying a Service Based on Streamable HTTP
We have also open-sourced the Streamable HTTP service above. Using the Self-Hosted Pages MCP template, you can quickly create an MCP Server with deployment capability, enabling AI to deploy static web pages under your own Pages project.
Prerequisites:
1. Configure KV storage: used to store the HTML content. The variable name of the bound KV namespace must be my_kv; redeploy the project after binding. For more ways to use KV, see KV storage.
2. Bind a custom domain: obtain a dedicated access address. For details, see custom domain name.
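To make the KV prerequisite concrete, here is a minimal sketch of an edge function that stores submitted HTML in the bound my_kv namespace and serves it back by key. The onRequest signature and the get/put methods on the binding are assumptions based on typical edge-function runtimes; consult the Functions and KV storage documentation for the authoritative API. The in-memory stub only stands in for the injected binding so the handler can be exercised locally.

```javascript
// In-memory stand-in for the bound KV namespace so the handler can be
// run locally; on EdgeOne Pages, `my_kv` is injected by the runtime.
const my_kv = {
  store: new Map(),
  async put(key, value) { this.store.set(key, value); },
  async get(key) { return this.store.has(key) ? this.store.get(key) : null; },
};

// Assumed Pages Functions entry point: POST stores HTML under the URL
// path as the key, GET serves it back out of KV. (On EdgeOne this would
// be `export async function onRequest(...)` in a functions file.)
async function onRequest({ request }) {
  const url = new URL(request.url);
  const key = url.pathname.slice(1) || "index";

  if (request.method === "POST") {
    const html = await request.text();
    await my_kv.put(key, html);
    // Respond with the public URL where the page is now reachable.
    return new Response(JSON.stringify({ url: `${url.origin}/${key}` }), {
      headers: { "Content-Type": "application/json" },
    });
  }

  const html = await my_kv.get(key);
  if (html === null) return new Response("Not found", { status: 404 });
  return new Response(html, { headers: { "Content-Type": "text/html" } });
}
```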
After deployment, add the following configuration to your MCP configuration file:
{
  "mcpServers": {
    "edgeone-pages": {
      "url": "https://yourcustomdomainname/mcp-server"
    }
  }
}
You can use natural language to have the AI deploy HTML content to Pages for you, or call the API directly.
Pages MCP Server leverages serverless edge computing and KV storage to receive HTML content through an API and automatically generate a public access URL that takes effect immediately, enabling static page deployment in seconds with a built-in error handling mechanism.
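Under the hood, an MCP client talks to a Streamable HTTP endpoint using JSON-RPC 2.0 messages. The sketch below builds the tools/call request such a client would POST to the endpoint; the deploy_html tool name and its html argument are illustrative assumptions, since the actual tool schema is whatever the server advertises via tools/list.

```javascript
// Build a JSON-RPC 2.0 `tools/call` request as defined by the MCP protocol.
// The tool name ("deploy_html") and its arguments are illustrative; the
// real schema is whatever the Pages MCP Server advertises via tools/list.
function buildDeployCall(html, id = 1) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: {
      name: "deploy_html",
      arguments: { html },
    },
  };
}

// An MCP client would POST this body to the configured endpoint, e.g.:
// fetch("https://yourcustomdomainname/mcp-server", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildDeployCall("<h1>Hello</h1>")),
// });
```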
Building Your Own MCP Service
In summary, MCP enables AI to access more resources and call more tools during a conversation. The following two examples help you quickly start deploying your own MCP Server.
Local MCP Server
You can use the MCP Geo geolocation MCP template for one-click deployment, then add a few lines of configuration in editors such as Cursor to enable it. When the AI needs the user's geographic location, it can obtain it automatically through the get_geo API and then recommend nearby restaurants or attractions.
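Stripped of SDK conveniences, a local stdio MCP Server is essentially a process that reads JSON-RPC 2.0 messages from stdin and writes responses to stdout. The dependency-free sketch below shows the dispatch logic a get_geo-style tool sits behind; the tool name and response payload are illustrative, not the template's actual implementation.

```javascript
// Minimal JSON-RPC 2.0 dispatcher sketching the message flow behind a
// stdio MCP Server. Real servers should be built on the official MCP SDK;
// this only illustrates the tools/list and tools/call exchange.
function handleMessage(msg) {
  if (msg.method === "tools/list") {
    return {
      jsonrpc: "2.0",
      id: msg.id,
      result: {
        tools: [{
          name: "get_geo",
          description: "Return the caller's approximate location",
          inputSchema: { type: "object", properties: {} },
        }],
      },
    };
  }
  if (msg.method === "tools/call" && msg.params && msg.params.name === "get_geo") {
    // Illustrative static payload; a real implementation would resolve
    // the caller's location, e.g. from request geo headers at the edge.
    return {
      jsonrpc: "2.0",
      id: msg.id,
      result: { content: [{ type: "text", text: JSON.stringify({ city: "unknown" }) }] },
    };
  }
  return { jsonrpc: "2.0", id: msg.id, error: { code: -32601, message: "Method not found" } };
}

// A stdio transport would read one JSON message per line from stdin and
// write JSON.stringify(handleMessage(msg)) back to stdout.
```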
Remote MCP Server
After the second revision of the MCP specification was finalized, we promptly upgraded Pages' dedicated service to support Streamable HTTP. Visit https://mcp-on-edge.edgeone.app/ to experience the web version of MCP.
You can also use the MCP on Edge template to quickly deploy an MCP Client and MCP Server implemented on EdgeOne edge functions. The environment variables API_KEY, BASE_URL, and MODEL are fully compatible with OpenAI's API specification, so you can configure them exactly as you would for OpenAI's official API.
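As an illustration, when pointing at OpenAI's own endpoint the three variables might be set as follows. The key is a placeholder and the model name is only an example; any OpenAI-compatible provider works by swapping BASE_URL and MODEL accordingly.

```shell
# Illustrative values only. API_KEY is a placeholder for your provider's key.
API_KEY="sk-..."
BASE_URL="https://api.openai.com/v1"
MODEL="gpt-4o-mini"
```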
The following table provides references for obtaining an API_KEY.
MCP's technology trend aligns well with the edge serverless architecture of Pages Functions. Its advantages in performance, scalability, and ease of use let developers enjoy the convenience of the global edge network without managing infrastructure. We will continue to track industry updates, combine them with the evolution of community technology, and keep enhancing MCP capabilities to help developers improve efficiency and development experience.
For more details about Pages, see the other chapters of the documentation.