MCP (Model Context Protocol) is an open protocol that enables AI models to securely interact with local and remote resources.
A single configuration works across all clients that support MCP (such as Cline, Cursor, and Claude). OpenAI announced support for the MCP protocol in March 2025, so the feature is now broadly available on all mainstream AI platforms.
EdgeOne Pages Deploy MCP
EdgeOne Pages Deploy MCP is a dedicated service that allows you to quickly deploy HTML content to EdgeOne Pages and generate a public access link. This enables you to immediately preview and share AI-generated web content.
Configuration Method
In any client that supports MCP, the following two JSON configurations let you quickly integrate the Pages Deploy MCP Server.
Based on Standard Input/Output
Standard input/output (stdio) is how most MCP servers are implemented today. Add a few lines of configuration to any application that supports MCP and you get a fully optimized, fast web page deployment service: the AI publishes the finished web code to edge nodes and returns an accessible URL.
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "command": "npx",
      "args": ["edgeone-pages-mcp"]
    }
  }
}
Supported clients: Cursor, VSCode, Windsurf, ChatWise, Cherry Studio
Based on Streamable HTTP
Although stdio is convenient, it depends on a specific client environment and therefore carries potential security risks. Remote calls over Streamable HTTP are expected to become the dominant pattern for MCP servers.
With this approach, only the remote endpoint needs to be specified in the configuration.
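For example, clients such as Cursor accept a `url` field in place of a launch command for remote servers. A minimal sketch of such a configuration; the endpoint shown is a placeholder, not the actual service URL:

```json
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "url": "https://your-mcp-endpoint.example.com/mcp"
    }
  }
}
```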
The Pages MCP Server leverages serverless edge computing and KV storage. It receives HTML content through an API, automatically generates a publicly accessible link that takes effect immediately, delivers second-level static page deployment, and includes a built-in error-handling mechanism.
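Under the hood, the client's deploy request travels as an MCP `tools/call` JSON-RPC message. A sketch of what such a message might look like; the tool name (`deploy_html`) and argument field are assumptions for illustration, so check the server's advertised tool list for the actual names:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deploy_html",
    "arguments": {
      "value": "<html><body>Hello from EdgeOne Pages</body></html>"
    }
  }
}
```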
Build Your Own MCP Service
In summary, MCP lets the AI access more resources and call more tools during a conversation. Below are two quick-start examples for deploying your own MCP server.
Local MCP Server
You can deploy the MCP Geo geographic-location MCP template with one click, then enable it with a few lines of configuration in editors such as Cursor. When the AI needs the user's geographic location, it automatically retrieves it through the get_geo API and can then recommend nearby restaurants or attractions.
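A minimal sketch of what a get_geo-style handler could look like behind such a tool. The request shape (`eo.geo`) and field names here are assumptions modeled on typical edge runtimes, not the template's actual code:

```typescript
// Hypothetical geo payload attached to an edge request; field names are assumptions.
interface Geo {
  countryName?: string;
  cityName?: string;
  latitude?: number;
  longitude?: number;
}

// Minimal shape of an incoming edge request carrying geo data.
interface EdgeRequest {
  eo?: { geo?: Geo };
}

// Build the JSON response a get_geo tool could return to the model.
function getGeo(request: EdgeRequest): { status: number; body: string } {
  const geo = request.eo?.geo;
  if (!geo) {
    return { status: 404, body: JSON.stringify({ error: "geo unavailable" }) };
  }
  return {
    status: 200,
    body: JSON.stringify({
      country: geo.countryName ?? "unknown",
      city: geo.cityName ?? "unknown",
      latitude: geo.latitude,
      longitude: geo.longitude,
    }),
  };
}
```

The handler degrades gracefully: when the runtime supplies no geo data, the tool returns a 404 payload instead of throwing, so the AI can tell the user the location is unavailable.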
Remote MCP Server
After the second revision of the MCP specification was finalized, we promptly upgraded this dedicated Pages service to support Streamable HTTP. Visit https://mcp-on-edge.edgeone.app/ to try the web version of MCP.
You can also quickly deploy the MCP Client and MCP Server implemented on EdgeOne edge functions via the MCP on Edge template. Its environment variables API_KEY, BASE_URL, and MODEL follow OpenAI's API specification, so you can configure them exactly as you would for OpenAI's official API.
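These three variables map directly onto an OpenAI-compatible chat completions call. A sketch of how a client might assemble such a request from them (it only builds the request and does not send it; the helper name and env-object shape are illustrative, not the template's actual code):

```typescript
// OpenAI-compatible settings, normally read from environment variables.
interface OpenAICompatibleEnv {
  API_KEY: string;   // bearer token for the provider
  BASE_URL: string;  // e.g. "https://api.openai.com/v1"
  MODEL: string;     // model identifier understood by the provider
}

// Assemble a chat completions request in the OpenAI API shape.
function buildChatRequest(
  env: OpenAICompatibleEnv,
  userMessage: string
): { url: string; method: string; headers: Record<string, string>; body: string } {
  return {
    url: `${env.BASE_URL}/chat/completions`,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${env.API_KEY}`,
    },
    body: JSON.stringify({
      model: env.MODEL,
      messages: [{ role: "user", content: userMessage }],
    }),
  };
}
```

Because the request shape is the standard OpenAI one, pointing BASE_URL at any OpenAI-compatible provider and swapping API_KEY and MODEL is all the reconfiguration needed.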
The following table provides several references for obtaining an API_KEY.
MCP's technical direction fits well with the edge serverless architecture of Pages Functions. Its advantages in performance, scalability, and ease of use let developers enjoy the convenience of a global edge network without managing infrastructure. We will continue to track industry trends, follow the community's technical evolution, and enhance MCP-related capabilities to improve developer efficiency and experience.
For more details about Pages, see the other chapters of this documentation.