MCP (Model Context Protocol) is an open protocol that enables AI models to interact securely with local and remote resources.
You only need to perform a one-time, unified configuration in clients that support MCP (such as Cline, Cursor, and Claude). OpenAI announced support for the MCP protocol in March 2025, so the feature is now broadly available across mainstream AI platforms.
EdgeOne Pages Deploy MCP
EdgeOne Pages Deploy MCP is a dedicated service that can quickly deploy HTML content to EdgeOne Pages and generate a public access link. This enables you to immediately preview and share AI-generated web content.
After the second revision of the MCP protocol was finalized, we promptly upgraded this dedicated Pages service to support Streamable HTTP. You can try the web version of MCP at https://mcp-on-edge.edgeone.app/, an example MCP Client and MCP Server implementation built on EdgeOne edge functions. It lets you quickly deploy AI-generated content to the web and generate an access link in one click.
Configuration Method
In any MCP-supported client, you can use the following two JSON configuration methods to quickly integrate Pages Deploy MCP Server.
Based on Standard Input/Output
This is also how most MCP Servers are implemented today. Add a few lines of simple configuration in an application that supports MCP, and you get a fully optimized, fast web-page deployment service: the AI publishes finished web page code to edge nodes and returns an accessible URL.
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "command": "npx",
      "args": ["edgeone-pages-mcp"]
    }
  }
}
Supported clients: Cursor, VSCode, Windsurf, ChatWise, Cherry Studio
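Under the hood, the stdio transport exchanges newline-delimited JSON-RPC 2.0 messages between the client and the spawned server process. A minimal sketch of that framing, assuming a hypothetical client name and using the protocol revision string that introduced Streamable HTTP:

```python
import json

def frame_message(msg: dict) -> bytes:
    """MCP's stdio transport sends one JSON-RPC message per line."""
    return (json.dumps(msg) + "\n").encode("utf-8")

# The first message a client writes to the server's stdin after spawning it.
# "demo-client" is a placeholder name for illustration.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "demo-client", "version": "0.1.0"},
    },
}

frame = frame_message(initialize)
```

The client then reads the server's responses line by line from the process's stdout; the configuration above simply tells the client which command to spawn.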
Based on Streamable HTTP
Although stdio is convenient, its barrier to entry is not low: it requires a local runtime such as Node.js, Python, or Docker, and running local commands carries certain security risks. Remote invocation over Streamable HTTP is therefore gradually becoming the future direction for MCP Servers.
The Pages MCP Server leverages serverless edge computing and KV storage. It receives HTML content through an API and automatically generates a publicly accessible link that takes effect immediately, achieving second-level static page deployment with a built-in error-handling mechanism.
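With this transport, a client configuration only needs the server's URL instead of a local command. A hypothetical example of the same config shape as above (the URL is a placeholder; use the endpoint your deployment actually exposes):

```json
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "url": "https://example.com/mcp-server"
    }
  }
}
```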
Build Your Own MCP Service
In summary, MCP lets AI access more resources and call more tools during a conversation. Below, two quick-start examples show how to deploy your own MCP Server.
Provide a Local MCP Server Based on Geographical Location
You can deploy the MCP Geo geolocation template in one click. When the AI needs the user's location, it can obtain it automatically through the get_geo API and then, for example, recommend nearby restaurants or attractions.
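The shape of such a tool is simple: it reads whatever geolocation information the edge runtime attaches to the incoming request and returns it as structured data. A minimal sketch, where the header names are assumptions for illustration rather than EdgeOne's actual API:

```python
# Hypothetical get_geo-style tool handler. The "x-geo-*" header names
# are placeholders; the real edge runtime exposes geolocation its own way.
def get_geo(headers: dict) -> dict:
    return {
        "country": headers.get("x-geo-country", "unknown"),
        "city": headers.get("x-geo-city", "unknown"),
    }
```

The AI can then feed the returned country and city into a follow-up recommendation.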
Quickly Deploy a Remote MCP Server for Web Pages
For a remote MCP Server, you can deploy your own in one click using the MCP on Edge template. Its environment variables API_KEY, BASE_URL, and MODEL are compatible with OpenAI's API specification, which means you can configure them directly following OpenAI's official usage.
The following table provides several references for obtaining the API_KEY.
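Because these variables follow OpenAI's API specification, wiring them into a chat-completions request is mechanical. A sketch with placeholder values (it only assembles the request; nothing is sent over the network):

```python
import os

# Placeholder values standing in for your real configuration.
os.environ.setdefault("API_KEY", "sk-placeholder")
os.environ.setdefault("BASE_URL", "https://api.openai.com/v1")
os.environ.setdefault("MODEL", "gpt-4o-mini")

def build_request(prompt: str):
    """Assemble an OpenAI-compatible chat-completions request from the env vars."""
    url = os.environ["BASE_URL"].rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ['API_KEY']}",
        "Content-Type": "application/json",
    }
    body = {
        "model": os.environ["MODEL"],
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body
```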
After extensive discussion of remote MCP Servers in the MCP community, the second revision of the MCP protocol has been finalized. Transport no longer depends strongly on SSE; exposing a single HTTP endpoint is enough to complete communication. Although only a limited number of clients support it today, as the industry adopts the Streamable HTTP MCP Server, more diverse usage scenarios and a more convenient user experience will follow.
The MCP technology trend fits well with the edge serverless architecture of Pages Functions: its performance, scalability, and ease of use let developers enjoy the convenience of a global edge network without managing infrastructure. We will keep tracking industry trends and the community's technical direction, continue to strengthen MCP-related capabilities, and help developers improve efficiency and the development experience.
For more details about Pages, see the other chapters of this documentation.