
Pages MCP

What Is MCP?

MCP (Model Context Protocol) is an open protocol that enables AI models to securely interact with local and remote resources.

You only need to configure it once on any client that supports MCP (such as Cline, Cursor, or Claude). OpenAI announced support for the MCP protocol in March 2025, so this capability is now widely usable across all mainstream AI platforms.
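Under the hood, MCP messages are JSON-RPC 2.0: a client invokes a server-side tool with a `tools/call` request. A minimal sketch of that message shape (the tool name and arguments here are illustrative, not the actual Pages tool contract):

```javascript
// Build a JSON-RPC 2.0 `tools/call` request, the message shape MCP
// clients use to invoke a tool exposed by an MCP server.
function makeToolCall(id, toolName, args) {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: toolName, arguments: args },
  });
}

// Example: ask a (hypothetical) deploy tool to publish a page.
const msg = makeToolCall(1, "deploy_html", { value: "<h1>Hello</h1>" });
```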


EdgeOne Pages Deploy MCP

EdgeOne Pages Deploy MCP is a dedicated service that allows you to quickly deploy HTML content to EdgeOne Pages and generate a public access link. This enables you to immediately preview and share AI-generated web content.

Configuration Method

In any client that supports MCP, you can use either of the following two JSON configurations to quickly integrate the Pages Deploy MCP Server.


Based on Standard Input/Output

This is currently the most common implementation method for MCP Servers. Add a few lines of simple configuration in an application that supports MCP to start a fully optimized fast web page deployment service, letting the AI publish finished web code to edge nodes and return an accessible URL.
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "command": "npx",
      "args": ["edgeone-pages-mcp"]
    }
  }
}
Supported clients: Cursor, VSCode, Windsurf, ChatWise, Cherry Studio


Based on Streamable HTTP

Although Stdio is convenient, it depends on a specific client environment and therefore carries potential security risks. Remote calls over Streamable HTTP are expected to become the future direction for MCP Servers.

As shown below, you only need to specify the remote endpoint.
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "url": "https://mcp-on-edge.edgeone.app/mcp-server"
    }
  }
}
Supported clients: ChatWise
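Over Streamable HTTP, each JSON-RPC message travels in the body of an HTTP POST to the configured endpoint. A sketch of the transport framing a client would produce (the request object here is built but not sent; `tools/list` is a standard MCP method):

```javascript
// Sketch: the HTTP request an MCP client sends over Streamable HTTP.
// Nothing EdgeOne-specific here; this only illustrates the framing.
function buildMcpRequest(endpoint, rpcMessage) {
  return {
    url: endpoint,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Streamable HTTP servers may answer with JSON or an SSE stream.
      "Accept": "application/json, text/event-stream",
    },
    body: JSON.stringify(rpcMessage),
  };
}

const req = buildMcpRequest("https://mcp-on-edge.edgeone.app/mcp-server", {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
});
```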


Technical Principles

Pages MCP Server leverages serverless edge computing and KV storage. It receives HTML content through an API, automatically generates a publicly accessible link that takes effect immediately, achieves second-level static page deployment, and includes a built-in error handling mechanism.
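The flow described above can be sketched as: store the received HTML under a random key in KV, then return a link derived from that key. The KV stand-in, URL scheme, and function names below are illustrative assumptions, not the actual implementation:

```javascript
// In-memory stand-in for a KV namespace; a real Pages Function would
// use a bound KV store with the same put/get pattern.
const kv = new Map();

// Generate a short random slug to serve as the page's public key.
function randomSlug(len = 8) {
  const chars = "abcdefghijklmnopqrstuvwxyz0123456789";
  let s = "";
  for (let i = 0; i < len; i++) {
    s += chars[Math.floor(Math.random() * chars.length)];
  }
  return s;
}

// Receive HTML, persist it, and return a public URL; a second handler
// could serve the page by reading the same key back out of KV.
function deployHtml(html) {
  if (!html || typeof html !== "string") {
    throw new Error("HTML content is required"); // built-in error handling
  }
  const slug = randomSlug();
  kv.set(slug, html);
  return `https://example.edgeone.app/${slug}`; // hypothetical URL scheme
}

const url = deployHtml("<h1>Hello</h1>");
```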



Build Your Own MCP Service

In summary, MCP lets the AI access more resources and call more tools during a conversation. Below are two quick-start examples for deploying your own MCP Server.


Local MCP Server

You can deploy the MCP Geo geolocation MCP template with one click, then enable it with a few lines of simple configuration in an editor such as Cursor. When the AI needs the user's geographic location, it can fetch it automatically through the get_geo API and then, for example, recommend nearby restaurants or attractions.
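On the server side, such a geolocation tool boils down to a handler that returns location fields for the AI to use. A minimal self-contained sketch (the handler shape and field names are illustrative; the real template would read them from the incoming request's geolocation metadata rather than a parameter):

```javascript
// Illustrative tool handler: EdgeOne edge functions expose the
// caller's geolocation on the request; here it is passed in directly
// so the sketch stays self-contained and testable.
function getGeo(requestGeo) {
  return {
    country: requestGeo.country ?? "unknown",
    city: requestGeo.city ?? "unknown",
  };
}

const geo = getGeo({ country: "SG", city: "Singapore" });
```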


Remote MCP Server

After the second revision of the MCP protocol was finalized, we promptly upgraded this dedicated Pages service to support Streamable HTTP. Visit https://mcp-on-edge.edgeone.app/ to try the web version of MCP.

You can also quickly deploy an MCP Client and MCP Server implemented on EdgeOne edge functions via the MCP on Edge template. Its environment variables API_KEY, BASE_URL, and MODEL follow OpenAI's API specification, which means you can configure them exactly as you would for OpenAI's official API.

The following table provides several references for obtaining an API_KEY.
| Platform | API_KEY | BASE_URL | MODEL (example) |
| --- | --- | --- | --- |
| OpenRouter | https://openrouter.ai/settings/keys | https://openrouter.ai/api/v1 | deepseek/deepseek-r1 |
| DeepInfra | https://deepinfra.com/dash/api_keys | https://api.deepinfra.com/v1/openai | deepseek-ai/DeepSeek-R1 |
| DeepSeek | https://platform.deepseek.com/api_keys | https://api.deepseek.com/v1 | deepseek-reasoner |
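Because the three variables follow OpenAI's API convention, a client simply combines them into a chat-completion request. A sketch of how they would be consumed (the payload shape follows the OpenAI chat completions format; the key value is a placeholder):

```javascript
// Build an OpenAI-compatible chat completion request from the three
// environment variables the template expects.
function buildChatRequest(env, userMessage) {
  return {
    url: `${env.BASE_URL}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${env.API_KEY}`,
    },
    body: JSON.stringify({
      model: env.MODEL,
      messages: [{ role: "user", content: userMessage }],
    }),
  };
}

const req = buildChatRequest(
  {
    BASE_URL: "https://api.deepseek.com/v1",
    API_KEY: "sk-placeholder", // substitute your real key
    MODEL: "deepseek-reasoner",
  },
  "Hello"
);
```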



Why Use EdgeOne Pages?

MCP's technical trajectory is a natural fit for the edge serverless architecture of Pages Functions. Its advantages in performance, scalability, and ease of use let developers enjoy the convenience of a global edge network without managing infrastructure. We will keep following industry trends and the community's technical direction, continuously enhancing MCP-related capabilities to improve developer efficiency and experience.

For more details about Pages, see the other chapters of this documentation.