
Pages MCP

What Is MCP

MCP (Model Context Protocol) is an open protocol that enables AI models to securely interact with local and remote resources.

You only need to add a unified configuration in MCP-supported clients such as Cline, Cursor, and Claude Desktop. OpenAI announced support for the MCP protocol in March 2025, and the protocol is now widely adopted across mainstream AI platforms.


EdgeOne Pages Deploy MCP

EdgeOne Pages Deploy MCP is a dedicated service that quickly deploys static Web assets to EdgeOne Pages and generates a public access link, letting you immediately preview and share AI-generated Web content or project build artifacts.

Deploy a single HTML file




Deploy a folder or ZIP package



Configuration Method

In any MCP-supported client, you can use either of the following two JSON configurations to quickly integrate the Pages Deploy MCP Server.


Based on Stdio (Standard Input/Output)

This is how most MCP Servers are currently implemented. Just add a few lines of simple configuration in an MCP-supported application to start a fully managed web deployment service, allowing AI to publish finished web code to edge nodes and return an accessible URL.
Note:
When AI calls the Pages Deploy MCP Server to deploy a single HTML file, only a temporary link is generated. To associate the deployment with a Pages project, you must tell the AI to deploy a folder or ZIP package.
To obtain EDGEONE_PAGES_API_TOKEN, see the API Token documentation.
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "command": "npx",
      "args": ["edgeone-pages-mcp"],
      "env": {
        // Optional. An API Token is required when deploying a folder or ZIP package
        "EDGEONE_PAGES_API_TOKEN": "",
        // Optional. Leave blank to create a new Pages project, or enter an existing project name to update it
        "EDGEONE_PAGES_PROJECT_NAME": ""
      }
    }
  }
}
Supported clients: Cursor, VSCode, Windsurf, ChatWise, Cherry Studio, and others.


Based on Streamable HTTP

Stdio is convenient, but because it depends on a specific client and runs locally, it carries potential security risks. Remote calls via Streamable HTTP are therefore expected to become the future direction for MCP Servers.

As shown below, you only need to specify the remote endpoint.
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "url": "https://mcp-on-edge.edgeone.app/mcp-server"
    }
  }
}
Supported clients: ChatWise
Note:
The Streamable HTTP method does not support deploying folders or ZIP packages.


Deploying a Service Based on Streamable HTTP

In addition, we have open-sourced the above Streamable HTTP service. With the Self-Hosted Pages MCP template, you can quickly create an MCP Server with a deployment service, enabling AI to deploy static web pages under your own Pages project.
Prerequisites:
1. Configure KV storage: used to store HTML content. The variable name of the bound KV namespace must be my_kv; redeploy the project after binding. For more ways to use KV, see KV Storage.
2. Bind a custom domain: this gives you a dedicated access address. For details, see Custom Domain.

After deployment, add the following configurations in your MCP Server configuration file:
{
  "mcpServers": {
    "edgeone-pages": {
      "url": "https://your-custom-domain-name/mcp-server"
    }
  }
}

You can use natural language to ask AI to deploy HTML content to Pages, or call the API directly, for example:
curl -X POST https://your-custom-domain-name/kv/set \
  -H "Content-Type: application/json" \
  -d '{"value": "<html><body><h1>Hello, World!</h1></body></html>"}'
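For reference, the same /kv/set call can be prepared programmatically. The sketch below only builds the request object (the endpoint path and payload shape come from the curl example above; buildKvSetRequest is an illustrative helper name, not part of the service):

```typescript
// Build the /kv/set request used by the self-hosted Pages MCP service.
// The payload shape ({"value": "<html>..."}) matches the curl example;
// buildKvSetRequest is an illustrative name, not part of the service API.
function buildKvSetRequest(baseUrl: string, html: string) {
  return {
    url: `${baseUrl}/kv/set`,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ value: html }),
  };
}

const req = buildKvSetRequest(
  "https://your-custom-domain-name",
  "<html><body><h1>Hello, World!</h1></body></html>"
);
// Pass these fields to fetch() (Node 18+) to perform the actual call.
console.log(req.url);
```
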


Technical Principles

The Pages MCP Server leverages serverless edge computing and KV storage: it receives HTML content through an API, stores it, and automatically generates a public access URL that takes effect immediately. This enables static page deployment in seconds, with a built-in error-handling mechanism.
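The flow can be sketched as an in-memory simulation (this is not the actual service source; makeKey and deployHtml are illustrative names, and the Map stands in for the bound my_kv namespace):

```typescript
// In-memory simulation of the deploy flow described above: store HTML
// under a generated key (standing in for the my_kv KV namespace), then
// return the public URL where the page would be served.
const kv = new Map<string, string>();

// Derive a short deterministic key from the HTML content.
function makeKey(html: string): string {
  let hash = 0;
  for (let i = 0; i < html.length; i++) {
    hash = (hash * 31 + html.charCodeAt(i)) >>> 0; // keep as unsigned 32-bit
  }
  return hash.toString(36);
}

// Persist the HTML and build the access URL, as the /kv/set endpoint does.
function deployHtml(html: string, baseUrl: string): string {
  const key = makeKey(html);
  kv.set(key, html);
  return `${baseUrl}/${key}`;
}

const pageUrl = deployHtml(
  "<html><body><h1>Hello, World!</h1></body></html>",
  "https://your-custom-domain-name"
);
console.log(pageUrl);
```
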



Building Your Own MCP Service

In summary, MCP enables AI to access more resources and call more tools during a conversation. Below are two samples to help you quickly deploy your own MCP Server.


Local MCP Server

You can use the MCP Geo geolocation MCP template for one-click deployment, then enable it with a few lines of configuration in editors such as Cursor. When AI needs the user's geographic location, it can obtain this information automatically through the get_geo API and then, for example, recommend nearby restaurants or scenic spots.
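For orientation, a tool invocation in the MCP protocol is a JSON-RPC 2.0 tools/call exchange. The sketch below uses the template's get_geo tool as the example; the field names follow the MCP specification, while the location payload is a placeholder:

```typescript
// Shape of an MCP "tools/call" request/response pair (JSON-RPC 2.0),
// using the template's get_geo tool as an example. Field names follow
// the MCP specification; the result payload here is a placeholder.
const toolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "get_geo", arguments: {} },
};

// The server replies with text content that the model can read.
const toolCallResult = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    content: [
      { type: "text", text: JSON.stringify({ country: "…", city: "…" }) },
    ],
  },
};
console.log(toolCallRequest.method);
```
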


Remote MCP Server

After the second edition of the MCP protocol was finalized, we promptly upgraded Pages' dedicated service to support Streamable HTTP. Visit https://mcp-on-edge.edgeone.app/ to try the web version of MCP.

You can also use the MCP on Edge template to quickly deploy an MCP Client and MCP Server implemented on EdgeOne edge functions. The environment variables API_KEY, BASE_URL, and MODEL are fully compatible with the OpenAI API specification, which means you can configure them directly following OpenAI's official usage.

The following table provides references for obtaining API_KEY:

Platform   | BASE_URL                            | API_KEY (document address)             | MODEL (example model ID)
OpenRouter | https://openrouter.ai/api/v1        | https://openrouter.ai/settings/keys    | anthropic/claude-3.7-sonnet
DeepInfra  | https://api.deepinfra.com/v1/openai | https://deepinfra.com/dash/api_keys    | anthropic/claude-3-7-sonnet-latest
DeepSeek   | https://api.deepseek.com/v1         | https://platform.deepseek.com/api_keys | deepseek-chat
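As a sketch of how these three variables are used together, an OpenAI-compatible chat-completions request is assembled as shown below (buildChatRequest is an illustrative name; the /chat/completions path and Bearer header follow the OpenAI API specification):

```typescript
// How BASE_URL, API_KEY, and MODEL map onto an OpenAI-compatible
// chat-completions request. buildChatRequest is illustrative; the
// /chat/completions path and Bearer header follow the OpenAI API spec.
interface LlmEnv {
  BASE_URL: string;
  API_KEY: string;
  MODEL: string;
}

function buildChatRequest(env: LlmEnv, prompt: string) {
  return {
    url: `${env.BASE_URL}/chat/completions`,
    method: "POST",
    headers: {
      "Authorization": `Bearer ${env.API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: env.MODEL,
      messages: [{ role: "user", content: prompt }],
    }),
  };
}

const chatReq = buildChatRequest(
  { BASE_URL: "https://api.deepseek.com/v1", API_KEY: "sk-...", MODEL: "deepseek-chat" },
  "Hello"
);
console.log(chatReq.url);
```
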


Why Use EdgeOne Pages

MCP technology trends align well with the edge serverless architecture of Pages Functions. Its advantages in performance, scalability, and ease of use let developers enjoy the convenience of the Global Edge Network without managing infrastructure. We will continue to follow industry developments and the evolution of community technology, continuously enhancing MCP capabilities to improve developer efficiency and experience.

For more details about Pages, see the other chapters of this documentation.