Publishing Your Website Content to AI Assistants
When people ask AI assistants about your product or project, they often get outdated information. Here's how to publish your static website content directly to AI tools using Cloudflare Workers and the Model Context Protocol.
How can we make our content, such as project documentation, available to people who rely on AI assistants in their work? These tools work better when they can access the information through an MCP interface instead of relying on a point-in-time snapshot of the model’s training data or on web searches. For instance, when I use an AI coding tool for Cloudflare-related functionality, it quickly finds the specs because I added the Cloudflare docs MCP server to my AI environment.
I put together an approach you can use to make your own site available to AI assistants in a similar manner. It uses Cloudflare Workers to run the serverless MCP “server” remotely. It’s an open-source project you can replicate and customize for your needs.
Why the Model Context Protocol
Several approaches exist for giving AI assistants access to external content. You can use web search, but that requires the tool to decide to perform the search, wait for it to complete, and then hope the right details appear in the results. You can use retrieval-augmented generation (RAG), but that can be tricky to deploy and isn’t something your readers can easily add to their AI tools.
MCP allows you to take a different approach. When an AI user adds your MCP server to their AI environment, they extend the AI tool’s capabilities, making your site’s content available in a fast, accurate, and token-efficient way. Once the MCP server has indexed your site’s content, the AI tool can discover and query that content automatically under the appropriate circumstances, without manual prompting.
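To make that concrete, here is a minimal sketch of what such a server might expose, using the Model Context Protocol TypeScript SDK. The tool name, its description, and the toy in-memory index are illustrative assumptions rather than the actual interface of my project, and the transport wiring that connects the server to an AI tool is omitted.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Toy in-memory index standing in for the prebuilt index of the site's pages.
const pages = [
  {
    title: "Getting Started",
    url: "https://example.com/getting-started/",
    excerpt: "How to install and configure the project...",
  },
];

const server = new McpServer({ name: "my-site-docs", version: "1.0.0" });

// Register a search tool. Connected AI assistants discover it automatically
// and can call it whenever a question touches the site's content.
server.tool(
  "search_docs",
  "Search this site's documentation for relevant pages",
  { query: z.string().describe("Keywords or a natural-language question") },
  async ({ query }) => {
    const q = query.toLowerCase();
    const hits = pages.filter(
      (p) => p.title.toLowerCase().includes(q) || p.excerpt.toLowerCase().includes(q)
    );
    return {
      content: [
        {
          type: "text",
          text: hits.map((h) => `${h.title} (${h.url})\n${h.excerpt}`).join("\n\n"),
        },
      ],
    };
  }
);
```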
How the Solution Works
The system I built has four components. At build time, an “adapter” processes your site’s markdown content and generates a search index. That index is stored in your Cloudflare R2 bucket. A Cloudflare Worker implements the MCP “server” in a serverless manner, handling search and retrieval requests from AI tools. When AI assistants connect, they discover your site’s capabilities and can query your content as needed.
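Here is a rough sketch of the Worker side of that architecture, assuming the build step uploaded a search-index.json file to an R2 bucket bound to the Worker as SITE_INDEX. The binding name, file name, and index format are assumptions for illustration, and the MCP protocol handling that the real Worker performs is reduced here to a plain HTTP search endpoint.

```typescript
// Sketch of a Cloudflare Worker that answers search requests from an index in R2.
// Assumes wrangler.toml binds an R2 bucket as SITE_INDEX and the build step
// uploaded "search-index.json"; the real Worker wraps this logic in an MCP server.

interface IndexEntry {
  title: string;
  url: string;
  body: string;
}

interface Env {
  SITE_INDEX: R2Bucket;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const query = new URL(request.url).searchParams.get("q")?.toLowerCase() ?? "";

    // Load the prebuilt index that the adapter uploaded at build time.
    const object = await env.SITE_INDEX.get("search-index.json");
    if (!object) {
      return new Response("Search index not found", { status: 404 });
    }
    const entries = (await object.json()) as IndexEntry[];

    // Naive keyword match; keep the response small so it stays token-friendly
    // for the AI assistant that receives it.
    const hits = entries
      .filter((e) => e.title.toLowerCase().includes(query) || e.body.toLowerCase().includes(query))
      .slice(0, 5)
      .map(({ title, url }) => ({ title, url }));

    return Response.json(hits);
  },
};
```

In practice the Worker returns these results through MCP tool calls rather than a query-string endpoint, but the R2 read path is the same.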
The Cloudflare architecture keeps costs minimal and often runs at no cost on the entry-level pricing tier. I published adapters that generate the search index from Astro, Hugo, and generic Markdown content. You or your AI coding agent can easily adapt them to other static website generators.
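As a rough illustration of what a generic Markdown adapter might do at build time, the following Node script walks a content directory, pulls a title from each file, and writes a JSON search index. The directory layout, URL scheme, and index fields are assumptions, not the exact format my adapters produce.

```typescript
// Build-time sketch: turn a folder of Markdown files into a simple search index.
// Upload the output to the R2 bucket (for example with `wrangler r2 object put`)
// so the Worker can serve it.
import { readdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

const contentDir = "content"; // assumed location of the site's Markdown files

const entries = readdirSync(contentDir)
  .filter((name) => name.endsWith(".md"))
  .map((name) => {
    const markdown = readFileSync(join(contentDir, name), "utf8");
    // Use the first level-one heading as the page title, falling back to the file name.
    const title = markdown.match(/^#\s+(.+)$/m)?.[1] ?? name;
    return {
      title,
      url: `/${name.replace(/\.md$/, "/")}`,
      body: markdown,
    };
  });

writeFileSync("search-index.json", JSON.stringify(entries, null, 2));
console.log(`Indexed ${entries.length} pages`);
```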
See It in Action
You can see this approach in action by adding the MCP server I created for REMnux documentation to your AI environment. REMnux is an open-source toolkit for analyzing malware. It includes many tools, all of which are outlined on the project’s documentation site. The MCP server makes the details about these tools, their usage, and their applicability readily available within REMnux users’ AI assistants. You can add it to your environment by pointing your AI tool at https://docs-mcp.remnux.org/mcp.
I also created an MCP server for my own website. You’re welcome to add it to your AI environment by pointing your AI tool at https://website-mcp.zeltser.com/mcp. Its purpose is mostly to allow me to query my own content so I can build on my prior work.
Getting Started
The repository I created includes the Worker code, build-time adapters for generating search indexes, and configuration examples for AI tools such as Claude Desktop and Claude Code. You can clone it and customize it or point your AI agent at it and let it do the work for you.