Deploy MCP with Cloudflare Worker and Supabase Edge Function


Goal: keep the Cloudflare Worker as the only thing the browser talks to, store data in Supabase, and still expose your own MCP tools (news, weather, etc.) to an LLM that decides when to call them.

Cloudflare Workers run inside V8 isolates – you can’t spawn a child process.
Therefore the “stdio MCP server” we drew on the laptop has to become an HTTP-over-POST MCP server (the official Streamable HTTP transport is exactly that).
You have two realistic ways to host that server:

  1. Supabase Edge Functions (Deno Deploy under the hood) – cheapest / serverless
  2. A tiny Node process on your DigitalOcean droplet – use only if you need libraries that Deno can’t run (sharp, Python, etc.)

Below is a complete “minimum lovable” implementation that keeps everything inside Supabase unless you explicitly choose the droplet.


High-level picture

Browser ──HTTPS──▶  Cloudflare Worker  (routes /chat + /mcp/* )
                       │
                       ├─ /chat  ----------------------------------------------------┐
                       │   calls OpenAI with function list,                            │
                       │   gets back function_call → forwards to /mcp/v1/call       │
                       │   returns text to browser                                    │
                       │                                                            │
                       └─ /mcp/*  -----------------------------------------------------┘
                              proxies to  Supabase Edge Function  (MCP-over-HTTP)

  1. Supabase Edge Function = HTTP-MCP server

Create a new function in your Supabase project:

supabase/functions/mcp-server/index.ts

import { serve } from 'https://deno.land/std@0.177.0/http/server.ts'
import { corsHeaders } from '../_shared/cors.ts'   // optional CORS helper

// ---- 1. Your tools ----------------------------------------------------------
import { getNews } from './tools/news.ts'
import { getWeather } from './tools/weather.ts'
const TOOLS: Record<string, any> = { get_news: getNews, get_weather: getWeather }

// ---- 2. MCP HTTP endpoints --------------------------------------------------
// GET/POST  .../mcp-server/list  -> { tools: [...] }
// POST      .../mcp-server/call  -> { content: [...] }
serve(async (req: Request) => {
  if (req.method === 'OPTIONS') return new Response('ok', { headers: corsHeaders })

  const url = new URL(req.url)
  // /list may arrive as a GET with no body, so only parse JSON on POST
  const body = req.method === 'POST' ? await req.json() : {}

  try {
    if (url.pathname.endsWith('/list')) {
      const tools = Object.values(TOOLS).map(t => ({
        name: t.name,
        description: t.description,
        inputSchema: t.inputSchema
      }))
      return new Response(JSON.stringify({ tools }), { headers: { ...corsHeaders, 'Content-Type': 'application/json' } })
    }

    if (url.pathname.endsWith('/call')) {
      const { name, arguments: args } = body
      if (!TOOLS[name]) throw new Error(`Unknown tool: ${name}`)
      const text = await TOOLS[name].handler(args) // tools export { name, description, inputSchema, handler }
      return new Response(JSON.stringify({ content: [{ type: 'text', text }] }), { headers: { ...corsHeaders, 'Content-Type': 'application/json' } })
    }

    return new Response('Not found', { status: 404 })
  } catch (e) {
    return new Response(JSON.stringify({ error: e.message }), { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } })
  }
})
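
The `../_shared/cors.ts` helper imported above is never shown in the post; a minimal version (the exact header list is an assumption – tighten the origin for production) would be:

```typescript
// supabase/functions/_shared/cors.ts – assumed contents
export const corsHeaders = {
  'Access-Control-Allow-Origin': '*', // lock this down to your own site in production
  'Access-Control-Allow-Headers': 'authorization, x-client-info, apikey, content-type',
}
```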

supabase/functions/mcp-server/tools/news.ts

export const getNews = {
  name: 'get_news',
  description: 'Top headlines for a country',
  inputSchema: {
    type: 'object',
    properties: { country: { type: 'string', enum: ['ca', 'us', 'gb'] } },
    required: ['country']
  },
  handler: async ({ country }: { country: string }) => {
    const res = await fetch(
      `https://newsapi.org/v2/top-headlines?country=${country}&pageSize=5`,
      { headers: { 'X-Api-Key': Deno.env.get('NEWS_KEY')! } }
    )
    const json = await res.json()
    return json.articles.map((a: any) => `• ${a.title}`).join('\n')
  }
}

supabase/functions/mcp-server/tools/weather.ts

export const getWeather = {
  name: 'get_weather',
  description: 'Current weather for a city',
  inputSchema: {
    type: 'object',
    properties: { city: { type: 'string' } },
    required: ['city']
  },
  handler: async ({ city }: { city: string }) => {
    const key = Deno.env.get('WEATHER_KEY')!
    const res = await fetch(
      `https://api.openweathermap.org/data/2.5/weather?q=${encodeURIComponent(city)}&units=metric&appid=${key}`
    )
    const data = await res.json()
    return `${city}: ${data.main.temp}°C, ${data.weather[0].description}`
  }
}

Deploy:

supabase functions deploy mcp-server
supabase secrets set NEWS_KEY=xxx WEATHER_KEY=yyy

The function is now reachable at:
https://<project-ref>.supabase.co/functions/v1/mcp-server
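
Before wiring up the Worker you can smoke-test both endpoints with curl. The `Authorization` header is required unless you deployed with `--no-verify-jwt`; `<project-ref>` and the key are placeholders for your own values:

```shell
REF="<project-ref>"; KEY="<anon-key>"

# list the advertised tools
curl -s -X POST "https://$REF.supabase.co/functions/v1/mcp-server/list" \
  -H "apikey: $KEY" -H "Authorization: Bearer $KEY" \
  -H "Content-Type: application/json" -d '{}'

# call one tool directly
curl -s -X POST "https://$REF.supabase.co/functions/v1/mcp-server/call" \
  -H "apikey: $KEY" -H "Authorization: Bearer $KEY" \
  -H "Content-Type: application/json" \
  -d '{"name":"get_weather","arguments":{"city":"Calgary"}}'
```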


  2. Cloudflare Worker = chat router + MCP client

wrangler.toml

name = "mcp-front"
main = "src/index.js"
compatibility_date = "2024-01-01"
[env.production.vars]
SUPABASE_URL = "https://<project-ref>.supabase.co"
# for real deployments, store the two keys below with `wrangler secret put` instead of plain vars
SUPABASE_ANON_KEY = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9....."
OPENAI_API_KEY = "sk-...."

src/index.js

export default {
  async fetch(req, env) {
    const url = new URL(req.url)

    // ---------- 1.  Browser chat endpoint  -------------------------------
    if (url.pathname === '/chat' && req.method === 'POST') {
      const { messages: history } = await req.json() // body: { messages: [...] }

      // 1-a. ask Supabase for the current tool list
      const tools = await listTools(env)

      // 1-b. ask OpenAI to pick a function (or plain reply)
      const openai = {
        model: 'gpt-3.5-turbo',
        messages: history,
        functions: tools.map(t => ({ name: t.name, description: t.description, parameters: t.inputSchema })),
        function_call: 'auto'
      }
      const aiResp = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${env.OPENAI_API_KEY}` },
        body: JSON.stringify(openai)
      }).then(r => r.json())

      const choice = aiResp.choices[0]
      if (choice.message.function_call) {
        const { name, arguments: argsStr } = choice.message.function_call
        const toolRes = await callTool(env, name, JSON.parse(argsStr))
        // inject tool result as assistant message and call AI again for final wording
        history.push(choice.message)
        history.push({ role: 'function', name, content: toolRes.content[0].text })
        const final = await fetch('https://api.openai.com/v1/chat/completions', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${env.OPENAI_API_KEY}` },
          body: JSON.stringify({ model: 'gpt-3.5-turbo', messages: history })
        }).then(r => r.json())
        return new Response(JSON.stringify(final.choices[0].message), { headers: { 'Content-Type': 'application/json' } })
      } else {
        // no tool needed
        return new Response(JSON.stringify(choice.message), { headers: { 'Content-Type': 'application/json' } })
      }
    }

    // ---------- 2.  Proxy MCP requests (optional debug)  -----------------
    if (url.pathname.startsWith('/mcp/')) {
      return fetch(env.SUPABASE_URL + '/functions/v1/mcp-server' + url.pathname, {
        method: req.method,
        headers: { 'Content-Type': 'application/json', apikey: env.SUPABASE_ANON_KEY, Authorization: `Bearer ${env.SUPABASE_ANON_KEY}` },
        body: req.body
      })
    }

    return new Response('Not found', { status: 404 })
  }
}

// helpers
async function listTools(env) {
  const res = await fetch(env.SUPABASE_URL + '/functions/v1/mcp-server/list', {
    // the Supabase gateway verifies a JWT unless the function was deployed with --no-verify-jwt
    headers: { apikey: env.SUPABASE_ANON_KEY, Authorization: `Bearer ${env.SUPABASE_ANON_KEY}` }
  })
  const json = await res.json()
  return json.tools
}

async function callTool(env, name, args) {
  const res = await fetch(env.SUPABASE_URL + '/functions/v1/mcp-server/call', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', apikey: env.SUPABASE_ANON_KEY, Authorization: `Bearer ${env.SUPABASE_ANON_KEY}` },
    body: JSON.stringify({ name, arguments: args })
  })
  return res.json()
}

Deploy:

npm install -g wrangler
wrangler deploy

Your front-end now calls only https://mcp-front.<sub>.workers.dev/chat.


  3. Front-end (same as before)

const API = 'https://mcp-front.<sub>.workers.dev/chat'
async function send(){
  messages.push({role:'user', content:input.value})
  const res = await fetch(API,{
    method:'POST',
    body:JSON.stringify({messages})
  })
  const msg = await res.json()
  messages.push(msg)
  out.textContent += `Bot: ${msg.content}\n`
}

  4. When do I need the DigitalOcean droplet?

  • You need Python, sharp, LibreOffice, Puppeteer, etc. – libraries Deno can’t run.
  • You need long-running TCP servers (a Postgres LISTEN/NOTIFY client, a WebSocket hub).

In that case, run the same index.ts logic inside a Node wrapper on the droplet and point the SUPABASE_URL + '/functions/v1/mcp-server' strings in the Worker at https://<droplet-ip>/mcp instead.
Everything else stays identical.


  5. End-to-end walk-through for “How cold is it in Calgary?”

  1. Browser → Worker /chat {messages:[{role:'user',content:'How cold is it in Calgary?'}]}
  2. Worker fetches tool list from Supabase edge function → [{name:'get_weather',...}]
  3. Worker calls OpenAI with functions:[...] – model replies
    function_call: {name:'get_weather', arguments:'{"city":"Calgary"}'}
  4. Worker POSTs {name:'get_weather', arguments:{city:'Calgary'}} to Supabase edge function.
  5. Edge function → fetch(api.openweathermap.org) → returns -12°C, light snow
  6. Worker injects tool result, asks OpenAI again for friendly wording.
  7. Final reply sent to browser: It’s -12°C and light snow in Calgary right now.
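
The conversation state right before the second OpenAI call (step 6) looks like this – values are illustrative, and note that `arguments` arrives as a JSON *string*, which is why the Worker calls JSON.parse on it:

```javascript
// Messages array the Worker sends for the final "friendly wording" call
const history = [
  { role: 'user', content: 'How cold is it in Calgary?' },
  // the model's function_call from step 3 – `arguments` is a JSON string
  { role: 'assistant', content: null, function_call: { name: 'get_weather', arguments: '{"city":"Calgary"}' } },
  // the tool result injected in step 6, under the legacy `function` role
  { role: 'function', name: 'get_weather', content: 'Calgary: -12°C, light snow' }
]
const args = JSON.parse(history[1].function_call.arguments)
```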

Done—no child process, no extra port, everything serverless on Supabase + Cloudflare.
