The Missing Standard Library for Supabase Edge Functions

What is Supabase Edge Toolkit?

Supabase Edge Toolkit is an open-source standard library for Supabase Edge Functions, providing 7 independent packages — error handling, validation, auth, resilience, logging, testing, and Langfuse integration — extracted from 76 production functions. It eliminates copy-paste boilerplate and lets teams import only the packages they need.

TL;DR

  • 76 production Edge Functions accumulated 4,700 lines of copy-pasted shared code — so we packaged it as a stdlib
  • 7 independent packages: errors, validation, auth, resilience, logger, testing, langfuse — import only what you need
  • The resilience package (1,409 LOC) covers timeout, retry with backoff, and circuit breaker patterns
  • 518 tests total — more test lines (4,864) than code lines (4,703) — mandatory for a library 76 functions depend on
  • Open-source on GitHub: github.com/spyrae/supabase-edge-toolkit

How we extracted 7 reusable packages from 76 production functions

The Copy-Paste Trap

Every Supabase Edge Functions project starts the same way. You write your first function and add CORS headers. Second function — you copy error handling from the first. Third — you drag along the same JWT verifier. By the tenth function, your _shared/ folder outweighs the actual business logic.

We run 76 Edge Functions in production. Four thousand seven hundred lines of shared code. And every time something changes — say, the error response format — you update one file but have to verify all 76 consumers.

Supabase itself recommends placing utilities in _shared/ and importing via relative paths. That works when you have 5 functions. At 76, those aren’t utilities anymore. They’re a standard library that nobody packaged as one.

So we did. The result: supabase-edge-toolkit — 7 independent packages, 518 tests, zero coupling between modules.

Why Edge Functions Need a Stdlib

The Node.js ecosystem stands on npm’s shoulders. Express has middleware for everything. Next.js has built-in error handling.

Supabase Edge Functions give you bare Deno.serve plus Request/Response. The rest is on you.

Every project re-solves the same problems:

  • CORS — three headers that are mandatory yet constantly forgotten
  • Error responses — JSON response format, status code mapping, Zod error handling
  • JWT verification — decoding tokens, checking exp, extracting sub
  • Retry and timeout — external APIs go down, you need retries with backoff
  • Logging — structured JSON for Grafana/Loki, not console.log
  • Testing — how do you mock the Supabase client without a real database?

Every project ends up with its own _shared/ that nobody tests independently, nobody versions, nobody documents. When a new developer joins, they read source code because there’s no README.

What We Built

Seven packages. Each works on its own. Import only what you need.

| Package    | Purpose                                     | Lines of Code | Tests |
| ---------- | ------------------------------------------- | ------------- | ----- |
| errors     | Error responses, CORS, status code mapping  | 773           | 461   |
| validation | Zod validation, request body/query parsing  | 517           | 733   |
| auth       | JWT verification, auth middleware           | 424           | 428   |
| resilience | Timeout, retry, circuit breaker             | 1,409         | 2,134 |
| logger     | Structured JSON logging                     | 342           | 279   |
| testing    | MockDB, PostgREST emulator, mock fetch      | 1,025         | 561   |
| langfuse   | Lightweight Langfuse prompt fetcher         | 213           | 268   |

Total: 4,703 lines of code, 4,864 lines of tests. More tests than code. For an internal project, that’s overkill. For a library that 76 functions depend on, it’s the minimum.

Deep Dive: Error Handling That Scales

The Problem

In a typical Edge Function, error handling looks like this:

Deno.serve(async (req) => {
  try {
    // ... business logic
  } catch (error) {
    if (error instanceof ZodError) {
      return new Response(JSON.stringify({ error: "Validation failed" }), {
        status: 400,
        headers: { "Content-Type": "application/json" },
      });
    }
    if (error instanceof Error && error.message.includes("timeout")) {
      return new Response(JSON.stringify({ error: "Timeout" }), { status: 504 });
    }
    console.error(error);
    return new Response(JSON.stringify({ error: "Internal error" }), { status: 500 });
  }
});

Every function reinvents this catch block. Response formats drift apart. The frontend can’t parse errors consistently.

The Fix

One function, errorToResponse(), that knows about every error type:

import {
  createCorsResponse,
  createSuccessResponse,
  errorToResponse,
} from "@supa-edge-toolkit/errors";

Deno.serve(async (req) => {
  if (req.method === "OPTIONS") return createCorsResponse();

  try {
    const data = await handleRequest(req);
    return createSuccessResponse(data);
  } catch (error) {
    return errorToResponse(error);
  }
});

Three lines instead of twenty. errorToResponse identifies error types through duck typing:

  • ZodError → 400 with field-level validation errors
  • TimeoutError → 504
  • AuthError → 401 with a safe message (internal details don’t leak)
  • Error containing “rate limit” → 429 with a Retry-After header
  • Everything else → 500 with a generic message
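The duck-typed dispatch behind that list can be sketched roughly as follows. This is a simplified illustration, not the package's actual source: it checks the error's shape (name, code, message) rather than instanceof, and returns only a status code where the real errorToResponse builds a full Response:

```typescript
// Hypothetical sketch of duck-typed error routing. Shape checks, not
// instanceof, so errors from a different module instance still match.
function errorToStatus(error: unknown): number {
  const e = error as { name?: string; code?: string; message?: string };
  if (e?.name === "ZodError") return 400; // validation failure, field errors attached
  if (e?.code === "TIMEOUT" || e?.name === "TimeoutError") return 504;
  if (e?.name === "AuthError" || e?.code?.startsWith?.("AUTH")) return 401;
  if (e?.message?.toLowerCase().includes("rate limit")) return 429;
  return 500; // everything else: generic message, no internal details
}
```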

What matters more: errorToResponse never sends internal error messages to the client. For auth errors, there’s a lookup table of safe messages:

const safeMessages: Record<string, string> = {
  MISSING_AUTH_HEADER: "Missing authentication",
  INVALID_TOKEN: "Invalid or expired token",
  TOKEN_EXPIRED: "Token expired",
};
const safeMessage = safeMessages[authErr.code] ?? "Authentication failed";

The frontend gets stable error codes (AUTH_ERROR, VALIDATION_ERROR, TIMEOUT_ERROR) — not raw stack traces.

Deep Dive: Resilience Without a Framework

Three Primitives

The resilience package is the largest (1,409 lines of code, 2,134 lines of tests). It tackles one problem: external APIs fail, and your Edge Function shouldn’t fail with them.

Three primitives, each usable on its own:

Timeout — kills a request after N milliseconds:

import { fetchWithTimeout } from "@supa-edge-toolkit/resilience";

// 5 seconds, then TimeoutError
const response = await fetchWithTimeout("https://api.example.com/data", {}, 5000);
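Under the hood, a timeout primitive like this is a race between the work and a timer. The generic sketch below shows the idea (assumed internals; the real fetchWithTimeout presumably wires the same pattern to fetch via an AbortController so the underlying request is actually cancelled):

```typescript
// Generic timeout wrapper sketch: reject with TimeoutError after N ms.
class TimeoutError extends Error {
  code = "TIMEOUT";
}

function withTimeout<T>(work: Promise<T>, timeoutMs: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new TimeoutError(`timed out after ${timeoutMs}ms`)),
      timeoutMs,
    );
    work.then(
      (value) => { clearTimeout(timer); resolve(value); }, // finished in time
      (err) => { clearTimeout(timer); reject(err); },      // failed in time
    );
  });
}
```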

Retry — repeats with exponential backoff:

import { withRetry, RETRY_CONFIGS } from "@supa-edge-toolkit/resilience";

const result = await withRetry(
  () => fetch("https://flaky-api.com/data"),
  RETRY_CONFIGS.EXTERNAL_API, // 3 attempts, backoff 1s → 2s → 4s
);
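The retry loop itself is small. A sketch in the spirit of withRetry (field names are assumed from the config shown above; consult the package for the real signature):

```typescript
// Retry-with-exponential-backoff sketch: re-run fn, doubling the delay
// between attempts, and rethrow the last error once attempts run out.
interface RetryConfig {
  maxAttempts: number;
  baseDelayMs: number;
  backoffMultiplier: number;
}

async function withRetrySketch<T>(
  fn: () => Promise<T>,
  cfg: RetryConfig,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= cfg.maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === cfg.maxAttempts) break; // out of attempts
      const delayMs = cfg.baseDelayMs * cfg.backoffMultiplier ** (attempt - 1);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}
```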

Circuit Breaker — stops calling a service that’s down:

import { CircuitBreaker } from "@supa-edge-toolkit/resilience";

const breaker = CircuitBreaker.getOrCreate("payments", {
  failureThreshold: 3,
  resetTimeoutMs: 30_000,
});
const result = await breaker.call(() => fetch("https://payments.api/charge"));
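Conceptually, the breaker is a three-state machine: closed (calls pass through), open (calls fail fast), and half-open (one probe allowed after a cooldown). A minimal sketch under those assumed semantics — the toolkit's actual class also maintains the named registry behind getOrCreate:

```typescript
// Circuit breaker sketch: open after N consecutive failures,
// probe again (half-open) once resetTimeoutMs has elapsed.
type CircuitState = "closed" | "open" | "half-open";

class CircuitBreakerSketch {
  private state: CircuitState = "closed";
  private failures = 0;
  private openedAt = 0;

  constructor(
    private failureThreshold: number,
    private resetTimeoutMs: number,
  ) {}

  async call<T>(fn: () => Promise<T>): Promise<T> {
    if (this.state === "open") {
      if (Date.now() - this.openedAt < this.resetTimeoutMs) {
        throw new Error("circuit open: failing fast"); // don't hit a dead service
      }
      this.state = "half-open"; // cooldown elapsed: let one probe through
    }
    try {
      const result = await fn();
      this.state = "closed"; // probe (or normal call) succeeded: reset
      this.failures = 0;
      return result;
    } catch (err) {
      this.failures++;
      if (this.failures >= this.failureThreshold) {
        this.state = "open";
        this.openedAt = Date.now();
      }
      throw err;
    }
  }
}
```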

All Together: resilientFetch

In practice, you need all three at once. resilientFetch composes them:

import { resilientFetch } from "@supa-edge-toolkit/resilience";

const response = await resilientFetch(
  "https://api.openai.com/v1/chat/completions",
  {
    method: "POST",
    body: JSON.stringify({ model: "gpt-4", messages }),
    headers: { Authorization: `Bearer ${apiKey}` },
  },
  {
    serviceName: "openai",
    timeout: { timeoutMs: 30_000 },
    retry: { maxAttempts: 2 },
    circuitBreaker: { failureThreshold: 5, resetTimeoutMs: 60_000 },
  },
);

One function — timeout kills hung requests, retry repeats after failures, circuit breaker shuts down a service after repeated errors.

Jitter: Thundering Herd Protection

An easy detail to miss. Retry defaults have jitter enabled:

EXTERNAL_API: {
  maxAttempts: 3,
  baseDelayMs: 1000,
  backoffMultiplier: 2,
  jitter: true, // ±50% random variation
}

Why? Picture this: 100 requests fail simultaneously. Without jitter, all 100 retry after exactly 1 second, then 2, then 4. Synchronized waves. With jitter, each retries after 500–1500 ms, 1000–3000 ms, 2000–6000 ms. The load spreads out.

Jitter is the default, not an opt-in. Because retry without jitter is worse than no retry at all.
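The delay computation the config above drives can be sketched in a few lines (function and parameter names here are illustrative, not the package's internals):

```typescript
// Backoff delay sketch: exponential growth, optionally spread by ±50% jitter
// so synchronized clients don't retry in lockstep.
function backoffDelayMs(
  attempt: number,        // 1-based attempt number
  baseDelayMs: number,
  multiplier: number,
  jitter: boolean,
): number {
  const exponential = baseDelayMs * Math.pow(multiplier, attempt - 1);
  if (!jitter) return exponential;
  // ±50%: uniform in [0.5 * exponential, 1.5 * exponential]
  return exponential * (0.5 + Math.random());
}
```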

Being Honest About Limitations

The circuit breaker stores state in a module-level Map:

const circuitRegistry = new Map<string, CircuitBreakerState>();

This works within a single Deno isolate. But Supabase runs Edge Functions across multiple workers. One worker opens the circuit — another doesn’t know about it.

We deliberately didn’t try to solve this. A distributed circuit breaker needs external storage (Redis, Supabase Realtime, KV). For 90% of cases, per-isolate state is enough: if OpenAI is down, it’s down for everyone, and each isolate will open its circuit after its own first N failures.

The code documents this limitation. No illusions.

Deep Dive: Testing Without Supabase

The Problem

Edge Functions use @supabase/supabase-js to talk to the database. The client calls fetch internally. To test a function without a real database, you need to intercept fetch.

Replacing globalThis.fetch

The Supabase JS client reads the URL from SUPABASE_URL and calls globalThis.fetch. We replace both:

import { createTestContext, assertFetchCount } from "@supa-edge-toolkit/testing";

Deno.test("creates a user", async () => {
  const ctx = createTestContext({
    dbSeed: {
      users: [{ id: "u1", name: "Alice", email: "alice@test.com" }],
    },
  });

  try {
    // Supabase client works — fetch goes to MockDB
    const res = await fetch(
      "http://localhost:54321/rest/v1/users?id=eq.u1",
      { headers: { Authorization: "Bearer test-anon-key" } },
    );
    const data = await res.json();

    assertEquals(data[0].name, "Alice");
    assertFetchCount(ctx.fetchLog, "/rest/v1/users", 1);
  } finally {
    ctx.cleanup(); // restores original fetch
  }
});

createTestContext does four things:

  1. Creates a MockDBState with seed data
  2. Wires up the PostgREST emulator (understands eq., ilike., order, limit, offset)
  3. Replaces globalThis.fetch with a URL router
  4. Sets test environment variables (SUPABASE_URL, SUPABASE_ANON_KEY)

Zero config. Call createTestContext() and the standard Supabase client works against an in-memory database.

The PostgREST Emulator

MockDBState isn’t just a dictionary. It speaks real PostgREST protocol:

  • GET /rest/v1/users?email=eq.alice@test.com — field filtering
  • GET /rest/v1/users?name=ilike.*lic* — fuzzy search
  • POST /rest/v1/users — insert with auto-generated id and created_at
  • PATCH /rest/v1/users?id=eq.u1 — update with updated_at added
  • POST /rest/v1/rpc/my_function — RPC calls
  • HEAD /rest/v1/users — count via content-range

Tests go through the same path as production code. No mocks that just return hardcoded JSON.
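To make the protocol support concrete, here is a rough sketch of how a PostgREST-style query-string filter might be applied to an in-memory row set. This is an illustration of the idea only; the toolkit's emulator handles far more operators and the full request/response cycle:

```typescript
// Apply a single PostgREST-style filter (eq. / ilike.) to in-memory rows.
type Row = Record<string, unknown>;

function applyFilter(rows: Row[], field: string, filter: string): Row[] {
  if (filter.startsWith("eq.")) {
    const value = filter.slice(3);
    return rows.filter((r) => String(r[field]) === value);
  }
  if (filter.startsWith("ilike.")) {
    // PostgREST uses * as the wildcard; translate to a case-insensitive regex,
    // escaping regex metacharacters in the rest of the pattern first.
    const pattern = filter.slice(6)
      .replace(/[.+?^${}()|[\]\\]/g, "\\$&")
      .replace(/\*/g, ".*");
    const re = new RegExp(`^${pattern}$`, "i");
    return rows.filter((r) => re.test(String(r[field])));
  }
  return rows; // unsupported operator in this sketch: pass through
}
```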

Architecture Decisions

Zero Coupling

Each package has its own deno.json with its own name, version, and dependencies. No shared core library. No transitive dependencies between packages.

We made this trade-off on purpose. A shared core library is convenient for the author — but creates a problem for users. Want errors? Pull in core. Want auth? Pull in core and errors. The dependency tree grows.

Instead: validation optionally integrates with errors (if you use both), but works fine without it.

CryptoKey Caching

crypto.subtle.importKey is an async operation. In an Edge Function that handles dozens of requests during its isolate’s lifetime, calling it on every request is wasteful:

let _cachedVerifyKey: CryptoKey | null = null;
let _cachedJwtSecret: string | null = null;

async function getVerifyKey(): Promise<CryptoKey> {
  const jwtSecret = Deno.env.get("SUPABASE_JWT_SECRET");
  if (_cachedVerifyKey && _cachedJwtSecret === jwtSecret) {
    return _cachedVerifyKey;
  }
  _cachedVerifyKey = await crypto.subtle.importKey(
    "raw",
    new TextEncoder().encode(jwtSecret),
    { name: "HMAC", hash: "SHA-256" },
    false,
    ["verify"],
  );
  _cachedJwtSecret = jwtSecret;
  return _cachedVerifyKey;
}

The cache lives at the module level. It auto-invalidates when the secret changes — useful in tests where the secret may differ between test cases.

Secure Test Mode

The auth package supports user impersonation in dev environments. But unlike the naive “if development, trust the header” approach, ours requires a real service_role JWT:

export async function tryTestMode(req: Request): Promise<AuthResult | null> {
  if (!isTestModeAvailable()) return null; // dev only

  const testUserId = req.headers.get("X-Test-User-Id");
  if (!testUserId) return null;

  // Cryptographic verification of service_role JWT
  const serviceAuth = await verifyServiceRole(req);

  return {
    userId: testUserId,
    role: "authenticated",
    payload: { ...serviceAuth.payload, sub: testUserId, _testMode: true },
  };
}

Even if someone accidentally sets ENVIRONMENT=development in production — without a valid service_role JWT, impersonation won’t work.

How to Use It

Installation — one line per package:

// deno.json (import map)
{
  "imports": {
    "@supa-edge-toolkit/errors": "jsr:@supa-edge-toolkit/errors@0.1",
    "@supa-edge-toolkit/auth": "jsr:@supa-edge-toolkit/auth@0.1",
    "@supa-edge-toolkit/validation": "jsr:@supa-edge-toolkit/validation@0.1"
  }
}

A complete Edge Function using three packages:

import { z } from "npm:zod";
import { createCorsResponse, createSuccessResponse, errorToResponse } from "@supa-edge-toolkit/errors";
import { verifyUserToken } from "@supa-edge-toolkit/auth";
import { parseRequestBody } from "@supa-edge-toolkit/validation";

const CreatePostSchema = z.object({
  title: z.string().min(1).max(200),
  content: z.string().min(1),
  tags: z.array(z.string()).optional(),
});

Deno.serve(async (req) => {
  if (req.method === "OPTIONS") return createCorsResponse();

  try {
    const { userId } = await verifyUserToken(req);
    const body = await parseRequestBody(req, CreatePostSchema);

    // body is typed: { title: string; content: string; tags?: string[] }
    const post = await createPost(userId, body);

    return createSuccessResponse(post, 201);
  } catch (error) {
    return errorToResponse(error);
  }
});

Eight lines of request handling. Auth, validation, typing, error handling — it all works.

From Internal Code to Open Source

Extracting a library from a live project is harder than writing one from scratch. Here’s what we learned along the way.

Strip project-specific code ruthlessly. Our _shared/ had helpers for specific APIs (Foursquare, Tinkoff, Google Places). They didn’t make it into the toolkit. If a function only works with your stack, it’s not library material.

A 1:1 test ratio is the minimum. 4,703 lines of code, 4,864 lines of tests. For an internal project, that’s excessive. For a library, it’s not. Any broken edge case breaks code for every consumer.

Document your limitations. Circuit breaker isn’t distributed? Write that down. CryptoKey is cached per-isolate? Write that down. The person who picks your library will read it and avoid the pitfall.

Zero coupling costs more to develop, less to use. Maintaining 7 independent packages is harder than one monolith with a shared core. But for a user who only needs errors, it’s the difference between 773 lines and 4,703.


Repository: github.com/spyrae/supabase-edge-toolkit

7 packages. 518 tests. MIT license. It’s yours.

FAQ

The circuit breaker stores state per-isolate, but Supabase spins up multiple workers. How bad is this limitation in practice for high-traffic functions?

For most workloads it’s acceptable. Each isolate independently opens its circuit after failureThreshold failures, so if OpenAI is down for 30 seconds, every active worker will open within a few hundred milliseconds of each other — the cascade is fast. The real gap is reset: one isolate may recover while others remain open, causing request routing inconsistencies. If you need synchronized state, the recommended path is storing circuit state in Supabase KV (currently in beta) and polling it in the circuit breaker’s half-open check.

Does the PostgREST emulator in the testing package support JOIN operations or only single-table queries?

It handles single-table operations only: filtering, sorting, pagination, insert, update, upsert, and RPC calls. Cross-table joins (.select('*, related_table(*)')) are not emulated. The design decision was intentional — join logic is complex to emulate correctly, and functions that require joins can be tested by seeding the mock with pre-joined data or by extracting the join into a Postgres view and testing the view as a standalone table.

The errors package uses duck typing to identify error types. Does this break if multiple packages in the same project define their own TimeoutError class?

It can cause false positives if your custom TimeoutError has the same shape as the toolkit’s TimeoutError — specifically, a code property equal to TIMEOUT. The safe pattern is to extend the toolkit’s error classes rather than define parallel ones. If you need your own hierarchy, override the code property to a unique value (e.g., MY_TIMEOUT) so errorToResponse routes it to the generic 500 handler instead of 504.
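The "extend rather than redefine" pattern from that answer looks like this in practice. The class names below are illustrative (a stand-in for the toolkit's TimeoutError), but the mechanics are the point: the subclass keeps the inheritance chain while swapping in a unique code:

```typescript
// Stand-in for the toolkit's TimeoutError (assumed shape: code = "TIMEOUT").
class ToolkitTimeoutError extends Error {
  code = "TIMEOUT";
}

// Extend the toolkit class and override code with a unique value, so a
// duck-typing dispatcher routes it to the generic 500 branch, not 504.
class MyTimeoutError extends ToolkitTimeoutError {
  override code = "MY_TIMEOUT";
}
```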