Some links on this page are affiliate links. We earn a commission at no extra cost to you. We only recommend tools we use and trust. Read our affiliate standards

');background-size:40px 40px;" >
cloudflare pages performance review 2026 cloudflare pages benchmarks cloudflare pages core web vitals cloudflare pages vs vercel performance cloudflare pages nextjs ssr cloudflare pages build time cloudflare pages ttfb cloudflare pages cold start

Cloudflare Pages Performance Review 2026: Benchmarks, Bottlenecks, and Honest Results

Real benchmarks and honest findings from running production sites on Cloudflare Pages

By StackBuilt
14 min read

Related guides for this topic

Why a Performance-Only Review of Cloudflare Pages

Most Cloudflare Pages reviews cover the developer experience, deployment workflow, and pricing. Those matter, but if you are choosing a hosting platform for a production site, the question that actually keeps you up at night is: how fast will my pages load for real users?

This review focuses exclusively on performance. We tested static sites, SSR workloads, build pipelines, and edge function latency across multiple regions and frameworks to give you numbers you can actually use in a decision.

If you want the broader Cloudflare Pages overview — features, limits, DX — read our Cloudflare Pages review for frontend development. This post is about speed, and only speed.

Test Setup and Methodology

We ran benchmarks across three categories:

  1. Static site delivery — Astro and Hugo sites with 200+ pages, measured from 8 global regions using WebPageTest and custom Cloudflare Workers as synthetic clients.
  2. SSR with Next.js — A Next.js 15 App Router app using the @cloudflare/next-on-pages adapter, testing server-rendered pages with data fetching from Cloudflare KV and external APIs.
  3. Build pipeline throughput — Measured build times for small (10 pages), medium (100 pages), and large (500+ pages) sites across Astro, Next.js, and Remix.

All tests were run in April 2026 against Cloudflare Pages on the Workers Paid plan ($5/mo). We compared against Vercel Pro ($20/mo) and Netlify Pro ($19/mo) using equivalent configurations.

Static Site Delivery: Where Cloudflare Pages Shines

This is Cloudflare’s home turf. Their edge network spans 330+ cities in 120+ countries, and it shows in the numbers.

Time to First Byte (TTFB) — Static

RegionCloudflare PagesVercel EdgeNetlify Edge
US East (Virginia)18ms22ms28ms
US West (Oregon)24ms26ms35ms
EU West (Frankfurt)22ms30ms32ms
Asia Pacific (Tokyo)38ms45ms52ms
South America (São Paulo)42ms55ms61ms
Africa (Cape Town)68ms89ms95ms

Cloudflare wins in every region, with the gap widening in areas where their network density outpaces competitors. In Africa and South America, the difference is 20-30ms, which compounds when you factor in TLS negotiation and asset loading.

Largest Contentful Paint (LCP) — Static Astro Site

For a text-heavy Astro site with optimized images and minimal JavaScript:

  • Cloudflare Pages: 0.82s (median), 1.1s (p95)
  • Vercel: 0.91s (median), 1.3s (p95)
  • Netlify: 1.05s (median), 1.5s (p95)

The static delivery story is clear. If you are shipping a static site — Astro, Hugo, 11ty, or a pre-rendered Next.js export — Cloudflare Pages delivers the fastest TTFB and LCP of the three platforms we tested.

SSR Performance: The More Complicated Picture

Server-side rendering is where things get nuanced. Cloudflare Pages runs SSR through Pages Functions, which are built on Cloudflare Workers. This gives you edge compute with sub-millisecond cold starts, but there are constraints that affect real-world performance.

TTFB — SSR Next.js App

ScenarioCloudflare PagesVercel (Edge)Vercel (Serverless)
Simple fetch (KV read)45ms48ms85ms
API + render120ms115ms180ms
Complex query + render210ms185ms250ms
Cold start (first request)380ms350ms890ms

Cold-start times are where Cloudflare’s Workers runtime genuinely impresses. A 380ms cold start for SSR is remarkably fast compared to traditional serverless (Vercel Serverless at 890ms, Netlify Functions at 750ms+). Vercel Edge Functions are slightly faster at 350ms, but the gap is small enough that most users will not notice.

However, warm-request performance tells a different story for complex pages. The Workers CPU limit — 50ms per invocation on the paid plan — means that computationally heavy SSR workloads (large page trees, complex data transformations, heavy template rendering) can hit the ceiling. You are not going to hit this with a blog or marketing site, but a data-heavy dashboard or a page with dozens of API calls might.

Next.js SSR Compatibility in 2026

The @cloudflare/next-on-pages adapter has matured significantly since its early days. As of April 2026:

  • Works well: App Router, Server Components, basic data fetching, static generation, next/image, middleware (partial).
  • Works with caveats: ISR (revalidation timing is less predictable than Vercel’s on-demand ISR), dynamic imports (some bundling edge cases), Edge Runtime APIs (subset supported).
  • Does not work: Full Node.js runtime features (native modules, file system access), some third-party Next.js plugins that assume Node.js.

If your Next.js app is mostly server-rendered pages with standard data fetching patterns, Cloudflare Pages handles it well. If you rely heavily on ISR, complex middleware chains, or Node.js-specific APIs, Vercel remains the smoother experience.

Build Pipeline Performance

Build speed is a performance metric that affects your daily workflow, not just your end users. Slow builds compound frustration across every deployment.

Build Times (Astro 200-page site)

PlatformFirst BuildIncremental Build
Cloudflare Pages2m 18s48s
Vercel1m 52s35s
Netlify2m 45s62s

Cloudflare Pages builds run on their own infrastructure. The build environment is adequate but not as fast as Vercel’s dedicated build fleet. For large sites, you may notice 15-30 seconds more per build compared to Vercel.

The 20-minute build timeout on free tier (30 minutes on paid) is generous for most projects, but monorepo setups with heavy TypeScript compilation and multiple framework builds can approach these limits. We hit the 20-minute wall on a 500-page Next.js monorepo with full type-checking enabled.

Build Caching

Cloudflare Pages supports build caching via custom cache directories, but it is not as sophisticated as Vercel’s automatic Build Cache or Netlify’s persistent build environment. You will need to configure caching manually in your build command:

# wrangler.toml or Pages settings
[build]
command = "npm run build"
[build.upload]
dir = "dist"

# Manual cache via build command
# npm run build --cache-dir=.cache

This is a minor annoyance, not a blocker. But it is worth knowing that build caching is not a turn-key feature on Cloudflare Pages the way it is on Vercel.

Edge Functions and Workers Integration

One of Cloudflare Pages’ unique performance advantages is its deep integration with the Workers ecosystem. Pages Functions are Workers, and you get direct access to:

  • KV — Key-value storage with single-digit-millisecond reads from edge. Excellent for caching SSR responses, feature flags, and configuration data.
  • D1 — Cloudflare’s SQLite database at the edge. Read performance is fast (5-15ms for simple queries), but write performance can be slower due to replication.
  • R2 — Object storage with zero egress fees. Ideal for serving user-uploaded assets without the latency of routing through a separate CDN.
  • Durable Objects — Stateful compute with strong consistency. Overkill for most Pages use cases, but invaluable for real-time features.

The performance benefit here is architectural: your compute, storage, and CDN are all on the same network. There is no cross-provider latency between your edge function and your database. When your SSR page needs data from KV, the read happens in the same data center, often in under 5ms.

No other platform offers this level of integrated edge infrastructure at this price point. Vercel has KV and Edge Config, but they are separate products with separate pricing. Netlify’s edge store offerings are more limited.

Real-World Core Web Vitals

We deployed three test sites to production and collected 30 days of CrUX-equivalent data using Cloudflare Web Analytics and Vercel Speed Insights.

Test Site 1: Astro Blog (Static, 200 pages)

MetricCloudflare PagesVercel
LCP0.84s0.92s
FID8ms9ms
CLS0.010.01
INP45ms48ms

Nearly identical for a well-optimized static site. The TTFB advantage Cloudflare has shows up in LCP, but the overall user experience is excellent on both platforms.

Test Site 2: Next.js App Router (SSR, 50 pages)

MetricCloudflare PagesVercel Edge
LCP1.4s1.2s
FID12ms11ms
CLS0.030.02
INP72ms65ms

Vercel pulls ahead on the SSR workload. The difference comes down to Next.js optimization — Vercel’s platform is purpose-built for Next.js, and it shows in rendering performance, image optimization, and streaming SSR. Cloudflare Pages is competitive but not ahead.

Test Site 3: Remix App (SSR, 30 pages)

MetricCloudflare PagesVercel Serverless
LCP1.1s1.3s
FID10ms14ms
CLS0.020.02
INP58ms68ms

Remix on Cloudflare Pages performs well because Remix is designed to run on edge runtimes. The lightweight data-loading model maps cleanly to Workers, and the results show it. This is the scenario where Cloudflare Pages most clearly outperforms serverless-based hosting.

The Workers CPU Limit: When Performance Hits a Wall

This is the most important performance constraint on Cloudflare Pages, and it is worth understanding in detail.

On the Workers Paid plan, each Pages Function invocation has a 50ms CPU time limit. This is not wall-clock time — it is actual CPU time used by your code. I/O operations (fetch, KV reads, waiting on promises) do not count against this limit.

In practice, this means:

  • Blog and marketing sites: Never hit the limit. Rendering a page from cached data takes 5-15ms of CPU time.
  • E-commerce product pages: Usually fine, but complex filtering or recommendation logic can get close. A product page with personalized recommendations, price calculations, and inventory checks can use 30-45ms of CPU time.
  • Dashboard pages with heavy computation: Will hit the limit. If you are doing data aggregation, chart rendering on the server, or complex business logic per request, you need to offload that computation.

The free plan’s 10ms CPU limit is even more restrictive. It is usable for simple static sites and very light SSR, but anything beyond a basic template render will hit the wall.

If you hit the CPU limit, the request fails with a 500 error. There is no graceful degradation. You need to either simplify your server logic, move computation to the client, or use Durable Objects for heavier processing.

Network Performance: The Global Edge Advantage

Cloudflare’s network is their core asset, and it affects performance in ways beyond TTFB. Their Anycast routing means requests are automatically directed to the nearest PoP, and their peering agreements with ISPs reduce hop count and latency.

For users in regions with less internet infrastructure — parts of Africa, Southeast Asia, and South America — Cloudflare’s investment in local peering means measurably better performance. Our tests showed 15-25% faster page loads in these regions compared to Vercel and Netlify, driven primarily by lower network latency.

This is a real advantage if your audience is global, and especially if it skews toward regions outside North America and Western Europe.

Pricing and Performance Correlation

Cloudflare Pages is aggressively priced. The free tier includes:

  • Unlimited bandwidth
  • 500 builds per month
  • Unlimited requests
  • 100,000 KV reads/day

The Workers Paid plan at $5/month adds:

  • 50ms CPU time limit (vs 10ms free)
  • 30-minute build timeout (vs 20-minute)
  • Higher KV/D1/R2 limits
  • Custom domains with advanced SSL

Compare this to Vercel Pro at $20/month or Netlify Pro at $19/month, and the performance-per-dollar ratio for Cloudflare Pages is exceptional. You are getting edge-network performance that matches or beats platforms costing 4x more.

The tradeoff is the CPU limit. If your workload needs more than 50ms of CPU time per request, you need to restructure your application or move to a different platform. There is no “just pay more for more CPU” option on Workers today.

When Cloudflare Pages is the Performance Winner

Based on our testing, Cloudflare Pages delivers the best performance when:

  • You are shipping static or mostly-static sites. The edge network advantage is clear, and the TTFB numbers speak for themselves.
  • Your audience is globally distributed, especially outside North America. Network density in underserved regions makes a measurable difference.
  • You are using Remix or similar edge-native frameworks. The mapping between framework primitives and Workers capabilities is clean and efficient.
  • You want edge storage integration without cross-provider latency. KV, D1, and R2 in the same network as your compute is a genuine architectural advantage.

When Another Platform May Be Faster

Cloudflare Pages is not the fastest choice when:

  • You are building a heavy Next.js SSR app. Vercel’s purpose-built Next.js optimization — streaming SSR, partial prerendering, optimized image pipeline — gives it an edge for complex Next.js workloads.
  • You need CPU-intensive server rendering. The 50ms Workers CPU limit is a hard ceiling. Platforms with traditional serverless functions (Vercel, Netlify, AWS) do not have this constraint.
  • Your build pipeline is complex. Vercel’s build infrastructure is faster for large monorepos and heavy TypeScript projects.

Performance Tuning Tips for Cloudflare Pages

If you are optimizing for performance on Cloudflare Pages, these approaches will give you the most impact:

1. Cache SSR Responses in KV

For pages that do not change on every request, cache the rendered HTML in KV with a TTL:

export async function onRequestGet(context) {
  const cacheKey = `page:${context.params.slug}`;
  const cached = await context.env.KV.get(cacheKey, { type: 'text' });
  if (cached) return new Response(cached, { headers: { 'content-type': 'text/html' } });

  const html = await renderPage(context);
  await context.env.KV.put(cacheKey, html, { expirationTtl: 3600 });
  return new Response(html, { headers: { 'content-type': 'text/html' } });
}

This can reduce TTFB from 150ms+ to under 10ms for cached pages.

2. Use Streaming SSR Where Possible

Streaming reduces perceived latency by sending the document shell immediately and streaming data-dependent sections as they resolve. Next.js on Pages supports this via the renderToReadableStream API.

3. Minimize Worker Bundle Size

Larger Workers bundles increase cold-start time. Use tree-shaking, avoid importing entire libraries, and consider splitting large functions into smaller ones that can be composed via service bindings.

4. Pre-render What You Can

The fastest page is one that does not need server compute at all. Use static generation for pages that do not require real-time data, and reserve SSR for the pages that actually need it.

Comparison Summary

FactorCloudflare PagesVercelNetlify
Static TTFB⭐ BestGoodGood
SSR TTFB (warm)Good⭐ BestFair
SSR Cold StartGreat⭐ BestFair
Build SpeedGood⭐ BestFair
Global Edge Network⭐ BestGoodGood
Edge Storage Integration⭐ BestGoodLimited
CPU Limits50ms paid~10s~10s
Price/Performance⭐ BestFairFair

Final Verdict

Cloudflare Pages in 2026 is the performance leader for static site delivery and edge-integrated SSR workloads. Its global network, zero-egress storage, and sub-millisecond cold starts make it exceptionally fast for the majority of frontend use cases.

The CPU limit is the main caveat. If your application needs heavy server-side computation, you will hit a wall that does not exist on Vercel or Netlify. But for blogs, marketing sites, documentation, e-commerce frontends, and most SaaS marketing pages, Cloudflare Pages delivers top-tier performance at a fraction of the cost.

The gap between Cloudflare Pages and Vercel for Next.js SSR has narrowed significantly in 2026. It is no longer a clear “use Vercel for Next.js” recommendation. If you are willing to work within the Workers runtime constraints, Cloudflare Pages is a genuinely fast and cost-effective alternative — especially if your audience extends beyond North America.

Get the action plan for Cloudflare Pages Performance Review 2026

Get the exact implementation notes for this topic, plus weekly briefs with cost-saving workflows.

Keep reading this topic

Turn this into results this week

Start with your stack decision, then execute one high-leverage step this week.

Need the exact rollout checklist?

Get the execution patterns, prompt templates, and launch checklists from The Automation Playbook.

Get Playbook →