
Tutorial: Pre-rendering Lovable websites with Cloudflare Workers and no-code in 2026
Step-by-step guide to setting up pre-rendering for your Lovable single-page application, so search engines and AI crawlers get real HTML, using a Cloudflare Worker and the LovableHTML Prerender API.
Googlebot visits your Lovable app and sees `<div id="root"></div>`. That's it. Your beautiful SPA is invisible to search engines and AI crawlers because most of them don't execute JavaScript — and even Googlebot, which can, queues pages for rendering rather than doing it on first fetch. They grab the HTML, see nothing, and leave.
Pre-rendering fixes this. You intercept bot requests at the edge, hand them fully rendered HTML, and let normal users keep using your SPA like nothing changed.
Don't want to touch code? LovableHTML prerenders your site with no-code.
How it works
```
Search Bot   -> Cloudflare Worker -> LovableHTML API -> Fully Rendered HTML
                      ^                                        |
                      +------------- cached response <---------+

Regular User -> Cloudflare Worker -> Your SPA (unchanged)
```
Your Worker sits in front of your domain. Every request comes through it. If someone's asking for an HTML page, the Worker calls the LovableHTML API. The API figures out if pre-rendering applies (bot? social preview crawler?). If yes, it returns the rendered HTML. If no, it returns a 304 and the Worker passes the request straight to your SPA.
Stuff you don't have to build:
- Bot detection for 200+ crawlers (Googlebot, Bingbot, ChatGPT, Claude, Perplexity, all the social ones)
- Headless browser infrastructure
- Snapshot caching and invalidation
- Edge delivery from 300+ locations
- Sitemap monitoring and auto re-crawl
Your Worker is ~30 lines. The API does the rest.
Prerequisites
- Domain on Cloudflare (free plan is fine)
- LovableHTML account with your domain added
- API key from the LovableHTML dashboard
Step 1: Create the Worker
- Open Cloudflare dashboard -> Workers & Pages
- Create -> pick "Hello World" Worker -> Deploy
- Hit Edit Code, replace everything with this:
```javascript
// lovablehtml-prerender.js (Cloudflare Worker)
export default {
  async fetch(req, env) {
    // Only handle public GET navigations
    if (req.method !== "GET") return fetch(req);

    const isHtmlRequest = (req.headers.get("accept") || "").includes(
      "text/html",
    );
    if (!isHtmlRequest) return fetch(req);

    const headers = new Headers();
    headers.set("x-lovablehtml-api-key", env.LOVABLEHTML_API_KEY);
    headers.set("accept", "text/html");

    const forward = [
      "accept-language",
      "sec-fetch-mode",
      "sec-fetch-site",
      "sec-fetch-dest",
      "sec-fetch-user",
      "upgrade-insecure-requests",
      "referer",
      "user-agent",
    ];
    for (const name of forward) {
      const v = req.headers.get(name);
      if (v) headers.set(name, v);
    }

    const r = await fetch(
      "https://lovablehtml.com/api/prerender/render?url=" +
        encodeURIComponent(req.url),
      { headers },
    );

    // 304 = not pre-rendered, pass through to origin
    if (r.status === 304) {
      return fetch(req);
    }

    if ((r.headers.get("content-type") || "").includes("text/html")) {
      return new Response(await r.text(), {
        headers: { "content-type": "text/html; charset=utf-8" },
      });
    }

    return fetch(req);
  },
};
```
- Save and Deploy.
Step 2: Add your API key
Don't hardcode keys.
- Worker -> Settings -> Variables and Secrets
- Add -> Name: `LOVABLEHTML_API_KEY`, Value: your key from the dashboard
- Save
Step 3: Attach it to your domain
- Worker settings -> Domains & Routes
- Add -> `yourdomain.com/*`
- Serving on `www` too? Add `www.yourdomain.com/*`
- Save
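If you manage the Worker with Wrangler instead of the dashboard, the same routes can be declared in your project config. This is a sketch only — the `name`, `main`, and `compatibility_date` values here are placeholders for whatever your project actually uses:

```toml
# wrangler.toml — placeholder values; adjust to your project
name = "lovablehtml-prerender"
main = "lovablehtml-prerender.js"
compatibility_date = "2024-01-01"

# Route all traffic for the zone (and www) through the Worker
routes = [
  { pattern = "yourdomain.com/*", zone_name = "yourdomain.com" },
  { pattern = "www.yourdomain.com/*", zone_name = "yourdomain.com" },
]
```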
Done. All traffic now goes through the Worker.
Step 4: Test it
Three checks to make sure everything works.
1. Hit the LovableHTML API directly
```bash
curl -sS -D - -o /dev/null \
  -H "x-lovablehtml-api-key: YOUR_API_KEY" \
  -H "Accept: text/html" \
  "https://lovablehtml.com/api/prerender/render?url=https%3A%2F%2Fyour-domain.com%2Fyour-page"
```
You want `HTTP/1.1 200` and an `x-lovablehtml-render-cache: hit | miss` header: `hit` means the snapshot was served from cache, `miss` means it was freshly rendered.
2. Hit your site as Googlebot
```bash
curl -sS -D - -o /dev/null \
  -H "Accept: text/html" \
  -A "Googlebot" \
  "https://your-domain.com/your-page"
```
Should return `200` with `content-type: text/html`. Check the body. You should see your actual `<h1>`, meta tags, page content. Not an empty shell.
3. Confirm normal users aren't affected
```bash
curl -sS -D - -o /dev/null \
  -A "Mozilla/5.0" \
  "https://your-domain.com/your-page"
```
With no `Accept: text/html` header (plain curl defaults to `Accept: */*`), the Worker skips LovableHTML and passes the request straight through to your SPA. Regular users never notice anything.
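That gate comes down to two header checks. Here's the same decision pulled out as a pure function — the Worker in Step 1 does this inline, this is just a sketch for reasoning about it:

```javascript
// Sketch of the Worker's gating logic: only GET requests that
// explicitly accept HTML are candidates for pre-rendering.
function shouldPrerender(method, acceptHeader) {
  // Only GET navigations qualify; POSTs, PUTs, etc. pass through.
  if (method !== "GET") return false;
  // Page navigations (browsers and crawlers) send Accept: text/html;
  // asset and API requests typically don't.
  return (acceptHeader || "").includes("text/html");
}

console.log(shouldPrerender("GET", "text/html,application/xhtml+xml")); // true
console.log(shouldPrerender("GET", "*/*"));          // false — curl default
console.log(shouldPrerender("POST", "text/html"));   // false — not a navigation
```

Anything that fails the gate — asset fetches, API calls, form posts — goes straight to the origin without ever touching the prerender API.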
You can also paste your URL into the LovableHTML crawler simulator to see what bots actually get.
What you get
Speed
Median response time under 500ms. Fastest pre-rendering in the industry. Cached pages come back in ~23ms from 300+ edge locations. No cold starts, no session queuing.
Bot detection
30+ crawler user agents handled automatically. Googlebot, Bingbot, DuckDuckBot, OAI-SearchBot (ChatGPT), ClaudeBot, PerplexityBot, plus social crawlers from X, LinkedIn, Facebook, WhatsApp. You don't maintain regex lists.
Sitemap monitoring
LovableHTML watches your sitemap and re-renders pages when content changes. New pages get picked up automatically.
Cache invalidation
Push a deploy? Bust the cache and prewarm it so the next bot request is instant.
Single page:
```bash
curl -X POST \
  -H "content-type: application/json" \
  -H "x-lovablehtml-api-key: YOUR_API_KEY" \
  -d '{"domain":"yourdomain.com","path":"/pricing","prewarm":true}' \
  https://lovablehtml.com/api/prerender/cache/invalidate-page-cache
```
Multiple paths:
```bash
curl -X POST \
  -H "content-type: application/json" \
  -H "x-lovablehtml-api-key: YOUR_API_KEY" \
  -d '{"domain":"yourdomain.com","paths":["/","/pricing","/blog/new-post"],"prewarm":true}' \
  https://lovablehtml.com/api/prerender/cache/invalidate-paths-cache
```
Entire site:
```bash
curl -X POST \
  -H "content-type: application/json" \
  -H "x-lovablehtml-api-key: YOUR_API_KEY" \
  -d '{"domain":"yourdomain.com"}' \
  https://lovablehtml.com/api/prerender/cache/invalidate-site-cache
```
Setting `"prewarm": true` re-renders the page right after purging so the cache is hot before the next crawler shows up.
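In CI this is easy to script. A minimal sketch of a post-deploy hook, assuming Node 18+ for the built-in `fetch` — the endpoint URLs and payload shapes come from the three curl calls above, but `buildInvalidation` and `invalidateAfterDeploy` are my own naming, not part of the API:

```javascript
// Pick the narrowest invalidation endpoint for a given change set.
function buildInvalidation(domain, paths) {
  const base = "https://lovablehtml.com/api/prerender/cache";
  if (!paths || paths.length === 0) {
    // No paths given: purge the whole site.
    return { url: `${base}/invalidate-site-cache`, body: { domain } };
  }
  if (paths.length === 1) {
    // One path: purge and prewarm a single page.
    return {
      url: `${base}/invalidate-page-cache`,
      body: { domain, path: paths[0], prewarm: true },
    };
  }
  // Several paths: purge and prewarm them in one call.
  return {
    url: `${base}/invalidate-paths-cache`,
    body: { domain, paths, prewarm: true },
  };
}

// Call this from your deploy pipeline after the new build goes live.
async function invalidateAfterDeploy(domain, paths, apiKey) {
  const { url, body } = buildInvalidation(domain, paths);
  const res = await fetch(url, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-lovablehtml-api-key": apiKey,
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Cache invalidation failed: ${res.status}`);
}
```

Hook it into whatever runs after your deploy, e.g. `invalidateAfterDeploy("yourdomain.com", ["/", "/pricing"], process.env.LOVABLEHTML_API_KEY)`.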
DNS gotcha for Lovable users
If you're pointing a custom domain to your Lovable app through Cloudflare, use CNAME records. Not A records. This trips people up constantly.
| Type | Name | Value |
|---|---|---|
| CNAME | @ | your-app.lovable.app |
| CNAME | www | your-app.lovable.app |
Both need to be Proxied (orange cloud on) so traffic routes through Cloudflare and your Worker actually runs.
Why not just use Cloudflare's Browser Rendering?
It's the first thing people try. Spin up headless Chromium inside a Worker, render the page yourself, cache it. Works in a demo. Falls apart in production.
The limits are real
Cloudflare's Browser Rendering caps you at 2 browsers per minute and 2 concurrent sessions on free. Paid plans get you 30 browser instances and 30 sessions per minute. That sounds okay until Google decides to crawl 50 pages at once, which it does regularly. Bots that get timeouts don't come back.
For any site beyond a weekend project, these limits aren't workable.
LovableHTML has no caps
Median response time under 500ms. No browser instance limits. No session throttling. No cold starts. 10 bot requests or 10,000 in a minute, every single one gets served.
The maintenance tax
Even if you hack around Cloudflare's limits, you still need to:
- Keep a bot user-agent list current (AI crawlers are multiplying monthly)
- Build cache invalidation for dynamic content
- Monitor for silent rendering failures
- Handle social preview crawlers separately from search bots
- Pay for Cloudflare Browser Rendering usage on top of your Workers plan
At $19/mo for 10,000 renders, the API costs less than the Browser Rendering fees alone. And you're not spending engineering hours maintaining it.
Best practices
- Keep secrets out of git. Use Cloudflare's secret manager for `LOVABLEHTML_API_KEY`. Never hardcode it.
- Don't proxy static assets. The `Accept: text/html` check in the Worker handles this. JS, CSS, images, fonts pass straight through.
- Handle 304 properly. `304` means pre-rendering doesn't apply. Always fall back to your origin.
- Invalidate after deploys. Use the cache invalidation endpoints with `prewarm: true` so the cache is hot before bots show up.
That's it
Pre-rendering is the highest-leverage SEO fix for any SPA. 15 minutes with a Cloudflare Worker and the LovableHTML API gets you from invisible to fully indexable. No code changes to your app. No Next.js migration. No headless Chrome to babysit.