Deploying My SvelteKit Blog to Vercel
This site is built with SvelteKit and deployed to Vercel. The setup itself is pretty simple, but there were a couple of decisions — especially around caching — worth walking through.
The Adapter
SvelteKit uses adapters to target different deployment platforms. The default adapter-auto will detect Vercel and do the right thing, but if you want any Vercel-specific options (and you will), it's worth installing the adapter directly:
```sh
bun add -d @sveltejs/adapter-vercel
```

Then wire it up in svelte.config.js:
```js
import adapter from '@sveltejs/adapter-vercel';

export default {
  kit: {
    adapter: adapter()
  }
};
```

That's actually enough to go live. Connect your repo to Vercel, push, done. But there's more fun to be had.
Choosing a Runtime
Vercel defaults to Node for serverless functions, but there's also experimental Bun support. I switched to it for this site:
```js
adapter: adapter({
  runtime: 'experimental_bun1.x'
})
```

Cold starts feel a bit snappier. Since this is a personal blog, "experimental" doesn't bother me. If you've got paying customers depending on this, probably stick with Node until Bun support is more settled — check Vercel's docs for the latest.
Prerendering Blog Posts
My blog posts are markdown files that live in the repo. They don't change between deployments, so there's no point rendering them on every request. SvelteKit makes this easy — just export prerender from the route:
```ts
// src/routes/blog/+page.server.ts
export const prerender = true;

export async function load() {
  const posts = getBlogPostsMetadata();
  return { posts };
}
```

Both the blog listing and individual posts are prerendered at build time. Static HTML, no serverless functions, instant loads.
ISR for the Homepage
The homepage is a different story. It fetches blog posts (fine, those are local) but also pulls my latest videos from the YouTube API. I don't want to hit the YouTube API on every page load, but I also don't want to trigger a full redeploy every time I upload a video.
Enter Incremental Static Regeneration. ISR caches the server-rendered page and regenerates it in the background after a set interval:
```ts
// src/routes/+page.server.ts
import { BYPASS_TOKEN } from '$env/static/private';
import type { Config } from '@sveltejs/adapter-vercel';

export async function load() {
  const [posts, videos] = await Promise.all([
    getBlogPostsMetadata(),
    getLatestVideos(),
  ]);
  return { posts, videos };
}

export const config: Config = {
  isr: {
    expiration: 60 * 60 * 6, // regenerate every 6 hours
    bypassToken: BYPASS_TOKEN,
  },
};
```

With a 6-hour expiration, the page is essentially static for most visitors. Every 6 hours, the next request triggers a background regeneration — that visitor still gets the cached version instantly, and the fresh copy is ready for the next person.
The bypassToken is a secret (32+ characters) stored in your Vercel environment variables. It lets you manually bust the cache with a special header — handy when you want fresh data without waiting for the expiry window.
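Vercel looks for the token in the x-prerender-revalidate header, so a request like the one below forces a fresh render (the domain is a placeholder, and it's worth double-checking the header name against Vercel's ISR docs):

```sh
# Re-render the ISR'd homepage on demand.
# $BYPASS_TOKEN here is the same secret stored in Vercel's environment variables.
curl -I -H "x-prerender-revalidate: $BYPASS_TOKEN" https://my-blog.example.com/
```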
One thing to keep in mind: ISR pages should return the same content for all visitors. If you're pulling session-specific data, you risk caching one user's response and serving it to everyone.
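Query strings are part of the same story: ISR caches by path, and if a route genuinely varies on a query parameter, adapter-vercel's allowQuery option lists the parameters that should produce distinct cached copies. Check the adapter docs for the exact semantics; roughly:

```ts
// Same imports as the homepage example above.
export const config: Config = {
  isr: {
    expiration: 60 * 60 * 6,
    bypassToken: BYPASS_TOKEN,
    // Hypothetical: each ?page= value gets its own cached copy;
    // other query parameters are ignored for caching purposes.
    allowQuery: ['page'],
  },
};
```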
What Else You Can Configure
Any route can export a config object to control how it's deployed. Quick reference:
All functions:
- runtime — 'edge', 'nodejs22.x', or 'experimental_bun1.x'
- regions — which edge regions to deploy to
- split — deploy the route as its own function instead of bundling it

Serverless only:
- memory — RAM in MB (128–3008, defaults to 1024)
- maxDuration — timeout in seconds (depends on your plan)
- isr — the caching config above

Edge only:
- external — dependencies for esbuild to treat as external
Set it in a layout to apply across all child routes, or override it per-page.
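For instance, a hypothetical layout that gives every route in an admin section a bigger, longer-running function might look like this:

```ts
// src/routes/admin/+layout.server.ts (hypothetical path; child routes inherit
// this config unless they export their own)
import type { Config } from '@sveltejs/adapter-vercel';

export const config: Config = {
  runtime: 'nodejs22.x',
  memory: 2048,    // MB, within the 128–3008 range mentioned above
  maxDuration: 30, // seconds; the ceiling depends on your Vercel plan
};
```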
Edge Functions
For lower latency on a specific route, just opt it into edge:
```js
export const config = {
  runtime: 'edge'
};
```

Push your code and Vercel handles the rest. Blog posts are prerendered at build time, the homepage is cached with ISR and refreshes every 6 hours, and the whole thing runs on Bun. Not bad for a config file that's 8 lines long.
Happy coding! 👋