Kept the site static but made one component server-rendered using Astro Server Islands. The homepage activity chart now pulls live GitHub data on each request, cached at the edge for 24 hours.
Fresh Data on a Static Site
The activity section on this site’s homepage shows a live chart of my GitHub commits. Until recently, that data came from a JSON file generated by a script I ran manually, committed, and deployed. It worked, but the data was always stale by however long it had been since I last remembered to run the sync.
I wanted the chart to show current data without making the entire site server-rendered, setting up cron jobs, or wiring GitHub Actions to push commits on a schedule.
The Options I Considered
GitHub Actions + scheduled commits. A workflow runs the sync script daily and commits the updated JSON. The site rebuilds on push. This works but adds noise to the git history and burns build minutes for a single JSON file.
Cloudflare Pages Functions + KV. A serverless function fetches GitHub data and caches it in KV storage. The homepage calls that function at the edge. Clean, but now I’m managing a function, a KV namespace, and the binding between them for what’s really just a data refresh.
R2 + GitHub Actions. Similar to the KV approach but writes to object storage. Same coordination overhead.
Astro Server Islands. Mark one component as server-rendered while the rest of the page stays static. The component fetches its own data at request time. Astro handles the plumbing.
Server Islands won because there’s almost nothing to set up. No extra infrastructure, no scheduled jobs, no new services to monitor. One component opts out of prerendering and that’s it.
How Server Islands Work
Astro 5 merged hybrid mode into static. You don’t need to change output in your config. Any component can opt out of prerendering individually.
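One thing that does need to be in place is an adapter, since the deferred component has to render somewhere at request time. A sketch of the config, assuming the Cloudflare adapter that this site's host implies:

```javascript
// astro.config.mjs — output stays static by default in Astro 5;
// the adapter handles on-demand rendering for any component
// that sets prerender = false.
import { defineConfig } from 'astro/config';
import cloudflare from '@astrojs/cloudflare';

export default defineConfig({
  adapter: cloudflare(),
});
```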
The parent page uses server:defer to mark the component:
```astro
<ActivityPreview server:defer>
  <div slot="fallback" class="activity-preview-skeleton" />
</ActivityPreview>
```

The static HTML ships with the fallback slot content (a skeleton placeholder). When the page loads, Astro’s client runtime fetches the rendered component from a server endpoint and swaps it in. The user sees the skeleton for a moment, then the real chart.
Inside the component, export const prerender = false opts it out of static generation:
```astro
---
export const prerender = false;

const token = import.meta.env.GITHUB_TOKEN;

let data;
try {
  data = token ? await fetchActivityData(token) : getActivityData();
} catch {
  data = getActivityData();
}
---
```

If there’s no token (local dev, or if the secret isn’t set), it falls back to the build-time JSON file. Same component, same rendering, different data source.
Edge Caching
Fetching from GitHub’s GraphQL API on every page load would be slow and wasteful. The component sets a CDN-Cache-Control header so Cloudflare’s edge caches the response:
```astro
Astro.response.headers.set(
  'CDN-Cache-Control',
  'public, max-age=86400, stale-while-revalidate=604800'
);
```

The first visitor of the day hits the origin, which calls GitHub’s API. Everyone else gets the cached response from the nearest Cloudflare edge node. `stale-while-revalidate` means that even after the cache expires, visitors still get the stale version instantly while the edge fetches a fresh copy in the background.
The result: data refreshes daily, latency is edge-speed for almost all visitors, and GitHub’s API gets at most a handful of calls per day.
The Fetch Layer
The existing sync script used `gh api graphql` (the GitHub CLI). That doesn’t work in a serverless runtime, so I wrote a `fetchActivityData()` function that calls GitHub’s GraphQL API directly with `fetch()` and a personal access token.
GitHub’s contributions API only returns one year of data per query. The function builds time windows from a start date to today, fires all of them in parallel with Promise.all, then merges the results. It applies the same repo config (aliases, visibility, minimum commit thresholds) that the CLI script used, so the output shape is identical.
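The two helpers this relies on can be sketched roughly like the following. The names (`buildYearWindows`, `graphql`) and shapes are assumptions, not the post’s actual code; GitHub’s GraphQL endpoint URL is real.

```typescript
// Split [start, now] into consecutive windows of at most one year each,
// since GitHub's contributions API returns at most a year per query.
interface TimeWindow {
  from: string;
  to: string;
}

function buildYearWindows(start: Date, now: Date = new Date()): TimeWindow[] {
  const windows: TimeWindow[] = [];
  let from = new Date(start);
  while (from < now) {
    const to = new Date(from);
    to.setFullYear(to.getFullYear() + 1);
    windows.push({
      from: from.toISOString(),
      // Clamp the final window so it never extends past "now".
      to: (to < now ? to : now).toISOString(),
    });
    from = to;
  }
  return windows;
}

// A minimal GraphQL client over fetch(); error handling is a sketch.
async function graphql<T>(
  token: string,
  query: string,
  variables: Record<string, unknown> = {}
): Promise<T> {
  const res = await fetch('https://api.github.com/graphql', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ query, variables }),
  });
  if (!res.ok) throw new Error(`GitHub GraphQL request failed: ${res.status}`);
  const json = await res.json();
  if (json.errors?.length) throw new Error(json.errors[0].message);
  return json.data as T;
}
```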
```typescript
const [repoCountsResult, ...contributionResults] = await Promise.all([
  graphql(token, REPO_COUNTS_QUERY),
  ...windows.map((w) =>
    graphql(token, CONTRIBUTIONS_QUERY, { from: w.from, to: w.to })
  ),
]);
```

What Changed
Three files:
- `ActivityPreview.astro` stopped receiving data as props. It fetches its own data now, with `export const prerender = false`.
- `index.astro` swapped `<ActivityPreview data={activityData} />` for `<ActivityPreview server:defer>` with a skeleton fallback.
- `fetch-github-activity.ts` is new: a server-side GitHub GraphQL fetcher that returns the same `ActivityData` type as the JSON file.
The rest of the page (the chart components, the animation, the tooltip) stayed exactly the same. They don’t care where the data comes from.
Tradeoffs
The skeleton fallback is visible for a moment on first load. For a homepage activity chart that’s below the fold, this is fine. I wouldn’t use this pattern for above-the-fold content where layout shift matters.
You need a GitHub PAT with read:user scope set as GITHUB_TOKEN in your hosting environment. One more secret to manage, but it’s a single env var in the Cloudflare Pages dashboard.
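For live data in local dev too, the same variable can go in a gitignored `.env` file, which Astro exposes through `import.meta.env`. The value below is a placeholder, not a real token format to copy:

```shell
# .env — gitignored; Astro reads this into import.meta.env.GITHUB_TOKEN
GITHUB_TOKEN=<your-personal-access-token>
```

Without it, the component just falls back to the build-time JSON, so this step is optional.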
The sync script still exists and still works. It’s useful for generating the fallback JSON that the component uses when no token is available. The dedicated /activity dashboard page is fully static, prerendered at build time from that same JSON. So the homepage preview shows live data via the server island, while the full dashboard shows whatever was current at the last deploy.
Server Islands are a good fit when you have a mostly-static site with one section that needs fresh data. You get the performance of static HTML for everything else, server rendering for the part that needs it, and edge caching to keep origin calls low. No infrastructure to manage beyond a single environment variable.