This site — the one you are reading right now — was designed, engineered, and deployed by me. Not outsourced, not templated, not Squarespace. Built from scratch in Next.js, styled with Tailwind CSS v4, deployed on Vercel with a custom domain and a working transactional email pipeline. This post explains the decisions behind it and why I think they matter for my statistical consultancy practice.
Why this is in the garden: Building production software is part of my skill set, not a side note. A statistician who can also ship clean, performant, production-grade web infrastructure is a different kind of hire. This post is evidence of that — not a claim about it.
Stack decisions
Every technology choice was deliberate. The goal was a site that is fast, maintainable, SEO-sound, and capable of growing without technical debt — not something that looks impressive in a screenshot and falls apart under scrutiny.
Next.js 15 with the App Router
Next.js 15 with the App Router gives server-side rendering by default, file-system routing, and clean separation between server and client components. For a content-heavy consultancy site that needs good Core Web Vitals and structured metadata, this is the right foundation. Pages are rendered at build time where possible; dynamic routes (like this garden) are generated statically.
The alternative — a plain React SPA — would have meant a separate solution for SEO, slower initial page loads, and considerably more configuration overhead. Next.js handles that plumbing cleanly.
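For instance, static generation of the garden routes might look like the sketch below. The file path and the `gardenEntries` list are assumptions, not the site's actual code; `generateStaticParams` is the standard App Router API for prerendering dynamic routes at build time.

```typescript
// Sketch of src/app/garden/[slug]/page.tsx (assumed layout).
// Returning the slugs here tells Next.js to prerender each garden
// entry at build time instead of rendering on demand.
type GardenEntry = { slug: string };

// Hypothetical stand-in for the typed content layer.
const gardenEntries: GardenEntry[] = [
  { slug: "building-this-site" },
];

export async function generateStaticParams() {
  return gardenEntries.map((entry) => ({ slug: entry.slug }));
}
```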
TypeScript throughout
Every file is .tsx or .ts. The content data layer — SEO page definitions, publication lists, garden entries — is typed with explicit interfaces. This is not ceremonial. It means a typo in a content object fails at compile time rather than silently producing a broken page in production.
```typescript
// src/lib/content/seo-pages.ts
export type SeoLandingPageContent = {
  slug: string;
  path: string;
  metaTitle: string;
  intro: string;
  focusAreas: Array<{
    title: string;
    description: string;
  }>;
  // ...
};

export const seoPages: SeoLandingPageContent[] = [ ... ];
```

The type system catches structural errors before they reach the browser. That is the same principle behind a good statistical analysis plan — define your schema before you touch the data.
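As an illustration of that compile-time guarantee, here is a self-contained sketch. The type shape is abbreviated from the snippet above; the entry itself is hypothetical.

```typescript
// Minimal reproduction of the typed-content pattern. The entry below
// is a made-up example, not one of the site's real pages.
type SeoLandingPageContent = {
  slug: string;
  path: string;
  metaTitle: string;
  intro: string;
  focusAreas: Array<{ title: string; description: string }>;
};

const page: SeoLandingPageContent = {
  slug: "survival-analysis",
  path: "/services/survival-analysis",
  metaTitle: "Survival Analysis | AKALYSIS",
  intro: "Time-to-event modelling for health research.",
  focusAreas: [
    { title: "Cox models", description: "Hazard-ratio estimation." },
  ],
};

// Misspell a key (e.g. `metatitle`) and `tsc` rejects the object
// with TS2561 before anything reaches production.
export { page };
```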
Tailwind CSS v4
Tailwind v4 removed the JavaScript config file and shifted to a pure CSS-first approach. Utility classes are generated from a single CSS import rather than a compiled config object. For a solo project with a consistent design language, this is considerably cleaner than maintaining a tailwind.config.js with arbitrary values scattered across it.
The design system uses a deliberate palette: slate-950 backgrounds, lime-400 as the primary accent, emerald-400 and teal-400 as secondaries. Those choices are not arbitrary — high-contrast lime on near-black reads well, differentiates clearly from the standard blue-heavy palettes in tech, and holds at small sizes.
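As a sketch of what the CSS-first setup looks like (the file name and the custom token are assumptions, not the site's actual stylesheet; built-in utilities like `lime-400` need no configuration at all):

```css
/* globals.css (assumed entry point, not the site's actual file) */
@import "tailwindcss";

/* Custom design tokens, if any, live in an @theme block instead of
   tailwind.config.js. */
@theme {
  --color-accent: #a3e635; /* hypothetical alias for lime-400 */
}
```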
MDX for the Digital Garden
Long-form garden content is written in MDX — Markdown with embedded React components. The intercensal method post, for instance, uses react-katex to render LaTeX equations inline. That would be impossible in a plain Markdown renderer and clunky with raw HTML. MDX lets the writing stay readable while the output stays precise.
```mdx
## The Core Equation

The intercensal method exploits the relationship:

<BlockMath math="\Pi(x+n, t+n) = \frac{\Pi(x,t) \cdot (1 - {}_nq_x^{HU} - {}_nq_x^{HD}) + (1 - \Pi(x,t)) \cdot {}_nq_x^{UH}}{1 - {}_nq_x}" />

Where <InlineMath math="\Pi(x,t)" /> is the prevalence of healthy
individuals at age x and time t.
```

The contact pipeline
The contact form is a good example of the kind of lightweight backend work this stack makes straightforward. Rather than a mailto: link — which opens a user's email client and loses half the enquiries — the form posts to a Next.js API route that sends a formatted email via Resend.
```typescript
// src/app/api/contact/route.ts
import { Resend } from "resend";
import { NextRequest, NextResponse } from "next/server";

const resend = new Resend(process.env.RESEND_API_KEY);

export async function POST(request: NextRequest) {
  const { name, email, organisation, enquiryType, message } =
    await request.json();

  await resend.emails.send({
    from: "AKALYSIS Enquiries <contact@akalysis.co.uk>",
    to: ["drandrewkingston@gmail.com"],
    replyTo: email,
    subject: `New enquiry from ${name} — ${enquiryType}`,
    html: `...`,
  });

  return NextResponse.json({ success: true });
}
```

The domain verification (DKIM, SPF, DMARC) is handled through Vercel's DNS management, since the domain nameservers point there rather than to Namecheap directly. The API key lives as a Vercel environment variable — never in the repository. Emails arrive formatted with the enquirer's details and a reply-to set to their address, so responding is a single click.
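The client side is not shown in the post; as a minimal sketch, a form component might call the route like this. The payload shape mirrors the destructuring in the handler above, while the function name is hypothetical.

```typescript
// Sketch of the client side (assumed, not the site's actual code).
// The form component posts JSON to /api/contact and reads the
// success flag returned by the route handler.
type ContactPayload = {
  name: string;
  email: string;
  organisation: string;
  enquiryType: string;
  message: string;
};

export async function submitEnquiry(payload: ContactPayload): Promise<boolean> {
  const res = await fetch("/api/contact", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) return false;
  const data: { success?: boolean } = await res.json();
  return data.success === true;
}
```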
Deployment and domain
The site is deployed on Vercel with automatic deployments triggered by pushes to the main branch on GitHub. A commit pushed from a local terminal is live at akalysis.co.uk within roughly 60 seconds, including build time.
```
$ git push origin main
  → GitHub receives push
  → Vercel webhook triggers build
  → Next.js build: ~45s
  → akalysis.co.uk updated ✓
```
The domain is registered with Namecheap. Nameservers point to Vercel, which handles DNS, SSL certificate provisioning, and CDN distribution automatically. There is no server to maintain, no certificate renewal to schedule, no hosting bill beyond the domain registration cost.
SEO and structured data
Every page has explicit metadata generated through a typed utility function, ensuring consistent Open Graph tags, canonical URLs, and page titles across the site. High-value pages carry JSON-LD structured data for the organisation, person, and website entities — giving search engines explicit, machine-readable context rather than relying on inference.
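The JSON-LD side can be sketched like this. The object contents are illustrative, assembled from details stated in this post rather than copied from the site's source.

```typescript
// Build a schema.org Organization object and serialise it; on render
// it would be embedded in a <script type="application/ld+json"> tag.
const organisationJsonLd = {
  "@context": "https://schema.org",
  "@type": "Organization",
  name: "AKALYSIS",
  url: "https://www.akalysis.co.uk",
};

export const jsonLdScript = JSON.stringify(organisationJsonLd);
```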
```typescript
// src/lib/seo.ts
export function createPageMetadata({
  title, description, path, keywords,
}: PageMetadataInput): Metadata {
  return {
    title: `${title} | AKALYSIS`,
    description,
    keywords,
    alternates: { canonical: `https://www.akalysis.co.uk${path}` },
    openGraph: {
      title,
      description,
      url: `https://www.akalysis.co.uk${path}`,
      siteName: "AKALYSIS",
      type: "website",
    },
  };
}
```

Why this matters for a statistician
Data scientists who can only operate within pre-built environments — Jupyter notebooks, RStudio, vendor dashboards — are limited in what they can deliver. The ability to take analytical output and build the infrastructure that surfaces it to decision-makers is a meaningful extension of that role.
This site is a working instance of that. The statistical writing in this garden is served by a production Next.js application with typed data models, a serverless email backend, automated deployment, and DNS-verified transactional email — all built and maintained by the same person doing the analysis.
That is not a coincidence. It reflects how I approach analytical problems generally: end to end, with attention to the infrastructure that makes outputs usable, not just the models that generate them.
Repository
The source code for this site is available on GitHub. It is not a polished open-source framework — it is a working consultancy site — but it is readable, typed, and a reasonable reference for the stack described here.
akalysis/akalysis-hub

Need analytical infrastructure, not just analysis?
If your team needs someone who can build the pipeline, ship the output, and do the statistics — get in touch.
Discuss a Project