2026.03.30
How We Built goksen.ing with an AI Company
I didn’t write most of this code.
That sentence is either exciting or alarming depending on your priors. I want to write the honest version of this story — not the “AI did everything and it was magic” version, and not the “AI is useless hype” version either. The real version, which is messier and more interesting than both.
This site started as an experiment: hand a vision to an AI company and see what comes back. It ended as something else — a collaboration between me and Claude Code, refining every pixel and interaction until the site felt like mine.
Here’s how both chapters went.
Chapter 1: The AI Company
The first version of goksen.ing was built by Paperclip — a platform that runs AI agents as a coordinated company. A CEO agent plans and delegates. An Engineer writes code. A Designer proposes creative direction. A Security Officer audits before anything ships. Each agent operates in heartbeats — short execution windows where it wakes up, checks its task queue, does work, posts updates, and exits. They’re more like async workers with task awareness than an always-on assistant.
I gave the CEO one brief: build a brutalist personal website with a terminal aesthetic. Next.js 16, Payload CMS, Tailwind v4, Supabase, Vercel. I had a design system repo with 49 components, OKLCH color tokens, and JetBrains Mono everywhere. No plan, no architecture, no content — just a vision.
The CEO’s first move was to create a planning document. Before any code was written, it posted 9 blocking questions, covering things like design system distribution strategy, database pooler config, Spotify OAuth credentials, DNS status, and privacy requirements. It wouldn’t start until I answered each one. That discipline, asking rather than guessing, was the first surprise.
Security Before a Single Line Shipped
The Security Officer ran its review before the Engineer wrote any application code. This ordering was intentional.
It found two criticals immediately. First: PAYLOAD_SECRET had an empty-string fallback — meaning if the env var wasn’t set, sessions could be forged with a known secret. Second: GraphQL Playground was exposed in production, allowing full schema introspection. Both were fixed before any feature work started.
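The first finding is a whole class of bug worth knowing. A sketch of the pattern and its fix, assuming a typical env-var setup (the function name here is illustrative, not Payload's API):

```typescript
// The vulnerable pattern: a fallback means a missing env var silently
// yields a *known* secret, so anyone can forge signed sessions.
const badSecret = process.env.PAYLOAD_SECRET || "";

// The safer pattern: fail fast at startup instead of degrading silently.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage at boot (throws immediately if PAYLOAD_SECRET is unset):
// const payloadSecret = requireEnv("PAYLOAD_SECRET");
```

Crashing at startup is the right trade here: a process that refuses to boot is loud, while a forgeable session secret is invisible until it's exploited.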
Then came CORS/CSRF allowlists, GraphQL rate limiting with query complexity limits, and HTTP security headers. All shipped before Phase 1. The Security Officer’s job was to find problems and create work — it didn’t implement the fixes itself. Those went back to the Engineer. That separation of concerns is one of the more underrated aspects of the multi-agent setup.
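For context, HTTP security headers in Next.js live in the config file. A hedged sketch of what that looks like; the specific headers and values below are illustrative, not necessarily the ones the Security Officer specified:

```typescript
// Hypothetical next.config fragment: applying security headers to
// every route via Next.js's async headers() hook.
const securityHeaders = [
  { key: "X-Content-Type-Options", value: "nosniff" },
  { key: "X-Frame-Options", value: "DENY" },
  { key: "Referrer-Policy", value: "strict-origin-when-cross-origin" },
  {
    key: "Strict-Transport-Security",
    value: "max-age=63072000; includeSubDomains",
  },
];

const nextConfig = {
  async headers() {
    // Match all routes; Next.js merges these into every response.
    return [{ source: "/:path*", headers: securityHeaders }];
  },
};
```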
The Foundation: Phases 0 Through 3
Phase 0 gave us infrastructure: design system integration via GitHub Packages, media storage with Cloudflare R2, and the Payload CMS collections — Posts, Pages, Media, Photos, Gear, Tags, IntegrationTokens, and VisitorLogs.
Phase 1 delivered content pages: a blog system with Lexical rich text rendering, photo gallery with masonry grid, gear showcase, about page. Phase 2 brought the integrations: Spotify OAuth for playlists and now-playing, GitHub GraphQL for contribution graphs and pinned repos, Hevy for workout history, Cosmos.so with a circuit breaker pattern backed by Redis.
Phase 3 added polish: a split-screen landing page with boot sequence, CRT/fade mode transitions, a full command palette with fuzzy search, SEO with structured data and OG images, and 141 unit tests.
The agents shipped approximately 70 commits across all four phases.
The Designer’s Best Idea
The Designer proposed a concept called “The Machine Remembers Everything.” Both modes of the site — engineer and nerd — were framed as different ways of accessing the same memory. Engineer mode was a terminal querying a filesystem. Nerd mode was a mood board, a playlist, a gallery wall. Same data, different emotional register.
That reframe changed how I thought about the project. A toggle that changes colors is a party trick. A toggle that changes the narrative frame — that’s the product.
Chapter 2: Enter Claude Code
Paperclip built the architecture and shipped working features. But a working site isn’t a finished site. The difference between “this functions” and “this feels like me” required a different kind of collaboration — faster iterations, real-time feedback, the ability to say “no, move that widget left” and see it happen in seconds.
That’s where Claude Code came in. Not as a replacement for the AI company, but as the next chapter.
Translating Figma to Production
The first Claude Code sessions were about translating Figma designs into production code. We implemented shared UI components, rebuilt the Landing, About, and Blog pages from Figma mockups, redesigned the Gallery grid with detail pages, and created the dashboard widget grid system.
This was fundamentally different from the Paperclip workflow. With Paperclip, I described what I wanted and checked the output hours later. With Claude Code, I was pair programming — directing changes in real time, seeing results immediately, iterating on details that would have been painful to communicate as ticket descriptions.
The Dashboard
The homepage dashboard became the centerpiece of the site. It’s a responsive 3-column grid of widgets, each pulling real-time data from different sources:
IdPanel — profile info, navigation links, and the active gear loadout pulled from the CMS. A server component that queries Payload for gear items marked as favorites.
Camera Widget — the latest photos from the Media collection, displayed in a horizontal scroll with a phosphor duotone filter and scanline overlay. Each image gets a label (IMG_001, IMG_002) and reveals grayscale on hover.
GitHub Widget — contribution calendar rendered as ASCII art using block characters (░▒▓█), pinned repos, and stats for commits, stars, and followers. Powered by GitHub’s GraphQL API.
Gym Widget — Hevy workout data including total workouts, streak, volume sparkline for the last 8 weeks, a radar chart for muscle group distribution, and top exercises. All rendered in terminal command styling.
Playlists Widget — Spotify playlists with album art behind phosphor filters, now-playing integration, and series numbering extracted from playlist names. Hover reveals a play button.
Blog Widget — latest posts with reading time, tags, and an expandable view. Dates formatted as YYYY.MM.DD in the terminal style.
Visitors Widget — a real-time visitor log table with geo data: time, city, country, and page visited. Built from a Payload collection that tracks visitors.
Cosmos Widget — visual bookmarks from Cosmos.so, fetched via GraphQL with a circuit breaker that falls back to cached data after 3 failures.
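The Cosmos widget's circuit breaker deserves a sketch. This is a minimal in-memory version under stated assumptions; the production one persists its state in Redis and the names here are illustrative:

```typescript
// Minimal circuit breaker: after `threshold` consecutive failures the
// circuit opens and we serve the last known-good data without even
// attempting the upstream call.
type Fetcher<T> = () => Promise<T>;

class CircuitBreaker<T> {
  private failures = 0;

  constructor(
    private readonly threshold: number,
    private fallback: T, // last known-good payload, e.g. cached bookmarks
  ) {}

  async call(fetcher: Fetcher<T>): Promise<T> {
    if (this.failures >= this.threshold) {
      return this.fallback; // circuit open: skip the flaky upstream
    }
    try {
      const result = await fetcher();
      this.failures = 0; // success closes the circuit
      this.fallback = result; // refresh the cached fallback
      return result;
    } catch {
      this.failures += 1;
      return this.fallback; // degrade gracefully on this attempt too
    }
  }
}
```

The point of the pattern is that a dead third-party API costs you one cheap in-memory check per request instead of a timeout per request.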
Each widget is wrapped in a TerminalReveal component — a typing animation that simulates running a terminal command before the content appears. The effect is that loading the homepage feels like booting a system, with each section coming online one at a time.
The Boot Sequence
One of the most satisfying sessions was building the GSAP-powered boot sequence. When you first visit the site, you see a CRT warmup animation: a 3.6-second timeline with 11 phases.
CRT flicker, init line typing, status blocks, corruption waves, a WebGL glitch canvas with horizontal tear bands and RGB channel splitting, a progress bar that fills across the screen, and a final flash. The glitch characters are drawn from Unicode box-drawing characters: ░▒▓█▄▀│┤╡╢╣║╗╝╜╛┐.
It’s skippable. It only plays on first visit (stored in sessionStorage). It respects prefers-reduced-motion. But if you let it play, it sets the tone for everything that follows.
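The animation itself is a GSAP timeline, but the gating logic is simple enough to sketch. A hypothetical version of the play/skip decision (function and key names are mine, not the site's):

```typescript
// Decide whether to play the boot sequence at all.
// Plays at most once per session and never for reduced-motion users.
interface BootStorage {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function shouldPlayBootSequence(
  storage: BootStorage,
  prefersReducedMotion: boolean,
): boolean {
  if (prefersReducedMotion) return false; // respect the media query
  if (storage.getItem("bootPlayed")) return false; // already seen it
  storage.setItem("bootPlayed", "1"); // mark this session
  return true;
}

// In the browser this would be called roughly as:
// shouldPlayBootSequence(
//   sessionStorage,
//   window.matchMedia("(prefers-reduced-motion: reduce)").matches,
// );
```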
The Iteration Loop
The most productive Claude Code sessions weren’t about building new features — they were about refinement. In one session we’d:
— Swap the positions of the visitor log and camera widgets because the visual weight was off
— Widen the first column by 50% to give the IdPanel more breathing room
— Increase the visitor log from 5 to 15 results
— Fix media upload size limits for DSCF camera files (bumped to 25MB in the Payload body parser)
— Debug a Cloudflare R2 bucket that didn’t exist yet, then configure the right bucket name
— Apply the phosphor duotone filter to gallery widget images so they match the Spotify artwork treatment
These are the kinds of changes that make a site feel cohesive rather than assembled. Each one is small. Together they’re the difference between “a developer built this” and “someone who cares built this.”
Killing Nerd Mode
One significant decision was stripping the dual-persona concept entirely. The Paperclip Designer had proposed it, and the Engineer built both rendering paths. Engineer mode was green terminal. Nerd mode was a warm editorial layout.
After living with it, I realized the terminal aesthetic was the identity. The nerd mode diluted it. Claude Code helped strip the dual-mode code cleanly — removing the toggle logic, the duplicate component paths, and the nerd-mode CSS. What remained was pure: phosphor green on black, no rounded corners, monospace everything, CRT scanlines everywhere.
The Stack
Framework: Next.js 16 with App Router and React Server Components. The homepage runs 6 parallel server-side fetches before rendering.
CMS: Payload CMS v3, embedded in the same Next.js app. Lexical editor for rich text. 8 collections covering posts, media, photos, gear, pages, tags, visitor logs, and OAuth tokens.
Database: Supabase PostgreSQL with connection pooling on port 6543.
Storage: Cloudflare R2 via @payloadcms/storage-s3. Zero-egress media storage for photos up to 25MB.
Animations: GSAP 3.14 with ScrollTrigger and @gsap/react. Custom WebGL shaders for the boot sequence glitch canvas. CSS keyframe animations for page transitions.
Integrations: Spotify OAuth (playlists + now-playing), GitHub GraphQL (contributions + repos), Hevy REST API (workouts + stats), Cosmos.so GraphQL (bookmarks) with Redis-backed circuit breaker.
Design: OKLCH perceptual color system, JetBrains Mono as the sole typeface, zero border-radius globally, phosphor duotone image filters, CRT scanline and vignette overlays.
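The homepage's six parallel fetches mentioned above follow a pattern worth sketching. A minimal version, assuming `Promise.allSettled`-style fault isolation (the loader names are hypothetical); one slow or broken source degrades its widget rather than blocking the page:

```typescript
// Run every widget's data loader concurrently; a rejected loader
// yields null for that widget instead of failing the whole render.
type WidgetData = Record<string, unknown> | null;

async function loadDashboard(
  loaders: Record<string, () => Promise<WidgetData>>,
): Promise<Record<string, WidgetData>> {
  const names = Object.keys(loaders);
  const settled = await Promise.allSettled(names.map((n) => loaders[n]()));
  return Object.fromEntries(
    names.map((n, i) => {
      const result = settled[i];
      return [n, result.status === "fulfilled" ? result.value : null];
    }),
  );
}
```

In a React Server Component this runs once on the server before streaming HTML, so the total wait is the slowest source, not the sum of all of them.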
What I Actually Learned
AI agents are good at architecture, bad at taste. Paperclip produced a coherent phase plan, clean security audit, and solid infrastructure. But taste — the feeling that a widget should be 20px wider, that this image needs a green tint, that the boot sequence should take exactly 3.6 seconds — that requires human judgment delivered at the speed of conversation.
The best workflow is layered. Agents for the initial architecture and heavy lifting. Real-time pair programming for refinement. Neither replaces the other. The Paperclip agents would have been slow at iterating on widget spacing. Claude Code would have been overkill for scaffolding 8 CMS collections from scratch.
Stripping features takes more courage than adding them. Killing nerd mode was the right call. It was technically impressive and conceptually interesting. It also made the site worse. The AI company built it because the brief asked for it. I killed it because I lived with it.
The bugs were the same bugs. When we deployed to Vercel preview, 7 of 12 routes were broken. CMS pages crashed from missing error boundaries. The Hevy API had a max page size of 10 but our code requested 20. A stale TypeScript file was shadowing the correct importMap. These are the same bugs you’d hit shipping any real project. AI doesn’t eliminate bugs — it compresses the timeline for finding and fixing them.
The Machine Remembers Everything
The site you’re reading this on was architected by an AI company, built across 70+ commits by AI agents, then refined through dozens of real-time Claude Code sessions. The decisions about what to build, what to cut, and what to care about were made by a human sitting in a terminal.
The machine remembers everything. You still have to tell it what matters.