First AI-Assisted Project

This is not a story about asking an AI to build something and watching it appear. It is a story about using AI as a deliberate engineering tool — with planning, constraints, and accountability — on a real production project.

The Project

Live Recorder is a live-TV catchup recording service. It captures live streams as HLS segments, stores them in AWS S3, and generates on-demand clips via FFmpeg. The backend is .NET 10 with ABP Framework. The frontend is React 19 with TanStack Router and shadcn/ui. Two developers. 139 commits.
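The capture-and-clip pipeline described above can be sketched in two FFmpeg invocations. Everything here is illustrative, not the service's actual configuration: the stream URL, segment length, output paths, and timestamps are all assumptions.

```shell
# Illustrative sketch only; URLs, segment length, and offsets are invented.

# 1. Capture the live stream into HLS segments (stream copy, no re-encode).
ffmpeg -i "https://example.com/live/channel-1.m3u8" \
  -c copy -f hls -hls_time 6 -hls_list_size 0 \
  recording/index.m3u8

# 2. Later, cut an on-demand catchup clip from the recorded playlist,
#    again without re-encoding.
ffmpeg -ss 00:12:30 -to 00:14:00 -i recording/index.m3u8 \
  -c copy clip.mp4
```

In the real service the segments land in S3 rather than a local directory, and clip generation is triggered on demand; the commands only show the shape of the FFmpeg work involved.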

The Process

Plan before you prompt

Before a line of code was written, we had a full development plan — 61 stories across five phases, each with an objective, numbered steps, and acceptance criteria. Every story had its own file. Claude Code sessions were fed these files directly.

This was the most important decision we made. AI needs structure to produce consistent output. Without it, every session is a fresh improvisation. With it, AI follows the plan.

Each story followed a fixed template: objective, context, domain rules to enforce, numbered steps, acceptance criteria, and a "do not" list. That last section matters more than it sounds. It is the place where you tell AI what not to invent — what it will likely get wrong if left to its own judgement.
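Under that template, a story file might look something like the following. The section structure is the one described above; the story number, entity names, and rules are an invented example, not content from the actual plan.

```markdown
# Story 3.07: Stop an active recording

## Objective
An operator can stop an in-progress recording; segments already
captured remain available for clipping.

## Context
Builds on Story 3.05 (start recording). Recording state lives on
the RecordingSession entity.

## Domain rules
- A session can only move Active -> Stopped, never back.
- Stopping must not delete segments already uploaded to S3.

## Steps
1. Add a StopAsync method to the recording application service.
2. Validate the session is Active; throw a domain exception otherwise.
3. Signal the capture worker to finish the current segment, then exit.
4. Set StoppedAt and persist.

## Acceptance criteria
- Stopping an Active session marks it Stopped and keeps its segments.
- Stopping a non-Active session returns a domain error.

## Do not
- Do not invent new statuses beyond Active/Stopped/Failed.
- Do not touch segment storage or S3 lifecycle rules.
```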

Give AI a brain for the project

We used a CLAUDE.md file — a document that encodes everything AI needs to stay consistent across sessions: the architecture, the framework conventions, naming rules, patterns to follow, things to never do. Without something like this, every new session risks drifting from decisions made in previous ones.

Think of it as the project constitution. AI reads it at the start of every session. It does not need to be reminded.
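A CLAUDE.md for a project like this might contain entries along these lines. This is an invented excerpt to show the kind of content involved, not the actual file.

```markdown
# CLAUDE.md (illustrative excerpt)

## Architecture
- Backend: .NET 10 / ABP Framework, DDD layering
  (Domain, Application, HttpApi, EntityFrameworkCore).
- Frontend: React 19, TanStack Router, shadcn/ui.

## Conventions
- Application services end in AppService; DTOs end in Dto.
- All timestamps are UTC; never store local time.

## Never
- Never bypass the application layer from controllers.
- Never add a new NuGet/npm dependency without asking first.
```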

Use opinionated frameworks

The backend was built on ABP Framework — a DDD-based .NET framework we had used on multiple previous projects. This turned out to matter a lot. ABP is well-documented and opinionated. AI mimicked it well. There was little room to improvise badly because the framework already had strong answers to most structural questions.

The lesson: opinionated tooling is an advantage in AI-assisted projects, not a constraint.

Start from a good template, not a blank canvas

For the admin UI, the original plan was to have AI generate all management panel components from scratch. It was painful — inconsistent output, repeated corrections, slow progress.

We changed approach. We found a well-organised admin template, asked AI to strip it to the bare minimum, and built everything on top of that foundation. The result was faster and more consistent. AI is better at adapting something good than inventing something from nothing.

Phase-gate your delivery

The project was organised into phases with clear gates. After Phase 2 we added a dedicated hardening phase — not a phase for new features, just quality: fixing race conditions, tightening edge cases, reviewing what AI had generated under pressure. It was not optional.

AI accelerates delivery. It does not remove the need for quality gates.

When the domain changes, change the plan first

Midway through the project, the domain model needed a significant restructure. New entities, renamed fields, removed abstractions. Before touching the code, we wrote a full breaking-changes report and a migration plan for the frontend.

The discipline was: update the documentation first, then execute. If you skip that step with AI-assisted development, you lose the thread quickly.

Own the commit history

Every commit referenced a story number. Every change was traceable back to a plan. AI wrote the code. The engineer owned the history. That distinction matters.

Where AI Fell Short

The FFmpeg recording domain was the failure point.

The planning for that domain was not detailed enough. We wrote stories at a level that worked well for the CRUD and API layers — but the recording engine required a deeper design. AI filled the gaps with its own interpretation and got it wrong. The logic did not match what we needed.

It took multiple iterations to correct. Each iteration required refining the specification further — going back to the plan, adding more detail, then re-running.

The root cause was not AI. It was under-specified planning on a complex domain.

AI fills vague language with its own interpretation. Every place your spec says "handle the recording logic" instead of numbered steps is a place where AI will guess. Sometimes it guesses right. On the recording engine, it did not.
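The difference is concrete. Here is a vague step next to the same step decomposed into numbered sub-steps; both versions are invented for illustration, including the retry counts and state names.

```markdown
<!-- Too vague: AI will guess. -->
4. Handle the recording logic.

<!-- Decomposed: AI executes. -->
4. When a segment download fails:
   4.1 Retry up to 3 times with 2-second backoff.
   4.2 After the third failure, mark the segment Missing and
       continue; do not abort the session.
   4.3 If 5 consecutive segments are Missing, transition the
       session to Failed and emit a RecordingFailed event.
```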

What I Would Do Differently

Go deeper on the task tree. Tasks → subtasks → sub-subtasks, as many levels as the domain complexity demands. The flat story format worked well for straightforward layers. It was not enough for the recording engine.

Also: accept that some domains cannot be fully designed upfront. You discover the shape of the problem as you build. Build more thinking time into the plan for complex domains before writing stories.

The signal that you need more depth: if you cannot describe the expected behaviour of a component in numbered steps without using vague language — go deeper. Flat stories are enough for CRUD and standard framework patterns. They are not enough for anything with concurrency, state machines, or novel business logic.

What AI Actually Changed

Scaffolding is fast. Boilerplate disappears. A project that would have taken months to reach this stage took weeks.

But the planning work is still on the engineer. The architecture decisions are still on the engineer. The quality judgement is still on the engineer. AI executes well when it has clear input. Producing that input is still the hard part.

The constraint was never the code. It was always the thinking before the code. AI has not changed that. It has just made the gap between good thinking and working software much smaller.


Written by me, edited with Claude.