ai-agent-dev-workflow

Feature Specification Process

Create a specification for the following: $ARGUMENTS

Specifications clarify WHAT to build before implementation begins. Instead of discovering ambiguity mid-implementation (expensive), surface it upfront through structured exploration and clarification. This is the highest-leverage pattern for complex features.

Mapping to Claude Code Workflow

| Claude Code Phase | Spec Phase | What Happens |
|---|---|---|
| Explore | Phase 1: Explore | Read files, understand context, check PROJECT.md |
| Plan | Phase 2: Specify | User stories, acceptance criteria, edge cases |
| Plan | Phase 3: Clarify | Interactive disambiguation (max 5 questions) |
| Plan | Phase 4: Validate | 8-point spec quality checklist |
| Plan | Phase 5: Handoff | Summary report, suggest /tdd |

When to Use This Process

| Scope | Approach |
|---|---|
| Trivial (typo, config, rename) | Don’t use this skill – just do it directly |
| Small (well-understood, single concern) | Quick spec shortcut (skip clarification) |
| Medium (multi-concern, some ambiguity) | Full process |
| Large (cross-cutting, unfamiliar domain) | Full process, thorough clarification |

Quick spec shortcut: For small well-understood changes with no domain ambiguity, collapse to: Explore -> Specify -> Validate -> Handoff. Skip clarification entirely. Use when the feature can be described in one sentence and has no competing interpretations.

Scope negotiation: If the request contains 3+ distinct features, recommend splitting into separate specs before proceeding. Each spec should cover one coherent feature with independent user value.

Product vs feature: If the request describes an entire product or system (multiple independent capabilities, separate components, its own installation story), treat it as a product-level spec:

If a product spec exceeds 12 stories, consider splitting into a product spec (overview) + separate feature specs (details).

Product specs with interacting capabilities need explicit interface contracts — file paths, data formats, handoff expectations. Document these in Assumptions or a dedicated section. Without them, components may diverge on implicit conventions that break silently at integration time.
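An interface contract entry might look like this (the paths, formats, and component names are illustrative, not prescribed):

```markdown
### Interface Contracts

- Exporter writes `data/events.json`: UTF-8, one JSON object per event
- Importer reads `data/events.json` and must tolerate unknown fields
- Handoff: export completes before import starts; no concurrent writes
```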

Supporting Files

Load these on demand, not all upfront:

| File | When to Load |
|---|---|
| spec-template.md | Phase 2, when generating the spec |
| clarification-taxonomy.md | Phase 2 (scanning) and Phase 3 (questions) |
| validation-checklist.md | Phase 4 (validation) |

Process Overview

Phase 1: Explore
Phase 2: Specify
Phase 3: Clarify (interactive, skippable)
Phase 4: Validate
Phase 5: Handoff

Each phase must complete before the next begins. Phase 3 is skipped if no ambiguities are found.


Phase 1: Explore

1.1 Understand the Request

1.2 Explore Current Code

Greenfield projects: If the codebase doesn’t exist yet, explore the inputs that define WHAT to build: analysis docs, design docs, reference materials, domain knowledge files, and similar projects. The goal is the same — understand context before specifying — but the context lives in documents rather than code.

Context tip: Steps 1.2 and 1.3 are independent — run them in parallel (e.g., use subagents) when the codebase is large or unfamiliar.

1.3 Read Project Context

1.4 Identify Affected Components

1.5 Check for Existing Spec


Phase 2: Specify

2.1 Load Spec Template

Read spec-template.md for the output format. Use its section structure and frontmatter schema.

2.2 Generate User Stories

For each distinct user-facing behavior:

2.3 Write Acceptance Criteria

For each user story, write testable acceptance criteria:

2.4 Identify Edge Cases

Load clarification-taxonomy.md. Scan all 9 categories against the feature:

2.5 Document Assumptions

Record reasonable defaults for unspecified details. See spec-template.md §Assumptions for guidelines. Use the taxonomy’s “Reasonable defaults” as a guide.
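An Assumptions entry might look like this (the section name and defaults shown are illustrative):

```markdown
## Assumptions

- Search matching is case-insensitive (taxonomy default for text matching)
- Deleted items are excluded from results (consistent with existing list views)
```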

2.6 Mark Ambiguities

For categories marked Partial or Missing where the ambiguity materially impacts the spec:

2.7 Write Spec File


Phase 3: Clarify

Skip this phase if no [NEEDS CLARIFICATION] markers exist. Report: “No critical ambiguities detected. Skipping clarification.”

3.1 Prioritize Questions

From the [NEEDS CLARIFICATION] markers:

3.2 Interactive Questioning Loop

For each question in the queue:

  1. State context: Quote the relevant spec section
  2. Recommend an answer: Based on codebase patterns, project context, and best practices. Format: **Recommended:** Option X -- <one sentence reasoning>
  3. Present options as a markdown table:

    | Option | Description | Implications |
    |--------|-------------|--------------|
    | A | <description> | <implications> |
    | B | <description> | <implications> |
    | C | <description> | <implications> |
  4. Accept response: User replies with:
    • Option letter (e.g., “A”)
    • “yes” or “recommended” to accept the recommendation
    • A custom short answer
  5. Update the spec immediately after each answer:
    • Replace the [NEEDS CLARIFICATION] marker with the answer
    • Add to Clarifications section: - Q: <question> -> A: <answer>
    • Save the file (atomic update)
  6. Handle scope changes: If the user’s answer introduces new scope (new scenarios, new capabilities, new actors), update affected stories and criteria before proceeding. Treat the answer as a mini re-spec of the affected section — don’t just record it and move on
  7. Proceed to next question or stop
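Rendered together, steps 1–3 might look like this (the feature, recommendation, and options are hypothetical):

```markdown
> Spec §Story 2: “Search results are paginated.” Page size is unspecified.

**Recommended:** Option A -- matches the page size used by existing list views.

| Option | Description | Implications |
|--------|-------------|--------------|
| A | 20 results per page | Consistent with existing list views |
| B | 50 results per page | Fewer requests, heavier payloads |
| C | Infinite scroll | New UI pattern, more implementation effort |
```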

3.3 Stop Conditions

Stop asking questions when:

3.4 Full Abort

If the user says “wrong approach”, “start over”, or similar:

3.5 Deferred Markers

If questions remain after early termination, replace unresolved markers with: [DEFERRED: not clarified -- <brief reason>]


Phase 4: Validate

4.1 Load Validation Checklist

Read validation-checklist.md for the 8-point criteria.

4.2 Run 8-Point Check

For each point, verify with specific evidence from the spec:

  1. Testability — Every Given/When/Then maps to a test assertion
  2. Completeness — Every story has criteria; every P1 has error paths
  3. Clarity — No vague adjectives without measurable targets
  4. Scope — Out of Scope section exists and is non-empty
  5. Independence — P1 stories deliver standalone value
  6. Priority — P1/P2/P3 assigned; P1 set forms coherent MVP
  7. Edge Cases — Failure modes identified per P1 story
  8. Resolution — No unresolved [NEEDS CLARIFICATION] markers
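Point 8 is mechanically checkable. A minimal sketch in Python (assuming markers may carry an optional reason after the keyword, as the DEFERRED marker does):

```python
import re

# Matches both bare "[NEEDS CLARIFICATION]" and annotated
# "[NEEDS CLARIFICATION: <reason>]" markers.
MARKER = re.compile(r"\[NEEDS CLARIFICATION[^\]]*\]")

def unresolved_markers(spec_text: str) -> list[str]:
    """Return every unresolved clarification marker left in the spec."""
    return MARKER.findall(spec_text)

spec = """\
## Story 1
Given a query, when the user searches, then matching results appear.
[NEEDS CLARIFICATION: case-insensitive matching?]
"""

print(unresolved_markers(spec))
```

A spec passes point 8 only when this returns an empty list.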

4.3 Project-Specific Validation

Using PROJECT.md (already loaded in Phase 1), run the project-specific checks per validation-checklist.md §Project-Specific Validation.

4.4 Fix Issues

If points fail, fix the spec directly and re-run the checklist. See validation-checklist.md §Handling Failures for the iteration limit and fallback behavior.

4.5 Mark Spec Ready

Update frontmatter status:


Phase 5: Handoff

5.1 Summary Report

Output to the user:

5.2 TDD Integration

Explain how the spec maps to TDD implementation:

| Spec Section | TDD Phase | How to Use |
|---|---|---|
| User Stories | Phase 1: Analysis | Scope of what to explore |
| Acceptance Criteria | Phase 3: Pre-Test | Write these as failing tests |
| Edge Cases | Phase 3: Pre-Test | Additional test cases |
| Affected Components | Phase 2: Planning | Starting point for chunks |
| Assumptions | Phase 1: Analysis | Context for architecture decisions |

5.3 Suggest Next Step

Next: /tdd implement <feature> (spec: docs/specs/<feature>.md)

If the spec has warnings, note them: “The spec has N unresolved warnings — review the Notes section before starting TDD implementation.”

Non-code projects: If the spec covers content, configuration, or documentation (no executable code), adapt the TDD mapping:


Post-Validation Pivot

If the user challenges a fundamental assumption after the spec is marked Ready (e.g., changing the architecture, switching platforms, redefining scope), handle it as a controlled re-spec:

  1. Acknowledge the pivot — don’t defend the existing spec
  2. Assess impact — which stories survive the pivot vs need rewriting?
  3. Bump spec-version and update the frontmatter
  4. Rewrite affected sections — stories, criteria, affected components
  5. Re-run Phase 4 validation on the updated spec
  6. Log the pivot in Clarifications: - Pivot: <old> → <new> (<reason>)

This is cheaper than starting fresh because unaffected stories, edge cases, and clarifications are preserved.


Lessons Learned (Apply Every Time)

  1. Acceptance Criteria That Can’t Fail Aren’t Criteria - If a Given/When/Then always passes regardless of implementation, it tests nothing. “System handles errors” is not testable. “When API returns 500, user sees error banner with retry button” is
  2. Edge Cases Come from the Domain, Not the Feature - The most dangerous edge cases aren’t in the feature itself — they’re in how the feature interacts with existing domain rules. A “delete” feature seems simple until you consider: shared items, items in progress, items with dependencies, undo expectations, sync conflicts
  3. The Spec Is Not the Plan - Specs define WHAT and WHY. Plans define HOW. If you catch yourself writing framework names, file paths, or algorithm choices, stop — those belong in TDD, not here
  4. Three Clarifications Beat Ten Assumptions - A single well-chosen clarification question that resolves a P1 scope ambiguity saves more rework than ten documented assumptions. The max-5 limit forces prioritization — use it on what matters most
  5. “Simple” Features Hide Domain Complexity - “Add search” sounds simple until you consider: fuzzy matching, pagination, permissions filtering, empty states, indexing strategy. The taxonomy exists to surface this complexity before implementation
  6. User Stories Should Survive a Pivot - If changing the tech stack invalidates your user stories, they contain implementation details. “User finds events by keyword” survives a rewrite. “User calls GET /api/search” doesn’t
  7. Don’t Spec What You Can Grep - If the codebase already implements a pattern, reference it instead of re-specifying it. “Follow the same pattern as OrderService” is more accurate than re-describing what OrderService does
  8. Scope Creep Starts in the Spec - Every “while we’re at it” in a spec doubles implementation time. Be aggressive about P2/P3 classification. Ship P1, then spec P2 separately. The Out of Scope section is your best defense
  9. The Best Specs Are Boring - A good spec reads like a checklist, not a narrative. If it’s exciting to read, it probably contains opinions instead of criteria. Save the creativity for the code
  10. Check Quality Standards During Specification, Not Just Validation - Phase 4 catches missing criteria categories (accessibility, offline behavior) but fixing them there costs a validation iteration. Scan PROJECT.md Quality Standards during Phase 2.3 while writing criteria — it’s cheaper to include them upfront than to discover them late

Session Scope

Specs complete in one session. There is no tracker or session resumption mechanism. To revisit a feature’s spec later, re-run /spec — Phase 1.5 detects the existing file and offers update or fresh start.


Commit Guidance

Do not commit proactively. Wait for the user to request it. Refer to PROJECT.md for project-specific commit conventions (author, message format, trailers).