The definition of done checklist your team actually needs

Every Scrum team has a moment where someone marks a story as "done" and someone else asks, "but did you write the tests?" That gap between what one person considers finished and what the team actually needs is exactly what a Definition of Done solves. The 2020 Scrum Guide elevated the DoD from a nice-to-have into a formal commitment tied to the Increment. Yet research from Ron Lichty's Study of Product Team Performance shows that only 45% of teams have a DoD their own team created. The rest either work without one or follow a checklist someone else handed them. The interesting part: only team-created definitions of done correlate with high performance. Externally imposed ones show no correlation at all.

What the definition of done actually is

The DoD is a shared quality standard that applies to every piece of work your team delivers. It's not the same as acceptance criteria.
| | Definition of Done | Acceptance criteria |
| --- | --- | --- |
| Scope | Universal, applies to all work | Specific to one story |
| Focus | Quality and process standards | Functional requirements |
| Who writes it | The whole Scrum Team | Product Owner (with team input) |
| Example | "Code reviewed by at least one dev" | "User can filter by date range" |
Both need to be satisfied before a story is complete. Acceptance criteria tell you what to build. The DoD tells you how well it needs to be built.

A definition of done checklist at three levels

Not every team needs the same DoD. A startup shipping its MVP has different quality needs than a healthcare platform with compliance requirements. Here's a tiered approach.

Starter DoD

For teams just getting started with a formal definition of done:

Code peer-reviewed by at least one other developer

Unit tests written and passing

No new compiler warnings or errors

Acceptance criteria verified

Code merged to main branch

Builds successfully from source control

Intermediate DoD

For teams with established CI/CD and a few sprints under their belt:

Code peer-reviewed by at least one other developer

Unit tests written and passing

Code coverage does not drop below the current threshold (a gate for this is sketched after this list)

Integration tests passing

No critical or high-severity bugs remain

Acceptance criteria verified end-to-end

Deployed to staging environment

Product Owner has reviewed and approved

Technical documentation updated

Accessibility standards met
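
The coverage item above is one of the easiest to automate. Here's a minimal sketch of a gate script that CI could run after the test suite; the baseline file name (coverage_threshold.txt) and the use of coverage.py are assumptions, so substitute whatever your pipeline already produces.

```python
"""Fail the build when coverage drops below the stored baseline.

A hypothetical CI step: assumes coverage.py (>= 7.0 for --format=total) has
already produced its data file, and that coverage_threshold.txt is committed
to the repo so the baseline survives between builds.
"""
import subprocess
import sys
from pathlib import Path

THRESHOLD_FILE = Path("coverage_threshold.txt")  # assumed baseline location


def current_coverage() -> float:
    # "coverage report --format=total" prints only the overall percentage
    result = subprocess.run(
        ["coverage", "report", "--format=total"],
        capture_output=True, text=True, check=True,
    )
    return float(result.stdout.strip())


def main() -> int:
    baseline = float(THRESHOLD_FILE.read_text()) if THRESHOLD_FILE.exists() else 0.0
    measured = current_coverage()
    if measured < baseline:
        print(f"FAIL: coverage {measured:.1f}% is below the {baseline:.1f}% baseline")
        return 1
    # Ratchet the baseline upward so coverage can only stay level or improve
    THRESHOLD_FILE.write_text(f"{measured:.1f}\n")
    print(f"OK: coverage {measured:.1f}% (baseline now {measured:.1f}%)")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Because a passing run rewrites the baseline, the same gate also covers the stricter "maintained or improved" wording in the advanced checklist below.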

Advanced DoD

For mature teams shipping to production every sprint:

Code peer-reviewed by at least one other developer

Unit, integration, and regression tests passing

Code coverage maintained or improved

Security vulnerability scan passed

Performance benchmarks met

Deployed to production behind a feature flag (see the sketch after this list)

Monitoring and alerting configured

User-facing documentation updated

Release notes written

Acceptance criteria verified in production

Product Owner sign-off complete
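
"Deployed to production behind a feature flag" can be as simple as a guard around the new code path. A minimal sketch, assuming the flag is read from an environment variable and using a made-up flag name; a real flag service would replace the lookup, but the shape of the guard stays the same.

```python
"""Minimal feature-flag guard; the flag name and env-var source are illustrative."""
import os


def flag_enabled(name: str, default: bool = False) -> bool:
    """Read a flag such as FEATURE_NEW_EXPORT=true from the environment."""
    raw = os.getenv(f"FEATURE_{name.upper()}", str(default))
    return raw.strip().lower() in {"1", "true", "yes", "on"}


def greeting(user: str) -> str:
    # The new behaviour is live in production but dark until the flag flips
    if flag_enabled("friendly_greeting"):
        return f"Hey {user}, welcome back!"
    return f"Hello, {user}."


if __name__ == "__main__":
    # Run with FEATURE_FRIENDLY_GREETING=true to exercise the new path
    print(greeting("sam"))
```

The point of the checklist item is that "done" includes being able to turn the work off without a redeploy; verifying acceptance criteria in production then happens with the flag on.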

The right level depends on your context. Start with a checklist your team can consistently meet, then raise the bar over time. Adding items your team routinely skips just trains everyone to ignore the DoD entirely.

Why DoD matters more than you think for estimation

This is where most articles about the definition of done stop. But the link between DoD and estimation accuracy is the most underrated reason to get it right. The Scrum Guide says it plainly: developers forecast more confidently when they understand their Definition of Done. When your team estimates a story during planning poker, that estimate should include everything required to meet the DoD. Not just writing the code, but the review, the tests, the deployment, all of it. Teams that estimate only the coding effort and then discover DoD activities at the end of the sprint consistently overcommit. The work didn't take longer than estimated. The estimate just ignored half the work. When you add new items to your DoD (like security scanning or performance testing), expect velocity to drop temporarily. That's healthy. Your velocity is just getting honest.

Five anti-patterns that undermine your DoD

1. Set it and forget it

The team wrote a DoD six months ago and hasn't looked at it since. Tools change and the product grows. Review the DoD in your retrospectives even if you don't change it every time.

2. Created by one person

A tech lead or Scrum Master drafts the DoD alone and presents it as final. The rest of the team never feels ownership over it, so they treat it as optional. The fix: build it together in a workshop. Everyone who does the work should have a say in what "done" means.

3. Too vague to verify

"Code is good quality" and "testing done" aren't verifiable. Every item should have a clear pass/fail condition. "Code passes static analysis with zero critical issues" is something you can check. "Code is clean" is a matter of opinion. Person looking at a giant checklist with some items clearly marked done and others ambiguous, illustrating the difference between vague and specific criteriaPerson looking at a giant checklist with some items clearly marked done and others ambiguous, illustrating the difference between vague and specific criteria

4. Lowering the bar under pressure

When deadlines get tight, teams sometimes weaken the DoD to ship faster. The 2020 Scrum Guide is explicit here: the DoD can evolve to improve quality, but it should not be weakened. Cutting corners on quality doesn't speed you up. It creates the illusion of speed while piling up problems for future sprints.

5. Confusing DoD with acceptance criteria

The DoD is universal. Acceptance criteria are story-specific. Mixing them together means you either lose the quality standard or forget the functional requirements. Keep them as separate checklists that work together.

How to build your first DoD

If your team doesn't have a definition of done yet, here's a practical way to create one.

Start with what's already happening

Ask your team: "What do we already do before calling something done?" Write down every answer. Most teams already have informal standards that haven't been documented.

Identify the gaps

Look at recent bugs or production incidents. What would have caught them earlier? Those gaps become candidates for new DoD items.

Keep it short

Aim for 6-12 items. Every item should earn its place by preventing a real category of problem. If you can't point to a specific issue an item would catch, cut it.

Make it visible

Post the DoD where the team can see it during sprint planning and daily work. A checklist buried in a wiki is a checklist nobody reads.

Review it every retrospective

Add a standing agenda item to your retros. Ask: "Did the DoD catch what it needed to? Did anything slip through? Should we add or remove items?"

What high-performing teams do differently

Ron Lichty's research points to a clear pattern. High-performing teams own their DoD and evolve it deliberately. They also use it to keep estimation honest instead of optimistic. These teams automate as much of the checklist as possible. Code coverage checks and static analysis run in CI/CD pipelines alongside security scans. The DoD gets enforced by the system rather than by memory, which frees the team to focus on the items that actually require human judgment, like whether the acceptance criteria are met from a user's perspective. The most mature teams tie their DoD to business outcomes, not just technical standards. Martin Hinshelwood gives an example: "Live in production, gathering telemetry, supporting or diminishing the starting hypothesis." That's a team whose definition of done isn't just "the code works" but "we're learning from what we shipped."
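
A minimal sketch of what "enforced by the system rather than by memory" can look like: one gate script that runs the automatable DoD items and reports each one. The commands are illustrative stand-ins, not a prescription: coverage_gate.py and static_analysis_gate.py are the hypothetical scripts sketched earlier, and pip-audit stands in for whatever security scan your team runs.

```python
"""Run the automatable DoD items as a single pre-merge gate.

Each entry pairs a checklist item with the command that verifies it. The
commands are illustrative stand-ins for your team's real tooling.
"""
import subprocess
import sys

CHECKS = [
    ("Unit tests written and passing", ["pytest", "-q"]),
    ("Code coverage maintained or improved", ["python", "coverage_gate.py"]),
    ("Static analysis: zero critical issues", ["python", "static_analysis_gate.py"]),
    ("Security vulnerability scan passed", ["pip-audit"]),
]


def main() -> int:
    failed = []
    for item, command in CHECKS:
        status = "PASS" if subprocess.run(command).returncode == 0 else "FAIL"
        print(f"[{status}] {item}")
        if status == "FAIL":
            failed.append(item)
    if failed:
        print(f"\n{len(failed)} DoD item(s) not met; the increment is not done.")
        return 1
    print("\nAll automatable DoD items met.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Items that genuinely need human judgment, like Product Owner sign-off or verifying acceptance criteria from a user's perspective, stay off the script and on the visible checklist.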

Getting started

Pick three items from the starter checklist above. Write them on a whiteboard or drop them in a shared doc. Use them for one sprint. In the retro, ask what worked and what's missing. Add one or two items. Repeat. A small, consistent DoD that the team actually follows beats a comprehensive one that everyone ignores. Build the habit first, then raise the bar. And the next time someone asks if a story is done, you'll have an answer that doesn't require a 10-minute conversation. If your team uses planning poker for estimation, keep the DoD visible during those sessions. It's the fastest way to make sure estimates reflect the full scope of what "done" actually means.

Frequently asked questions

How often should we review the definition of done?

Review it every sprint retrospective. Most sprints you won't change anything, but the habit keeps it relevant. Update it when you find recurring quality gaps or when new tools make existing items automatable.

What if the organization already has a standard definition of done?

The Scrum Guide says if an organization has a standard DoD, teams must follow it as a minimum. Teams can add stricter standards on top. In practice, each team should own the items beyond the organizational baseline.

How is the Definition of Done different from the Definition of Ready?

Definition of Ready covers whether a backlog item has enough information to start work (clear requirements, dependencies identified). Definition of Done covers whether the finished work meets quality standards. Ready is the entry gate, Done is the exit gate.

Can a story count as done if it meets most of the DoD?

No. If it doesn't meet the full DoD, it's not done. It goes back to the product backlog. Partially done work should never be presented in sprint review or counted toward velocity.