
Definition of Done (DoD)

A Definition of Done is a comprehensive checklist of quality criteria that must be satisfied before any work item can be considered complete and ready for release. It serves as a non-negotiable quality gate that applies uniformly to all development work, ensuring consistent standards across features, bug fixes, and enhancements. Unlike acceptance criteria, which vary by individual story, the DoD establishes baseline quality expectations that every piece of work must meet.

The Definition of Done functions as a team contract that explicitly defines what 'complete' means in measurable terms. It typically encompasses technical requirements like completed code review, passing automated tests, and security scanning, alongside functional requirements such as cross-browser compatibility verification, performance benchmarking, and accessibility compliance. The DoD should be visible to all team members and updated regularly as quality standards evolve. Each criterion must be binary, either satisfied or not, to prevent ambiguous interpretations during delivery decisions.
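As a minimal sketch of what 'binary' looks like in practice, a DoD can be modeled as a list of criteria whose checks return only true or false. The criterion names and check bodies below are illustrative placeholders, not part of any standard DoD:

```typescript
// A minimal sketch: a DoD as binary, machine-checkable criteria.
// Criterion names and check bodies are illustrative placeholders.
type Criterion = {
  name: string;
  check: () => Promise<boolean>; // binary: satisfied or not, no partial credit
};

const definitionOfDone: Criterion[] = [
  { name: 'code review approved', check: async () => true /* query review tool */ },
  { name: 'automated tests pass', check: async () => true /* query CI results */ },
  { name: 'security scan clean', check: async () => true /* query scanner */ },
];

// A work item is done only when every criterion is satisfied.
async function isDone(criteria: Criterion[]): Promise<boolean> {
  const results = await Promise.all(criteria.map((c) => c.check()));
  const failing = criteria.filter((_, i) => !results[i]).map((c) => c.name);
  if (failing.length > 0) {
    console.log('Not done; failing criteria:', failing.join(', '));
  }
  return failing.length === 0;
}

isDone(definitionOfDone).then((done) => console.log('done:', done));
```

Representing criteria this way makes 'done' a computed fact rather than an opinion, which is the point of the contract.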

For QA teams managing website estates, the DoD becomes critical for maintaining consistent quality across multiple properties and preventing regression issues that could impact user experience or compliance. It ensures that every feature release meets the same rigorous standards regardless of developer, timeline pressure, or stakeholder urgency. This consistency is particularly vital in regulated industries where incomplete testing or missing documentation can result in compliance violations. The DoD also provides QA managers with clear metrics for quality tracking and helps identify process gaps when items repeatedly fail to meet completion criteria.

Common mistakes include creating overly lengthy DoDs that become ceremonial rather than practical, failing to update criteria as technology stacks evolve, and allowing exceptions that undermine the entire framework. Teams often struggle with vague criteria like 'code is clean' instead of specific measures like 'code coverage exceeds 80 percent.' Another pitfall is treating the DoD as static rather than evolving it based on production issues and team learning. Some organizations also confuse DoD with acceptance criteria, leading to redundant or conflicting requirements.
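A coverage criterion like the one above can be enforced mechanically rather than left to judgment. Assuming a Jest-based test suite (one common choice, not something the DoD concept requires), a coverage threshold in the project configuration fails the run whenever coverage drops below the agreed floor:

```typescript
// jest.config.ts -- assumes a Jest test suite; thresholds mirror the
// "coverage exceeds 80 percent" criterion from the text.
import type { Config } from 'jest';

const config: Config = {
  collectCoverage: true,
  coverageThreshold: {
    global: {
      branches: 80,
      functions: 80,
      lines: 80,
      statements: 80,
    },
  },
};

export default config;
```

Jest exits non-zero when any threshold is missed, so the criterion stays binary: the pipeline either passes or it does not.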

The Definition of Done directly impacts broader delivery workflows by creating predictable quality outputs that downstream processes can depend on. It shortens the feedback loop between development and QA teams by catching issues earlier, minimizes production hotfixes that disrupt release schedules, and sets clear expectations for stakeholder sign-off. A well-implemented DoD also improves user experience consistency by ensuring every feature meets the same usability, performance, and accessibility standards before reaching customers.

Why It Matters for QA Teams

Without a shared DoD, 'done' means different things to different people. One developer might consider a feature done after writing code, while QA expects full test coverage and a staging deployment first.

Example

An e-commerce team managing multiple brand websites establishes a DoD that includes: code passes automated security scanning, the feature functions correctly on mobile Safari and Chrome, page load time remains under 3 seconds on 3G connections, all interactive elements meet WCAG 2.1 AA standards, analytics tracking fires correctly in the staging environment, and feature documentation is updated in the team wiki.

When implementing a new product filtering feature, the developer completes the code and initial testing, but during DoD verification, QA discovers that the filter dropdown fails on mobile Safari and page load time has increased to 4.2 seconds. Despite pressure from the product manager to ship before a marketing campaign, the team holds firm to its DoD criteria. The feature returns to development for optimization, ultimately launching two days later but meeting all quality standards and avoiding the customer complaints and emergency fixes the initial implementation would have caused.
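Several of these criteria can be verified automatically before the DoD review. The sketch below assumes a Playwright test suite plus a hypothetical staging URL and filter control; it checks the 3-second budget and the mobile filter behavior, while genuine 3G throttling and full WCAG 2.1 AA auditing would need additional tooling such as CDP network emulation or an accessibility scanner:

```typescript
import { test, expect, devices } from '@playwright/test';

// Hypothetical staging URL and selectors; the 3000 ms budget comes from
// the example DoD. Run under a WebKit project to approximate mobile Safari.
const PRODUCT_LISTING_URL = 'https://staging.example.com/products';

test.use({ ...devices['iPhone 13'] });

test('product filter meets DoD criteria', async ({ page }) => {
  const start = Date.now();
  await page.goto(PRODUCT_LISTING_URL, { waitUntil: 'load' });
  const loadMs = Date.now() - start;
  expect(loadMs).toBeLessThan(3000); // DoD: page loads in under 3 seconds

  // DoD: the filter dropdown must work on mobile viewports.
  await page.getByRole('button', { name: 'Filter' }).click();
  await expect(page.getByRole('listbox')).toBeVisible();
});
```

Wiring checks like this into CI means a DoD failure surfaces as a red build rather than a late surprise during sign-off.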