Definition of Ready (DoR)

The Definition of Ready defines the conditions a user story must meet before it can enter a sprint. It protects the team from picking up work that is too ambiguous to deliver.

A story is ready when it satisfies the INVEST criteria:

| Letter | Meaning |
| --- | --- |
| I | Independent — does not block or depend on other stories |
| N | Negotiable — not a rigid spec, but a basis for conversation |
| V | Valuable — delivers clear business or user value |
| E | Estimable — can be sized with reasonable confidence |
| S | Small — fits within a single iteration |
| T | Testable — acceptance criteria can be validated |

DoR Checklist

  • The user story is properly defined with business value and acceptance criteria
  • Non-functional requirements are identified where applicable
  • External dependencies are resolved before sprint start
  • The story is estimated and prioritized
  • Front-end design and layout mocks are defined
  • Service-level objectives (SLOs) are defined
  • Feature toggles are defined (a minimal sketch follows this list)
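
For illustration, a minimal feature-toggle sketch in TypeScript. The flag name and the in-memory store are assumptions, not a specific toggle framework:

```typescript
// Minimal feature-toggle sketch. The flag name and in-memory store are
// illustrative assumptions, not a specific toggle framework.
const flags: Record<string, boolean> = {
  'new-checkout-flow': false, // defined up front; flipped on when the story ships
};

function isEnabled(flag: string): boolean {
  return flags[flag] ?? false; // unknown flags default to off
}

export function checkout(): string {
  return isEnabled('new-checkout-flow')
    ? 'new checkout flow'      // behaviour behind the toggle
    : 'existing checkout flow';
}
```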

Definition of Done (DoD)

The Definition of Done is a set of rules that determine when an item is truly finished. When a story meets the DoD, it becomes part of the product increment. The DoD creates shared transparency — everyone understands exactly what “done” means.

The DoD is structured across 8 categories:


1. Functionality

  • Functional acceptance criteria are met, validated via automated tests (see the sketch after this list)
  • Non-functional requirements are implemented: documentation updated, concurrency and performance verified, and changes in expected API traffic volumes (volumetry) communicated
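
As a sketch of what "validated via automated tests" can look like, here is a hypothetical acceptance criterion expressed as a Jest test. The inline order service is a stand-in, not real product code:

```typescript
// Hypothetical acceptance criterion automated as a Jest test.
// The order service below is an inline stand-in, not real product code.
interface OrderRequest { customerId: string; items: { sku: string; qty: number }[]; }
interface OrderResult { confirmationId: string; }

async function createOrder(req: OrderRequest): Promise<OrderResult> {
  if (req.items.length === 0) throw new Error('order must contain at least one item');
  return { confirmationId: `conf-${req.customerId}` };
}

describe('acceptance criterion: a valid order is confirmed', () => {
  it('returns a confirmation id for a non-empty order', async () => {
    const result = await createOrder({ customerId: 'c-1', items: [{ sku: 'A', qty: 2 }] });
    expect(result.confirmationId).toBeDefined();
  });

  it('rejects an empty order', async () => {
    await expect(createOrder({ customerId: 'c-2', items: [] })).rejects.toThrow();
  });
});
```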

2. Process

  • Code developed following the agreed Way of Coding
  • Pull requests are correctly traced to user stories, follow the agreed branch naming conventions (see the sketch after this list), and have been reviewed by TL or QA
  • Feature branches cannot be merged until all acceptance criteria are met
  • Stories cannot be closed until all branches are merged and all criteria satisfied
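
As an illustration of an enforceable naming convention, a small TypeScript check a CI step could run. The pattern (feature/<story>-<slug>) is an assumption; substitute the team's actual convention:

```typescript
// Hypothetical branch-name check for a CI step. The pattern is an
// assumption; substitute the team's actual convention.
const BRANCH_PATTERN = /^feature\/US-\d+(-[a-z0-9]+)*$/;

export function isValidBranchName(branch: string): boolean {
  return BRANCH_PATTERN.test(branch);
}

console.log(isValidBranchName('feature/US-123-checkout-flow')); // true
console.log(isValidBranchName('fix-stuff'));                    // false
```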

3. Quality

  • All tests implemented within the sprint and added to the regression plan
  • Static analysis reports no defects in the code
  • Full test pyramid coverage:
    • Unit tests with required coverage and mutation testing (see the coverage-gate sketch after this list)
    • Integration tests with required coverage
    • E2E tests automated and linked to test management (XRay)
  • All detected defects closed before story closure
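
A sketch of enforcing the unit-test coverage gate in CI, using Jest's coverageThreshold option. The 80% figures are illustrative, not numbers from this document; mutation testing can be layered on top with a tool such as Stryker:

```typescript
// jest.config.ts — hypothetical coverage gate. The 80% thresholds are
// illustrative; the build fails if coverage drops below them.
import type { Config } from 'jest';

const config: Config = {
  collectCoverage: true,
  coverageThreshold: {
    global: { branches: 80, functions: 80, lines: 80, statements: 80 },
  },
};

export default config;
```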

4. Usage Analytics

  • Front-end: events measuring real user experience (latency, network, rendering time)
  • Every use case measures its execution time from user action to completion (see the sketch after this list)
  • Back-end: metrics, logs, and alerts implemented for the use case
  • Alerts configured so responsible parties are notified on failure
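
A browser-side sketch of timing a use case from user action to completion with the standard Performance API. The use-case name and the /analytics endpoint are assumptions:

```typescript
// Measures a use case from user action to completion using the standard
// Performance API. Use-case name and /analytics endpoint are assumptions.
async function submitOrder(): Promise<void> {
  // hypothetical stand-in for the real use case
}

export async function onSubmitClicked(): Promise<void> {
  performance.mark('order-submit:start');
  await submitOrder();
  performance.mark('order-submit:end');
  const m = performance.measure('order-submit', 'order-submit:start', 'order-submit:end');
  // Ship the duration to the analytics backend without blocking navigation.
  navigator.sendBeacon('/analytics', JSON.stringify({ useCase: 'order-submit', ms: m.duration }));
}
```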

5. Performance / Resilience

  • Load tests executed with k6 for representative use cases (see the sketch after this list)
  • Chaos / resilience tests executed: service cuts, latency, pod restarts — system recovers without human intervention
  • Resilience patterns implemented: circuit breakers, bulkheads, rate limiters
  • Fallback strategies in place for unavailable dependencies
  • Feature flags available for rollback if new features degrade performance
  • Alerts available in alerting systems covering the use case technologies
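
A minimal k6 load-test sketch for a representative use case. The endpoint, virtual-user count, and thresholds are assumptions; the thresholds double as an executable pass/fail gate for the run:

```typescript
// Minimal k6 load test. Endpoint, load profile and thresholds are assumptions.
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  vus: 50,            // illustrative: 50 virtual users
  duration: '2m',
  thresholds: {
    http_req_duration: ['p(95)<500'], // e.g. 95th percentile under 500 ms
    http_req_failed: ['rate<0.01'],   // e.g. error rate below 1%
  },
};

export default function () {
  const res = http.get('https://example.com/api/orders'); // hypothetical endpoint
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
```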

6. UX

  • Front-end design complies with agreed specifications
  • Design validated across all screen sizes, resolutions, and pixel densities
  • Tooltips implemented, text overflow handled
  • Feature documented in “What’s New” for the next version
  • Internationalization: translated into all required product languages
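
A minimal internationalization lookup sketch; the locales, keys, and strings are illustrative:

```typescript
// Minimal i18n lookup. Locales, keys and strings are illustrative.
type Locale = 'en' | 'es' | 'fr';

const messages: Record<Locale, Record<string, string>> = {
  en: { 'whats_new.title': "What's New" },
  es: { 'whats_new.title': 'Novedades' },
  fr: { 'whats_new.title': 'Nouveautés' },
};

function t(locale: Locale, key: string): string {
  // Fall back to English, then to the key itself, so missing
  // translations surface visibly instead of crashing.
  return messages[locale][key] ?? messages.en[key] ?? key;
}

console.log(t('es', 'whats_new.title')); // "Novedades"
```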

7. Reusability

  • Development follows API First approach
  • API consumers notified of contract changes with sufficient migration time
  • Functionality exposed to the rest of the system via API
  • Internal vs. public API visibility correctly set
  • Microfrontend creation assessed if functionality serves multiple surfaces
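
As a sketch of exposing functionality to the rest of the system via a versioned API. Express is used purely as an illustrative framework; the route and the Order contract type are assumptions:

```typescript
// Sketch of exposing functionality through a versioned API. Express is used
// only for illustration; the route and Order contract are assumptions.
import express from 'express';

interface Order { id: string; status: 'open' | 'shipped'; } // hypothetical contract

const app = express();

// Versioning the path lets consumers migrate on their own schedule
// when the contract changes.
app.get('/api/v1/orders/:id', (req, res) => {
  const order: Order = { id: req.params.id, status: 'open' }; // stubbed lookup
  res.json(order);
});

app.listen(3000);
```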

8. Data Product Quality

  • Own data issued via topic-based queues (Kafka or equivalent) with quality rules (see the sketch after this list)
  • Third-party data retrieved through appropriate services with DSAs where needed
  • Data cloning avoided; when necessary, event-based sourcing with reconciliation mechanisms
  • Completeness, uniqueness, consistency, and integrity guaranteed
  • All data validations covered by an effective alerting system
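
A sketch of issuing own data on a topic with a basic quality rule applied before publishing, using the kafkajs client. The topic name, broker address, and event shape are assumptions:

```typescript
// Publishes own data on a topic after a basic quality check, using kafkajs.
// Topic name, broker address and event shape are assumptions.
import { Kafka } from 'kafkajs';

interface OrderEvent { orderId: string; amount: number; }

const kafka = new Kafka({ clientId: 'orders-service', brokers: ['localhost:9092'] });
const producer = kafka.producer();

function isValid(event: OrderEvent): boolean {
  // Minimal completeness/integrity rule: key present, amount finite and non-negative.
  return event.orderId.length > 0 && Number.isFinite(event.amount) && event.amount >= 0;
}

export async function publish(event: OrderEvent): Promise<void> {
  if (!isValid(event)) throw new Error(`invalid event: ${JSON.stringify(event)}`);
  await producer.connect();
  await producer.send({
    topic: 'orders.events',
    messages: [{ key: event.orderId, value: JSON.stringify(event) }],
  });
}
```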

DoD Summary

| Category | Core requirement |
| --- | --- |
| Functionality | Acceptance criteria met, functional and non-functional |
| Process | Way of Coding followed, PRs reviewed and traceable |
| Quality | Full test pyramid in CI/CD, zero open defects |
| Analytics | Use cases instrumented with events, metrics, alerts |
| Performance/Resilience | Load and chaos tests passed, fallbacks in place |
| UX | Design compliance, i18n, “What’s New” updated |
| Reusability | API First, consumers notified, visibility set |
| Data Quality | Own data issued and validated, third-party data contracted |