Definition of Done

We use this to build a common understanding within the team of exactly what is expected to be delivered, ensuring transparency and a level of quality that fits the proposed solution.

Definition of Done (DoD) refers to all the work needed to deliver a quality solution - ALL the work!

From the Scrum Guide:

https://scrumguides.org/scrum-guide.html#increment

"...

The Definition of Done is a formal description of the state of the Increment when it meets the quality measures required for the product.

The moment a Product Backlog item meets the Definition of Done, an Increment is born.

The Definition of Done creates transparency by providing everyone a shared understanding of what work was completed as part of the Increment. If a Product Backlog item does not meet the Definition of Done, it cannot be released or even presented at the Sprint Review. Instead, it returns to the Product Backlog for future consideration.

If the Definition of Done for an increment is part of the standards of the organization, all Scrum Teams must follow it as a minimum. If it is not an organizational standard, the Scrum Team must create a Definition of Done appropriate for the product.

The Developers are required to conform to the Definition of Done. If there are multiple Scrum Teams working together on a product, they must mutually define and comply with the same Definition of Done.

..."

The Definition of Done (DoD) is the cornerstone of estimating the solution and of the methods for delivering it. It clearly states all the steps necessary to complete a Story. Unless everyone understands the DoD as a whole, it is almost impossible to plan and deliver the correct solution.

The DoD does the following:

  • Creates a common terminology

    • We may refer to the same activity by different names. For example, "Implementation", "Coding" and "Development" are all different terms for the same thing. Having the team agree on a single name for each activity establishes a consistent terminology that enhances the team's communication.

  • Defines the minimum Quality Assurance necessary

    • This increases transparency and trust in the delivery team's capacity to deliver quality solutions

  • Creates a clear definition of what DONE means

    • Collaboratively documenting and publishing the DoD establishes and communicates the steps used to create the solution

  • Helps the team to size solutions consistently

    • The DoD creates a clear understanding of what is required to create a solution

  • Focuses the team

    • Without a clear and common understanding of "DONE", the team may do the wrong work, in the wrong way

Definition of Done as a Tool

  • The DoD should be something that every team member uses every day - especially during estimation sessions.

  • During an estimation session, use the DoD to surface all the tasks that must be completed before the work can be called done.

  • Daily use of the DoD can be through checklists of the DoD items (see the sketch after this list). These prompt thinking and encourage you to ask your team questions such as "Do we need testing for this story?" (The answer is always yes!)

  • So keep the DoD in a place of high visibility and use it to help you do the right work at the right time.
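As a concrete illustration, such a checklist can even live in code. The sketch below is minimal and hypothetical: the DOD_ITEMS names and the unmet_items helper are invented for illustration, not part of any tool, so substitute your own team's DoD items.

```python
# Hypothetical sketch of a DoD checklist walked through per Story.
# The item names are illustrative - use your own team's DoD items.

DOD_ITEMS = [
    "Acceptance Criteria defined",
    "Unit tests written and passing",
    "Test automation written and passing",
    "Code peer-reviewed and approved",
    "Documentation updated",
    "Deployed to the UAT environment",
]

def unmet_items(completed: set) -> list:
    """Return the DoD items that are still outstanding for a Story."""
    return [item for item in DOD_ITEMS if item not in completed]

# Example: during stand-up, list what still blocks "done" for a Story.
done_so_far = {"Acceptance Criteria defined", "Code peer-reviewed and approved"}
for item in unmet_items(done_so_far):
    print(f"Not done yet: {item}")
```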

Definition of Done versus Acceptance Criteria (AC)

DoD is different to Acceptance Criteria (AC).

DoD defines the tasks required by all Stories. AC defines how a specific Story should behave in order to deliver the desired value. The AC defines what is right - what is acceptable to the user.

For example, something may be done, "I'm done with my exam", but doesn't meet all the AC, "You received a D+".

We aim to be done and right: completing all the tasks and ensuring the result is acceptable to the user.
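To make the contrast concrete, here is a minimal Python sketch. The DoD applies to every Story; the test below expresses one acceptance criterion for an invented story about filtering orders by date (filter_orders and the test data are hypothetical).

```python
# AC example: behaviour specific to ONE invented story, "filter orders by
# date". The DoD, by contrast, lists the tasks required by ALL stories.

from datetime import date

def filter_orders(orders, start, end):
    """Return only the orders placed between start and end (inclusive)."""
    return [o for o in orders if start <= o["placed"] <= end]

def test_only_orders_in_range_are_returned():
    # Given orders placed on different dates
    orders = [
        {"id": 1, "placed": date(2024, 1, 10)},
        {"id": 2, "placed": date(2024, 2, 20)},
    ]
    # When the user filters for January
    result = filter_orders(orders, date(2024, 1, 1), date(2024, 1, 31))
    # Then only the January order is returned - this is what "right" means here
    assert [o["id"] for o in result] == [1]
```

Being done means every DoD task was completed for the story; being right means checks like this one pass.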

"Done-but" & Undone Work

The Agile Manifesto states "Working software is the primary measure of progress". This translates to the done state being binary: work is either done or not done. In traditional methods, we asked for percentage complete and everyone always said 80%, because it showed that a lot of work had been done - but there was still lots to go, and managers seemed to like that. Unfortunately, it meant nothing: we can't deploy something 80% done to our customers, and therefore we can't create value with the increment.

There is no such thing as "dev done", "almost done", "done except for ...", "quite done" or "done but not done-done". It is done or it is not done.

This builds transparency and trust between the delivery team and its stakeholders.

If at the end of a Sprint you are not done, that's okay. It's okay not to showcase and deploy something that isn't done. Don't change the estimate. Talk about it in the Retro: why wasn't it finished? Was something off with the estimate? Did we not plan properly? How can we fix the process so that we limit the amount of "not done" work moving to the next iteration? Finally, move it to the next iteration and finish it as soon as possible - it was picked up because it was a high priority, and since work has started, finish it!

Example DoD


Based on the current Jira statuses for a Story and a Defect, here is the list of minimum tasks/answers required for a Story to progress from one Status to the next.

Each End to End Delivery Team has its own Definition of Done, based on the following template, and uses it today. Reach out to your Scrum Master or Product Owner if you would like more detail on this. A sketch of how these gates could be encoded appears after the list.

Start

Does the Story have a description?

Has the Product Owner prioritised this work to be done in the next 6 sprints?

TO DO

Does the Story have a User Story that passes the INVEST quality test?

  • “I”ndependent (of all others)

  • “N”egotiable (not a specific contract for features)

  • “V”aluable (or vertical)

  • “E”stimable (to a good approximation)

  • “S”mall (so as to fit within an iteration)

  • “T”estable (in principle, even if there isn’t a test for it yet)

As a <role>, I want <goal>, so that <benefit>.

If a Story, does it describe and define a benefit forecast or hypothesis? If a Defect, was it triaged, and does it contain empirical data supporting its severity?

  • This defines the value of a piece of work from different perspectives such as benefits, risk aversion, loss and security.

When required, does the Story have its benefits mapped to a Scorecard?

  • This maps the initiatives that will contribute to the Objectives for the quarter.

Does the Story have Acceptance Criteria? Does the Defect have the Expected Behaviour?

Story

One or more scenarios defining the scope of the change. This can be done using BDD or TDD, depending on the change.

Defect

The expected behaviour is well defined and achievable, but cannot be a change in requirements. If it is, a Story will need to be created instead of a Defect.

For major UI changes, is a mockup attached to the story, defining the behaviour of its components?

Has the Product Owner prioritised this work to be done in the next 4 sprints?

REFINEMENT READY

Has the Product Owner prioritised this work to be done in the next 2 sprints?

When required, is a mockup attached to the story, defining the behaviour of its components?

Is a technical investigation or a spike required to clarify the change?

Are Unit Tests required?

Is Test Automation required?

Does the change require documentation to be created or updated?

Are technical and team dependencies mapped?

  • Is there any work required to be able to start or complete the change, independent of the team performing it? Is something blocking the start of this change?

Is the scope locked down and understood by the whole team?

Has the team agreed on the estimation of the work?

SPRINT READY

Has the team agreed to include the Story in the current Sprint?

Are all known impediments and dependencies accounted for?

Is someone actively working on the change?

IN PROGRESS

Does the Story conform with the Acceptance Criteria?

Was the story Shift-Left feature tested?

  • Anyone who knows about the story pairs with the developer to review the story against the acceptance criteria.

Does the code pass the code quality assessment tool?

When required, are the Unit Tests done and passing?

When required, is the test automation done and passing?

Was the code peer-reviewed and approved?

  • In some cases, the peer review must be done by another team.

Is the code merged into the code base?

Was the code deployed to the UAT environment?

Peer Review

  • Code is peer-reviewed and approved

  • Deployed to Nx

  • Feature branch is deleted

  • Code is merged into the development code base

IN TESTING

Does the story pass manual progression tests?

Does the story pass automated progression tests?

Does the story pass automated regression tests?

DONE
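As an illustration only, the gates above could also be kept in a machine-readable form, for example to drive a pre-transition checklist. The Python sketch below is one possible encoding, not a Jira feature: the status names mirror the example DoD, while GATES, the abbreviated questions and ready_to_move are invented.

```python
# Hypothetical encoding of the example DoD as status gates. Statuses mirror
# the list above; the questions are abbreviated and the helper is invented.

GATES = {
    "Start": [
        "Does the Story have a description?",
        "Prioritised for the next 6 sprints?",
    ],
    "TO DO": [
        "User Story passes the INVEST quality test?",
        "Acceptance Criteria / Expected Behaviour defined?",
        "Prioritised for the next 4 sprints?",
    ],
    "REFINEMENT READY": [
        "Scope locked down and understood by the whole team?",
        "Team agreed on the estimation of the work?",
    ],
    "SPRINT READY": [
        "Story included in the current Sprint?",
        "Someone actively working on the change?",
    ],
    "IN PROGRESS": [
        "Conforms with the Acceptance Criteria?",
        "Code peer-reviewed, merged and deployed to UAT?",
    ],
    "IN TESTING": [
        "Progression and regression tests passing?",
    ],
}

def ready_to_move(status: str, answers: dict) -> bool:
    """A Story leaves `status` only when every gate question is answered yes."""
    return all(answers.get(question, False) for question in GATES[status])

# Example: a story without an agreed estimate cannot leave REFINEMENT READY.
answers = {"Scope locked down and understood by the whole team?": True}
print(ready_to_move("REFINEMENT READY", answers))  # False
```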
