Often I get called in to coach teams who have been trying to implement Scrum and are struggling to deliver on their commitments by the end of the sprint. They agree on what features should be possible to get done, but somehow things don't go to plan. It looks like the problem is testing, but there's more to it than that. Well-known agile expert Jeff Patton illustrates the underlying reality in the sketch below, which he used to support the arguments people make for moving to a Kanban-style approach (which also focuses teams on what Done means).
One source of the pain is that team members lack a shared agreement on what quality level they're aiming for. Some programmers may be writing automated unit tests; others may be delivering code without checking that it works. Some testers may talk to programmers as soon as they find problems; others may spend a lot of time filing lengthy defect reports. Soon a blame game starts.
Instead, if we establish a shared "Definition of Done", we have a better chance of understanding what needs to happen during the sprint to get the features properly ready ("Done-Done").
That sounds fine and dandy, but just how do we go about creating such a Definition of Done in our team? Here are some simple steps that you can try:
- Surface the current practice (Always, Sometimes, Not Yet)
- Discuss what Definition of Done feels practical for the team
- Resolve any puzzles and concerns before putting it into action
- After a couple of weeks, repeat from step 1.
I've posted some pictures below from a conversation with five developers in a small non-profit organisation to show you what this looks like.
Step one is to invite everyone in the team to write sticky notes about what they do now. Expect to get some duplicates. You'll also find that there is some discussion about which category the notes should go in. Most often these debates are resolved by putting the notes into the Sometimes category (if it really is Always or Not Yet, then why the debate?).
It's now clear to the team that they have a pretty minimal test strategy, and also that people on the team have different assumptions about what needs to be done. No wonder there's confusion. Now it's time to turn the conversation to working out a basic Definition of Done that they can start using tomorrow. Help the team get clear on what each proposal means to them and whether they are really ready to do it yet. It's best not to be too ambitious; remind them that they can improve the definition over time.
The team decided to explore this basic proposal for a new Definition of Done.
Then we explored what this would actually mean for the team. We needed this to become a shared working agreement that every team member was happy to implement. So you may be surprised to hear that we didn't vote on it. Instead, we worked towards consensus by checking for concerns. The pink notes represent each element of Done. The green notes are questions and concerns from the team. We put these in two columns: "What we do...", elaborating detail about what each element means (who/what/when), and "How we check it's being done", about making the activity visible.
Don't forget that step 4 is to review how the new Definition of Done worked for the team. Agree a date by which this will happen; a natural point is the next retrospective, but the team may prefer to wait longer to let the new approach bed in.
Hopefully, you find these tips useful. Please do add your own suggestions for other ways of doing this as blog comments below. And, yes, I know that there's more to a test strategy than establishing a Definition of Done :-)
Fantastic example of how to have the conversation around "Done." Well said!
Posted by: Peter Saddington | 04 October 2010 at 07:50 PM
Great post! Thanks.
Over the years, I've extended the DoD in order to "surface the current practice" at all levels, from project launch (Sprint 0) to the final solution. In this case, a DoD can include everything from coding and testing to...printing promotional T-shirts (true story :)
All "done items" not included in a User Story are then added to the Product Backlog to create visibility and conversations with the ultimate goal to reduce inventory, overproduction and waste.
I elaborate on this here : http://tinyurl.com/2cu8fpl
Thanks again for the post!
Posted by: Eric Laramée | 08 October 2010 at 01:39 PM
Thanks for the great post. I think it's done when every stakeholder says it meets requirements or expectations. I'm still trying to figure out what agile methodology we used at my last company. I know that with their stories there were several check-offs that had to happen before a story was considered done. First the developer would say it was complete, then the stakeholder would check it off, and finally QA would say it was accepted. If something didn't pass, it had to be started over if there was time in the sprint, or it would be backlogged for the dreaded 'R' word (rework) in the next sprint. With all these checks in place, we rarely had things submitted as done when they really weren't.
Posted by: Joe Woods | 07 December 2011 at 10:40 PM