Demo in Sprint Review
Currently we are following a one-week Sprint. After refinement, we sometimes encounter stories that may take more than one week to complete development, integration, and testing. This means it may take more than one week to produce a working software increment.
In planning we commit to completing the development of logical components, and in the review we update the PO and stakeholders on the status of the Sprint Backlog (how many items we achieved, how many slipped, and what caused each to be completed or to slip).
Is this correct?
Does Scrum mandate a working demo in the Sprint Review? Or can we agree on a set of PBIs and achieve those while keeping the demo optional (doing it only if end-to-end development and testing is complete)?
scrum agile sprint-review
I think this question is generating some of the longest answers I've seen on PMSE. This is probably because it's an important question that a lot of new Scrum practitioners have, but it may also indicate an X/Y problem that's rather broad. Since it's generating multiple good answers, I'm leaving it open.
– Todd A. Jacobs♦
2 hours ago
asked 10 hours ago by user34755 (new contributor; edited 1 hour ago)
3 Answers
TL;DR
There is a lot going on here. The short answer is that the Sprint Review does not have to be a demo, but Scrum does advocate that the team produce a potentially shippable product increment. That means the code is able to be shipped even if the Product Owner chooses to hold it back for whatever reason.
Breaking Down Your Post
Minor points
- Scrum is not an acronym. It is just Scrum; it comes from rugby. Imagine the ball is the goal of a project, and as a team you are just moving it down the field a small iteration at a time.
- Anything you want is optional; it just means that you are no longer doing Scrum. There is a difference between doing Scrum and being Agile. Agile is a set of values. Scrum is a framework that you can add things to but not take things away from.
Major points
We have one week Sprint but due to application architecture complexity
That one sentence alone could form the basis of a retrospective. Why is the architecture complex? Are you working in a tightly regulated environment? Has the business mandated it?
There are no wrong answers so just be really honest. Scrum Teams cannot always change their work environment.
What technologies are you working with? Do architectural decisions have to pass through a TDA or a CAB and does that impact the success of your sprints? Do you have a full time architect? Are you trying to work with an emergent architecture or plan everything up front?
Each of these items unlocks another little part of the puzzle to make Scrum a success. In highly dependent infrastructure projects where the requirements are known, Scrum (short sprints) is not always the optimal solution.
For instance, in business intelligence a large part of the data strategy must be built up front and cannot be delivered incrementally or emergently, because the rework penalty would be so high that the business would suffer.
However, most projects can be delivered incrementally if a business is willing to change its behaviours. Be really honest; what is your project attempting to accomplish, and how?
we unable to product working software in end of week.
Why not? Is this due to organisation barriers or the technology implementation? What is the shortest period of time that you do think you could produce working software? Are all parts of the project struggling with the week-long timebox or simply infrastructure?
In planning we are committing logic components development completion
Are you breaking down Epics and User Stories to a granular enough level and are you using the INVEST criteria to evaluate stories? Are you refining stories before Sprint Planning and do you have a BA to support the activities?
Also, remember that in Scrum, the planning ceremony is a forecast, not a commitment. The easy way to stop the business punishing slipped items is to gradually take less and less into the Sprint until you never miss a forecast.
in review we are providing update PO and Stakeholders on the status of Sprint backlog( how many we achieved and how many we slipped and what happen to make it or slip it)
Do you have a Scrum Master and are they supporting this activity? What is the PO doing during the Review?
Remember, the primary goal of the Review is to gain feedback on what has been delivered or is underway. It is a chance for stakeholders to see the vision of the product and inform the builders whether that is what they had envisioned. It is not to make teams feel bad for missing forecasts or focussing on what slipped.
It is not to evaluate the project for hitting an arbitrary timeline or metric of completed user stories, especially since we could game that metric very easily by taking the large story and splitting it into 50 one-line stories. Suddenly the team has hit 98% productivity and only one user story remains, saying "deploy"... (Don't do this. I am just showing how arbitrary Agile metrics can be manipulated.)
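The story-splitting trick above is easy to make concrete with a quick calculation. This is a hypothetical sketch; the story counts are invented for illustration:

```python
# Illustration of how a "percent of stories done" metric can be gamed
# by splitting one large unfinished story into many trivial ones.
# All numbers here are hypothetical.

def percent_done(done: int, total: int) -> float:
    """Completion metric as a percentage of story count."""
    return 100 * done / total

# Honest view: 10 stories planned, 9 finished, 1 large story incomplete.
print(percent_done(9, 10))  # 90.0

# Gamed view: the large story is split into 50 one-line stories,
# 49 of which are trivially "done" and only "deploy" remains.
print(percent_done(9 + 49, 9 + 50))  # ~98.3
```

Nothing about the actual work changed between the two views, which is exactly why story counts make a poor productivity metric.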
Or can we agree a set of PBI and achieve those by keeping demo as optional (do if end to end development completed)?
From Scrum.org: Sprint Review is held at the end of the Sprint to inspect the Increment and adapt the Product Backlog if needed. There could have been a single deployment or many deployments during a Sprint which lead up to that Increment to be inspected.
- The Product Owner explains what Product Backlog items have been “Done” and what has not been “Done”
- The Development Team discusses what went well during the Sprint, what problems it ran into, and how those problems were solved
- The entire group collaborates on what to do next, so that the Sprint Review provides valuable input to subsequent Sprint Planning;
- A collaborative discussion about timelines, budgets and other product/project administration if appropriate
Other Questions that an Agile Coach or a Scrum Master should ask
- Do you have a Definition of Done? If so, what does it say and is it valid and when was it last updated?
- Do you have play-ready stories, and have they been suitably broken down into small enough chunks to ensure sprint success?
- Do you have all of the skills within the team that you need?
Unfortunately this post has become less about giving you the answers and more a chance for you to start framing the problem in the way that a Scrum Master or an Agile Coach would help you work through it.
As always; the first line support for your question can be found in the Scrum Guide and from your Scrum Master. Second line support should be your organisational Agile Coaches. You cannot go wrong simply reading the Scrum Guide and, if the advice is too difficult, asking why it is difficult within your organisation and project.
Forget about Demo.
Before downvotes start to fly, bear with me.
As Venture mentioned (+1!), you have foundation problems that should be addressed before you get concerned about a Demo. I'll bet you come from a Company where senior management heard that Scrum is the next big thing and you should be having Sprints and Demos and Daily meetings to grab a few awards, amirite?
Jokes aside, I hope I'm wrong. Anyway, back to the main answer - before focusing on Demos, review what's required to deliver value at the end of the iteration.
- You may need to get rid of some activities if they're not helping you deliver (such as focusing on having excellent ceremonies without delivering actual value).
- You may need more automation (on testing, on deployment, whatever).
- You may need more contact with the business to understand the minimal delivery you can offer in the shortest timespan. The business should work closer to IT, and IT should understand the business.
- You may need to review how to establish a cadence to deliver and keep a good pace. You may need bigger iterations, as you're probably working with a 10-year-old juggernaut that has been patched over and over.
Once you have covered all the above, you'll potentially be working with an agile team (compare the above to the agile values to get a more consistent idea). Then you can think of evolving into a specific framework, such as Scrum. Trying to become agile by implementing Scrum ceremonies is like starting a house from the roof. So... forget about the Demo. And tell senior management to forget about zero-value awards. At least for now.
Thanks for the comment! I agree that we have yet to leverage test automation and improve the development activity. We are focusing on that as well. I believe I need to clarify a few things here. We are developing an ordering web application, for which we need to fetch different data elements from different databases and process them for representation in the UI. For this we may (not always) need more than 5 working days to complete development and testing. This means that for some sprints, when we plan, we agree that there won't be any visual demo to stakeholders (since development is not completely done).
– user34755
1 hour ago
That's exactly the kind of scenario where the sprint should be longer... Otherwise you'll stick to the one-week iteration for the sake of saying you're agile, when you're barely scratching the real benefits of it. If it makes you feel a bit better, I'm more or less in the same situation :)
– Tiago Cardoso♦
1 hour ago
"Demo in Sprint review is optional", Is this statement correct?
– user34755
1 hour ago
Demonstrations are Required; Visual Demos are a Best Practice
Scrum requires that work be demonstrated during the Sprint Review. The Scrum Guide says:
The Development Team demonstrates the work that it has "Done" and answers questions about the Increment[.]
While the framework doesn't require hands-on or visual demonstrations, it's widely considered a best practice. Such demonstrations are generally the most effective way to present useful product information to stakeholders, and to solicit their feedback. Thinking about how you might demonstrate the increment forces the Scrum Team to take a test-first (or perhaps "demo first") approach to the work during Sprint Planning.
The Scrum Guide
The Scrum Guide defines the Sprint Review as follows:
A Sprint Review is held at the end of the Sprint to inspect the Increment and adapt the Product Backlog if needed. During the Sprint Review, the Scrum Team and stakeholders collaborate about what was done in the Sprint.
While it has more to say, the underlying goal of this Scrum event is to present completed work to stakeholders, and to gather feedback. Perhaps more importantly, the work presented should be in a potentially-shippable state. The Scrum Guide goes on to say:
At the end of a Sprint, the new Increment must be "Done," which means it must be in useable condition and meet the Scrum Team’s definition of "Done". An increment is a body of inspectable, done work that supports empiricism at the end of the Sprint. The increment is a step toward a vision or goal. The increment must be in useable condition regardless of whether the Product Owner decides to release it.
So, almost by definition, the Sprint Review should include usable feature increments. While there are occasionally products or individual Sprints where visual or visible progress can't be demonstrated, it is more likely that the Backlog Refinement and Sprint Planning processes need work.
How to Fix Your Processes
In general, "we can't demo our increment" is a project smell that indicates that Backlog Refinement and/or Sprint Planning are not being done in a truly agile way. In many cases, these can be boiled down to a few common anti-patterns with common solutions.
The Product Backlog is not properly ordered.
It's hard to work incrementally when the backlog is an unordered list of work items and specifications, rather than a coherent set of features that represent an increment of value to the value consumer of a user story (e.g. the "As a &lt;role&gt;, I want..." part of the Connextra format).
Help the Product Owner refactor the Product Backlog, and make sure that any user stories near the top of the backlog are broken down into prioritized stories that each fit within a single Sprint, ideally much smaller.
User stories don't meet INVEST criteria.
Each user story should be:
- Independent
- Negotiable
- Valuable
- Estimate-able
- Small
- Testable
The last is actually the most important, and feeds directly into the advice to demonstrate the increment at the end of each Sprint. By thinking about how to test each story to verify that it meets your Definition of Done, you ensure that the team takes a test-driven approach to design and implementation. In addition, by designing the tests first during Sprint Planning, you have the ability to bake the presentation of the work into your plans and estimates.
Sprint Planning for each increment lacks a cohesive Sprint Goal.
Sprint Goals are not optional. The framework requires them. The Scrum Guide says:
During Sprint Planning the Scrum Team also crafts a Sprint Goal. The Sprint Goal is an objective that will be met within the Sprint through the implementation of the Product Backlog, and it provides guidance to the Development Team on why it is building the Increment.
In addition, work selected by the Scrum Team for the Sprint should provide coherence:
The Sprint Goal gives the Development Team some flexibility regarding the functionality implemented within the Sprint. The selected Product Backlog items deliver one coherent function, which can be the Sprint Goal. The Sprint Goal can be any other coherence that causes the Development Team to work together rather than on separate initiatives.
So, the goal of a Sprint is never to just "do all the work." It must always be to deliver a functional increment of value. Many new Scrum Teams could solve a large host of framework implementation problems by ensuring that they always start each Sprint with a well-defined, measurable Sprint Goal.
The team and stakeholders lack a Definition of Done.
When a Scrum Team lacks a Definition of Done, it can't really meet stakeholder expectations about the functionality or quality of an increment. More importantly, it can't test "doneness" or communicate effectively about scope. Progress on the project then becomes a qualitative, rather than quantitative, metric. This leads to common pitfalls like saying "60% of the stuff is 40% done," which in the real world means the project is doomed.
The team must develop a test-first mentality for its work product, and codify those expectations into a testable (or at least "checklistable") format. This definition should also include criteria from stakeholders, so that everyone agrees on what done really looks like.
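One way to make a Definition of Done "checklistable" is to encode it as explicit boolean checks evaluated per backlog item. This is a hypothetical sketch; the criteria names are illustrative examples, not a prescribed DoD:

```python
# A hypothetical Definition of Done expressed as a checklist that can be
# evaluated per Product Backlog item. The criteria are examples only;
# a real team would agree on its own list with stakeholders.

DEFINITION_OF_DONE = [
    "code reviewed",
    "unit tests pass",
    "integrated and deployed to a test environment",
    "acceptance criteria verified by the Product Owner",
]

def is_done(item_checks: dict) -> bool:
    """An item is 'Done' only if every DoD criterion is satisfied."""
    return all(item_checks.get(criterion, False) for criterion in DEFINITION_OF_DONE)

story = {
    "code reviewed": True,
    "unit tests pass": True,
    "integrated and deployed to a test environment": False,  # not yet integrated
    "acceptance criteria verified by the Product Owner": False,
}
print(is_done(story))  # False: partially complete work is not "Done"
```

The point is the binary outcome: an item either satisfies every agreed criterion or it is not "Done", which removes the temptation to report fractional progress.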
The Scrum Team is not working iteratively.
In most agile frameworks, work should be both incremental and iterative. The ideal unit of work in Scrum is a thin, vertical slice of functionality that can be developed, tested, and delivered within a single Sprint. When the work is not sized or sliced correctly, you lose essential benefits of the iterative/incremental development model and are essentially doing waterfall within an ersatz Scrum.
If you practice test-driven Sprint Planning, this issue will become much more visible. Implementing Sprint Goals and ensuring Product Backlog Items meet INVEST criteria will solve a lot of the sizing and scoping problems you're experiencing.
Sprint boundaries are not being respected.
While some advanced practitioners advocate decoupling delivery from deployment, Scrum is a time-boxing framework. Scrum Teams, especially new ones, often struggle with explicit time boxes and the iterative nature of the work. This leads to Sprints that lack clear boundaries for the work, and a lack of independence (remember the INVEST criteria?) between features or backlog items.
The absolute best advice for any new Scrum Team is to respect the Sprint boundaries. Each Sprint Goal should be something the Scrum Team thinks it can deliver within a single Sprint, and all the work the Development Team accepts into the Sprint should ideally be only a day or two's worth of effort. In no event should a Sprint Backlog item ever exceed the length of the current Sprint!
Work near the top of the Product Backlog should be decomposed into small, independent, testable user stories during Backlog Refinement. Any epics or user stories that won't fit within the Development Team's capacity estimate for the current Sprint should be decomposed further or renegotiated with the Product Owner during Sprint Planning.
Conclusion
Your current product roadmap and architecture do not appear to be test- or behavior-driven. The result is work that is hard to time-box, and almost impossible to work on iteratively/incrementally.
You should leverage the Sprint Retrospective and other Scrum events to identify which common implementation errors your process may currently have, and collaborate with the team and with stakeholders to correct them. By implementing the framework properly, and by adopting widely-accepted best practices such as test-driven design and just-in-time planning, you should be able to reduce the project's friction points significantly.
Thanks for the response! We are developing components that take more than 5 working days to fetch data from databases (multiple data entities from multiple databases) and process it for representation in the UI. Hence, when we follow a 1-week (5-day) sprint, we are unable to give a visual demo to stakeholders.
– user34755
2 hours ago
@user34755 Then you may need longer Sprint cycles. You either need to reduce work to fit your Sprints, or increase Sprint lengths to fit your work. If what you're doing isn't working, you need to change something.
– Todd A. Jacobs♦
43 mins ago
- Do you have play ready stories and have they been suitable broken down into small enough chunks to ensure sprint success?
- Do you have all of the skills within the team that you need?
Unfortunately this post has become less about giving you the answers and more a chance for you to start framing the problem in a way that a Scrum Master or an Agile Coach would you help you work through it.
As always; the first line support for your question can be found in the Scrum Guide and from your Scrum Master. Second line support should be your organisational Agile Coaches. You cannot go wrong simply reading the Scrum Guide and, if the advice is too difficult, asking why it is difficult within your organisation and project.
add a comment |
TL;DR
There is a lot going on here. The short answer is that the Sprint Review does not have to be a demo but Scrum does advocate that the team should produce a potentially shippable product increment. That means the code is able to be shipped even if the Product Owner chooses to hold it back for whatever reason.
Breaking Down Your Post
Minor points
- Scrum is not an acronym. It is just Scrum; it comes from rugby. Imagine the ball is the goal of a project and, as a team, you are just moving it down the field a small iteration at a time.
- Anything you want can be made optional; it just means that you are no longer doing Scrum. There is a difference between doing Scrum and being Agile. Agile is a set of values. Scrum is a framework that you can add things to but not take things away from.
Major points
We have one week Sprint but due to application architecture complexity
That one sentence alone could form the basis of a retrospective. Why is the architecture complex? Are you working in a tightly regulated environment? Has the business mandated it?
There are no wrong answers so just be really honest. Scrum Teams cannot always change their work environment.
What technologies are you working with? Do architectural decisions have to pass through a TDA or a CAB and does that impact the success of your sprints? Do you have a full time architect? Are you trying to work with an emergent architecture or plan everything up front?
Each of these items unlocks another little part of the puzzle to make Scrum a success. In highly dependent infrastructure projects where the requirements are known, Scrum (with its short sprints) is not always the optimal solution.
For instance, in business intelligence there is a large part of the data strategy that must be built up front and cannot be delivered incrementally or emergently, because the rework penalty would be so high and the business would suffer.
However, most projects can be delivered incrementally if a business is willing to change its behaviours. Be really honest; what is your project attempting to accomplish and how?
we unable to product working software in end of week.
Why not? Is this due to organisational barriers or the technology implementation? What is the shortest period of time in which you think you could produce working software? Are all parts of the project struggling with the week-long timebox, or just the infrastructure parts?
In planning we are committing logic components development completion
Are you breaking down Epics and User Stories to a granular enough level and are you using the INVEST criteria to evaluate stories? Are you refining stories before Sprint Planning and do you have a BA to support the activities?
Also, remember that in Scrum, the planning ceremony is a forecast, not a commitment. The easy way to stop the business punishing slipped items is to gradually take less and less into the Sprint until you never miss a forecast.
in review we are providing update PO and Stakeholders on the status of Sprint backlog( how many we achieved and how many we slipped and what happen to make it or slip it)
Do you have a Scrum Master and are they supporting this activity? What is the PO doing during the Review?
Remember, the primary goal of the Review is to gain feedback on what has been delivered or is underway. It is a chance for stakeholders to see the vision of the product and inform the builders whether that is what they had envisioned. It is not to make teams feel bad for missing forecasts or focussing on what slipped.
It is not to evaluate the project for hitting an arbitrary timeline or metric of completed user stories, especially since we could game that metric very easily by taking one large story and splitting it into 50 one-line stories. Suddenly the team has hit 98% productivity and only one user story remains, saying "deploy"... (Don't do this. I am just showing how arbitrary Agile metrics can be manipulated.)
Or can we agree a set of PBI and achieve those by keeping demo as optional (do if end to end development completed)?
From Scrum.org: Sprint Review is held at the end of the Sprint to inspect the Increment and adapt the Product Backlog if needed. There could have been a single deployment or many deployments during a Sprint which lead up to that Increment to be inspected.
- The Product Owner explains what Product Backlog items have been “Done” and what has not been “Done”
- The Development Team discusses what went well during the Sprint, what problems it ran into, and how those problems were solved
- The entire group collaborates on what to do next, so that the Sprint Review provides valuable input to subsequent Sprint Planning;
- A collaborative discussion about timelines, budgets and other product/project administration if appropriate
Other Questions that an Agile Coach or a Scrum Master should ask
- Do you have a Definition of Done? If so, what does it say and is it valid and when was it last updated?
- Do you have play-ready stories, and have they been suitably broken down into small enough chunks to ensure sprint success?
- Do you have all of the skills within the team that you need?
Unfortunately this post has become less about giving you the answers and more a chance for you to start framing the problem in a way that a Scrum Master or an Agile Coach would help you work through it.
As always; the first line support for your question can be found in the Scrum Guide and from your Scrum Master. Second line support should be your organisational Agile Coaches. You cannot go wrong simply reading the Scrum Guide and, if the advice is too difficult, asking why it is difficult within your organisation and project.
answered 9 hours ago
Venture2099Venture2099
2,455827
Forget about Demo.
Before downvotes start to fly, bear with me.
As Venture mentioned (+1!), you have foundation problems that should be addressed before you get concerned about a Demo. I'll bet you come from a Company where senior management heard that Scrum is the next big thing and you should be having Sprints and Demos and Daily meetings to grab a few awards, amirite?
Jokes aside, I hope I'm wrong. Anyway, back to the main answer - before focusing on Demos, review what's required to deliver value at the end of the iteration.
- You may need to get rid of some activities if they're not helping you deliver (such as focusing on how to have excellent ceremonies without delivering actual value).
- You may need more automation (on testing, on deployment, whatever).
- You may need more contact with the business to understand the minimal delivery you can offer in the shortest timespan. Business should work closer to IT, and IT should understand the business.
- You may need to review how to establish a cadence to deliver and keep a good pace. You may need bigger iterations, as you're probably working with a 10-year-old juggernaut that has been patched over and over.
Once you have covered all the above, you'll potentially be working with an agile team (compare the above to the Agile values for a more consistent idea). Then you can think about evolving into a specific framework, such as Scrum. Trying to become agile by implementing Scrum ceremonies is like starting a house from the roof. So... forget about the Demo. And tell senior management to forget about zero-value awards. At least for now.
Thanks for the comment! I agree that we are yet to leverage test automation and improve the development activity. We are focusing on that as well. I believe I need to clarify a few things here. We are developing an ordering web application, for which we need to fetch different data elements from different databases and process them for presentation in the UI. For this we may (not always) need more than 5 working days to complete development and testing. This means for a few sprints, when we plan, we agree that there won't be any visual demo to stakeholders (since development is not completely done).
– user34755
1 hour ago
That's exactly the kind of scenario where the sprint should be longer... Otherwise you'll stick to the one-week iteration for the sake of saying you're agile, while barely scratching the real benefits of it. If it makes you feel a bit better, I'm more or less in the same situation :)
– Tiago Cardoso♦
1 hour ago
"Demo in Sprint review is optional", Is this statement correct?
– user34755
1 hour ago
answered 8 hours ago
Tiago Cardoso♦Tiago Cardoso
4,79821651
Demonstrations are Required; Visual Demos are a Best Practice
Scrum requires that work be demonstrated during the Sprint Review. The Scrum Guide says:
The Development Team demonstrates the work that it has "Done" and answers questions about the Increment[.]
While the framework doesn't require hands-on or visual demonstrations, it's widely considered a best practice. Such demonstrations are generally the most effective way to present useful product information to stakeholders, and to solicit their feedback. Thinking about how you might demonstrate the increment forces the Scrum Team to take a test-first (or perhaps "demo first") approach to the work during Sprint Planning.
The Scrum Guide
The Scrum Guide defines the Sprint Review as follows:
A Sprint Review is held at the end of the Sprint to inspect the Increment and adapt the Product Backlog if needed. During the Sprint Review, the Scrum Team and stakeholders collaborate about what was done in the Sprint.
While it has more to say, the underlying goal of this Scrum event is to present completed work to stakeholders, and to gather feedback. Perhaps more importantly, the work presented should be in a potentially-shippable state. The Scrum Guide goes on to say:
At the end of a Sprint, the new Increment must be "Done," which means it must be in useable condition and meet the Scrum Team’s definition of "Done". An increment is a body of inspectable, done work that supports empiricism at the end of the Sprint. The increment is a step toward a vision or goal. The increment must be in useable condition regardless of whether the Product Owner decides to release it.
So, almost by definition, the contents of the Sprint Review should contain usable feature increments. While there are occasionally products or individual Sprints where visual or visible progress can't be demonstrated, it is more likely that the Backlog Refinement and Sprint Planning processes need work.
How to Fix Your Processes
In general, "we can't demo our increment" is a project smell that indicates that Backlog Refinement and/or Sprint Planning are not being done in a truly agile way. In many cases, these can be boiled down to a few common anti-patterns with common solutions.
The Product Backlog is not properly ordered.
It's hard to work incrementally when the backlog is an unordered list of work items and specifications, rather than a coherent set of features that represent an increment of value to the value consumer of a user story (e.g. the "As a [role], I want..." part of the Connextra format).
Help the Product Owner refactor the Product Backlog, and make sure that any user stories near the top of the backlog are broken down into smaller stories. These prioritized stories should each fit comfortably within a single Sprint, and ideally be much smaller.
User stories don't meet INVEST criteria.
Each user story should be:
- Independent
- Negotiable
- Valuable
- Estimate-able
- Small
- Testable
The last is actually the most important, and feeds directly into the advice to demonstrate the increment at the end of each Sprint. By thinking about how to test each story to verify that it meets your Definition of Done, you ensure that the team takes a test-driven approach to design and implementation. In addition, by designing the tests first during Sprint Planning, you have the ability to bake the presentation of the work into your plans and estimates.
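To make the test-first idea concrete, here is a minimal sketch (the story, function name, and numbers are entirely hypothetical illustrations, not anything from this project) of an acceptance test written during Sprint Planning, before the real implementation exists:

```python
# Hypothetical acceptance test drafted during Sprint Planning for the
# story: "As a shopper, I want to see my order total so that I know
# what I will pay." All names and values are illustrative.

def calculate_order_total(item_prices, tax_rate):
    """Toy stand-in for the feature under development."""
    subtotal = sum(item_prices)
    return round(subtotal * (1 + tax_rate), 2)

def test_order_total_includes_tax():
    # Given a cart with two items and a 10% tax rate
    total = calculate_order_total([10.00, 5.00], tax_rate=0.10)
    # Then the displayed total includes tax
    assert total == 16.50

def test_empty_cart_totals_zero():
    # An empty cart should total zero regardless of tax rate
    assert calculate_order_total([], tax_rate=0.10) == 0.0

if __name__ == "__main__":
    test_order_total_includes_tax()
    test_empty_cart_totals_zero()
    print("acceptance tests passed")
```

Writing the assertions first gives the team a ready-made script for the Sprint Review: running the tests (or walking through them against the UI) is the demo.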
Sprint Planning for each increment lacks a cohesive Sprint Goal.
Sprint Goals are not optional. The framework requires them. The Scrum Guide says:
During Sprint Planning the Scrum Team also crafts a Sprint Goal. The Sprint Goal is an objective that will be met within the Sprint through the implementation of the Product Backlog, and it provides guidance to the Development Team on why it is building the Increment.
In addition, work selected by the Scrum Team for the Sprint should provide coherence:
The Sprint Goal gives the Development Team some flexibility regarding the functionality implemented within the Sprint. The selected Product Backlog items deliver one coherent function, which can be the Sprint Goal. The Sprint Goal can be any other coherence that causes the Development Team to work together rather than on separate initiatives.
So, the goal of a Sprint is never to just "do all the work." It must always be to deliver a functional increment of value. Many new Scrum Teams could solve a large host of framework implementation problems by ensuring that they always start each Sprint with a well-defined, measurable Sprint Goal.
The team and stakeholders lack a Definition of Done.
When a Scrum Team lacks a Definition of Done, it can't really meet stakeholder expectations about the functionality or quality of an increment. More importantly, it can't test "doneness" or communicate effectively about scope. Progress on the project then becomes a qualitative, rather than quantitative, metric. This leads to common pitfalls like saying "60% of the stuff is 40% done," which in the real world means the project is doomed.
The team must develop a test-first mentality for its work product, and codify those expectations into a testable (or at least "checklistable") format. This definition should also include criteria from stakeholders, so that everyone agrees on what done really looks like.
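As a sketch of what a "checklistable" Definition of Done might look like, here is one possible encoding; the criteria and the story data are illustrative assumptions, not a prescribed standard:

```python
# Hypothetical Definition of Done expressed as a checklist that can be
# evaluated mechanically. The criteria below are examples only.

DEFINITION_OF_DONE = [
    "code_reviewed",
    "unit_tests_pass",
    "integration_tests_pass",
    "deployed_to_staging",
    "acceptance_criteria_verified",
]

def is_done(story):
    """A story is Done only when every DoD criterion is satisfied."""
    return all(story.get(criterion, False) for criterion in DEFINITION_OF_DONE)

story = {
    "title": "Show order total",
    "code_reviewed": True,
    "unit_tests_pass": True,
    "integration_tests_pass": True,
    "deployed_to_staging": True,
    "acceptance_criteria_verified": False,  # demo not yet run
}

print(is_done(story))  # one unmet criterion means the story is not Done
```

The point is not the code itself but the discipline it enforces: "done" becomes a binary check against agreed criteria rather than a matter of opinion.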
The Scrum Team is not working iteratively.
In most agile frameworks, work should be both incremental and iterative. The ideal unit of work in Scrum is a thin, vertical slice of functionality that can be developed, tested, and delivered within a single Sprint. When the work is not sized or sliced correctly, you lose essential benefits of the iterative/incremental development model and are essentially doing waterfall within an ersatz Scrum.
If you practice test-driven Sprint Planning, this issue will become much more visible. Implementing Sprint Goals and ensuring Product Backlog Items meet INVEST criteria will solve a lot of the sizing and scoping problems you're experiencing.
Sprint boundaries are not being respected.
While some advanced practitioners advocate decoupling delivery from deployment, Scrum is a time-boxing framework. Scrum Teams, especially new ones, often struggle with explicit time boxes and the iterative nature of the work. This leads to Sprints that lack clear boundaries for the work, and a lack of independence (remember the INVEST criteria?) between features or backlog items.
The absolute best advice for any new Scrum Team is to respect the Sprint boundaries. Each Sprint Goal should be something the Scrum Team thinks it can deliver within a single Sprint, and all the work the Development Team accepts into the Sprint should ideally be only a day or two's worth of effort. In no event should a Sprint Backlog item ever exceed the length of the current Sprint!
Work near the top of the Product Backlog should be decomposed into small, independent, testable user stories during Backlog Refinement. Any epics or user stories that won't fit within the Development Team's capacity estimate for the current Sprint should be decomposed further or renegotiated with the Product Owner during Sprint Planning.
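As a minimal illustration of the sizing check described above (the item names and day estimates are hypothetical; the five-day capacity comes from the one-week Sprint in the question):

```python
# Hypothetical sketch: during Backlog Refinement, flag any item whose
# estimate exceeds the team's capacity for the current Sprint, so it
# can be decomposed or renegotiated before Sprint Planning.

SPRINT_CAPACITY_DAYS = 5  # one-week (5 working day) Sprint

backlog = [
    ("Filter reports by date", 2),
    ("Cross-database customer view", 8),  # larger than the Sprint: split it
    ("Export report as CSV", 1),
]

needs_decomposition = [
    name for name, estimate in backlog if estimate > SPRINT_CAPACITY_DAYS
]
print(needs_decomposition)  # ['Cross-database customer view']
```

Anything that lands in `needs_decomposition` is a candidate for slicing into thinner, independently testable stories before the Sprint begins.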
Conclusion
Your current product roadmap and architecture do not appear to be test- or behavior-driven. The result is work that is hard to time-box, and almost impossible to work on iteratively/incrementally.
You should leverage the Sprint Retrospective and other Scrum events to identify which common implementation errors your process may currently have, and collaborate with the team and with stakeholders to correct them. By implementing the framework properly, and by adopting widely-accepted best practices such as test-driven design and just-in-time planning, you should be able to reduce the project's friction points significantly.
Thanks for the response! We are developing components that take more than 5 working days to fetch data (multiple data entities from multiple databases) and process it for presentation in the UI. Hence, with a 1-week (5-day) Sprint, we are unable to give a visual demo to stakeholders.
– user34755
2 hours ago
@user34755 Then you may need longer Sprint cycles. You either need to reduce work to fit your Sprints, or increase Sprint lengths to fit your work. If what you're doing isn't working, you need to change something.
– Todd A. Jacobs♦
43 mins ago
answered 2 hours ago
– Todd A. Jacobs♦