According to a study by CEB Research, 70% of software projects are delivered on time, but only 38% meet stakeholders' expectations. In most cases the people who use your software aren't even aware of internal project deadlines. Case in point: think of all the software you use every day. You're rarely aware of what its due date was. What you do notice, however, is how well the software does its job. If it's of low quality, you'll very likely notice right away. Even if it appears to be of high quality, if it doesn't provide any real value to you, chances are you'll stop using it anyway.
With that in mind, why do so many software projects today have a set scope and a hard due date? When building software for a customer, it's only natural for them to want to know exactly what they're going to get and what it will cost them. That is a very reasonable thing to want to know. Even when developing software internally, there are often similar expectations.
Let's take a step back for a moment before attempting to answer that question. When I first worked as a software developer over 15 years ago, software was created in a much different manner. Projects typically lasted anywhere from a few months to well over a year. Before development began, a detailed requirements document was created. These documents were often over 100 pages long and attempted to capture every possible detail of what was needed. Nothing, it was thought, would be unclear or left to the imagination. Project managers would create detailed project plans with a breakdown of everything necessary to create the software. Each task was estimated and laid out in the plan. The customer usually didn't see the software until it was close to being finished. Because the scope and due date were agreed to ahead of time, changes were not allowed. If a customer did want a change, an additional contract had to be drawn up to cover it, along with corresponding estimates and plans.
Needless to say, this is not how we develop software anymore, and the fact that the approach has been abandoned is a good indicator of how well it worked. Instead, today we develop software in short iterations, and change is embraced and accepted. This is a much more efficient way to create software, and it usually results in a better product. Simply put, the older approach to creating software was severely flawed.
The problem is that while we've evolved how we develop software, in some ways we're still holding on to the old way of thinking. If we accept that there are many unknowns in software development, we also need to accept that we can't reasonably know when a set of features will be delivered. The desire to know those things is understandable, but one of the realities of life is that we can't always have what we want.
Going back to the 70% figure, it's clear that when a due date is the ultimate measure of success, we're often able to meet that goal. However, the 38% figure also shows that we're often cutting corners to meet those dates. The result is frequently software that isn't built reliably, isn't maintainable, or has features that are met in name only but don't do the job well.
A better approach is to focus on the value side of the equation. Start with the most valuable features first, and build them with a level of quality that meets or exceeds expectations. Instead of starting a project with both a set list of features and a set date, choose to fix only one of the two. For example, if you need to deliver a product in time for the holidays, pick a set date. Accept that while you will have some features in place by that date, the total number of features delivered will be unknown. Alternatively, you can fix the set of features. For example, say you're developing an app that has three critical features that must be included for an MVP. Focus exclusively on those features. While you won't know the specific date they'll be completed, you will know that when they are, they should meet your expectations.
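The two options above can be sketched in code. This is a minimal, hypothetical model, not anything from the article: it assumes a simple backlog where each feature carries a rough value score and effort estimate, and shows how fixing the date leaves scope open, while fixing the scope leaves the date open.

```python
# Hypothetical sketch of the two planning modes described above.
# All feature names, values, and week counts are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    value: int    # relative business value (higher = more valuable)
    weeks: float  # rough effort estimate

backlog = [
    Feature("checkout", value=10, weeks=3),
    Feature("search",   value=8,  weeks=2),
    Feature("wishlist", value=3,  weeks=2),
    Feature("reviews",  value=5,  weeks=4),
]

# Work on the most valuable features first, as the article suggests.
by_value = sorted(backlog, key=lambda f: f.value, reverse=True)

def fixed_date(features, weeks_available):
    """Fixed date, variable scope: ship whatever fits by the deadline."""
    planned, used = [], 0.0
    for f in features:
        if used + f.weeks <= weeks_available:
            planned.append(f.name)
            used += f.weeks
    return planned

def fixed_scope(features, must_have):
    """Fixed scope, variable date: estimate when the critical set is done."""
    return sum(f.weeks for f in features if f.name in must_have)

# Mode 1: a holiday deadline six weeks out; scope is whatever fits.
print(fixed_date(by_value, weeks_available=6))            # ['checkout', 'search']
# Mode 2: three must-have MVP features; the date is whatever they take.
print(fixed_scope(by_value, {"checkout", "search", "reviews"}))  # 9.0
```

Either way, one variable is fixed and the other is allowed to float, which is the trade-off the article argues for.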
Keep in mind that neither of these approaches guarantees success, but they do ensure that you're focusing on what matters. If enough time passes and you're not happy with the results, it may be time to reevaluate the project. The reality is that no matter what you do, nothing can guarantee success. What this approach does is keep your attention on what's important: the quality and value of the software.
At the end of the day, the users of your software are the ones who will determine its success. Perhaps in some circumstances mediocrity is okay, and the "guarantee" of a contract is more appealing. But if what you want is to maximize the value of the software, it's best to take a more flexible approach. There are many times when something that just works isn't good enough. Perhaps you need a product that sets itself apart from the competition. Perhaps you want to create something that is more than just good enough; something that exceeds expectations. When that's the case, it's time for a different approach.
Comments
But it's NEVER "over," because both are part of the strategy for success and both are on the balance sheet: the asset's recorded value and the related cost of that asset, with the "booked value" (the cost) being the net value carried on the balance sheet.
With a business view, writing software for money means "accounting" for the cost of producing that value and recording the net value produced from that spend.
When we speak about this being "over," it may be appropriate in some cases, but when spending other people's money - not likely.
If we look at how software was developed in the past, detailed plans were defined up front, before the project started. Over time it was determined that it was better to break software projects into short iterations. The main reason is that the up-front planning of software projects was not very effective, so people found a better way.
There seems to be an inconsistency between how we develop software and how most organizations do planning. Perhaps a more flexible and adaptive approach to planning is preferable. When we speak of "#NoEstimates" it's basically raising that point.
In terms of when such an approach would be appropriate, it's really more about when an org is willing to try something different. Yes, there will be risk involved but there's risk involved whenever people try new things. The people that created Scrum incurred risk when they used it on their first project.
This is the approach we use in Enterprise and Software Intensive Systems. Your description is literally "not allowed" in several of our acquisition paradigms. This process has not yet arrived in smaller IT shops.
Until those originally suggesting the hashtag show explicitly how these words are no longer in place - "#NoEstimates is a hashtag for exploring alternatives to estimates (meaning estimates are replaced with something else) for making decisions in software development. That is, ways to make decisions with 'No Estimates.'" - it seems there is no exploration, only displacement: "We can make decisions without estimates."
Since there are no examples forthcoming that can be tested outside of personal anecdotes, this conjecture - "you can make (credible) decisions (in the presence of uncertainty) without estimates" - is just that: an untested conjecture.
Ignoring for the moment the violation of the microeconomics of decision making from the finance point of view, there appear to be no actionable suggestions for how to decide between multiple choices in the future without making an estimate of the cost and the value returned from that cost.
This is basic business management, having little to do with those spending the money to write code.
Your suggestion of the "org willing to try" does not address the actual business process. What is the value at risk that the org is willing to experiment on? There is certainly a VaR where this is possible. The OP'ers have yet to address this issue as well.