Lt. Commander Geordi La Forge: Look, Mr. Scott, I'd love to explain everything to you, but the Captain wants this spectrographic analysis done by 1300 hours.
[La Forge goes back to work; Scotty follows slowly]
Scotty: Do you mind a little advice? Starfleet captains are like children. They want everything right now and they want it their way. But the secret is to give them only what they need, not what they want.
Lt. Commander Geordi La Forge: Yeah, well, I told the Captain I'd have this analysis done in an hour.
Scotty: How long will it really take?
Lt. Commander Geordi La Forge: An hour!
Scotty: Oh, you didn't tell him how long it would really take, did ya?
Lt. Commander Geordi La Forge: Well, of course I did.
Scotty: Oh, laddie. You've got a lot to learn if you want people to think of you as a miracle worker.
This might work in Starfleet, but the problem is that if you repeatedly overdeliver on your promises by overestimating, a manager will start to assume you are padding your estimates.
Having worked on scrum teams before, I find it's a good, structured way to have a dialog about how much work is left and how long you expect it to take.
So rather than saying "an hour," you would say this is as easy as changing a diaper, or as hard as navigating through an asteroid belt at the speed of light (whether or not you've done it before).
I would appreciate it if managers asked:
Have you done this before? (yes, no)
How similar is this to something you have done before? (very similar, kinda similar, totally new)
What concerns you about the task, or what risks do you see coming your way? (detailed answer)
What can I do to get you the resources you need to help you deal with the risks you can foresee now?
> the problem is that if you promise a manager something
The real problem is thinking of "estimates" as "promises" when really they're rough guesses. If you go around holding people to a guess as if it were some kind of gospel biblical contract then yes, they will lie their ass off to protect themselves. Combine that with the fact that most organizations attempt to control the platform used, computers used, monitor size, operating system, IDE, editors, revision control tools, languages, documentation systems, testing methodology, meeting requirements, pairing, and nearly everything they can, and you start to see the real problem is....
managers who don't know programming, motherfucker. :-)
Managers want promises because execs need promises because the board is critiquing their job performance based on when they ship the next product.
The board wants promises because they want to know when they get paid.
If you find it stressful that people are turning your vague estimate into a promise, imagine having to make promises on other people's vague estimates.
"At that meeting, SAP AG executives and engineers represented that the software was a mature solution and conducted a demonstration consisting of what they represented was the actual SAP Waste and Recycling software," the complaint states. The company later discovered that the software was a "mock-up version of that software intended to deceive Waste Management," according to the complaint. SAP has admitted to this in "internal documents," the complaint states.
"From the beginning, SAP assured Waste Management that its software was an 'out-of-the-box' solution that would meet Waste Management's needs without any customization or enhancements," the statement reads. "Unfortunately, Waste Management ultimately learned that these representations were not true."
You are in part outlining the start of a standard methodology for project time estimates. That's the good news. The bad news is that for the "never done this before" and "totally new" cases, the methodology produces no estimate at all!
Time estimation is an industrial way of thinking applied to a post-industrial world.
In the post industrial world time isn't the problem but rather project definition and scoping.
In the industrial world the problem was already solved (machine was built, market often established and output depended on a few factors that could be adjusted. Need more output add more of X)
In the post industrial world every project is about problem solving and scoping.
To put it in perspective: if we applied post-industrial thinking to the industrial world, it would mean that every time a product needed to be made, if not the factory then at least the machines would have to be developed from scratch.
It will take many, many years for time estimation to die, but it will happen.
What's needed may just be to solicit a wide confidence interval (e.g. "this task will take between 20 minutes and 1 month, and I'm right 90% of the time recently when making such estimates"). You can drop the lower-bound part, probably.
People absolutely should be able to provide a rough estimated amount of effort for a task. The trouble is in using a single point to describe the whole probability distribution. You may be right, in that nobody seems to have a great way of soliciting a probability distribution (or even a single probability) that makes sense. Something like a 70% confidence interval bracketing the amount of effort would be useful, but not sufficient.
I also agree that even if people could describe their beliefs about the required effort rigorously, you have to wonder how much planning/analysis they should spend trying to come up with a tight estimate. An 'outside view' - http://wiki.lesswrong.com/wiki/Outside_view - could give something reasonable in cases that aren't too novel.
I think estimates will never become useless. It's just that we may decide to replace them with "time by which we'll be late only 1/6 of the time", and provide feedback and incentives for people to correct this estimate so that it's, for someone who's experienced in the domain, eventually relatively unbiased.
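One way to operationalize "late only 1/6 of the time" is to correct raw estimates with a quantile of your historical actual-to-estimate ratios. A minimal sketch; the history data and the 10-hour gut figure are invented for illustration:

```python
# Derive a "late only ~1/6 of the time" quote from past (estimate, actual) pairs.
history = [(8, 10), (4, 9), (16, 14), (2, 5), (40, 90), (8, 8), (12, 30)]  # hours

ratios = sorted(actual / estimated for estimated, actual in history)

def quantile(sorted_xs, q):
    """Nearest-rank quantile of a sorted list (0 < q <= 1)."""
    index = min(len(sorted_xs) - 1, int(q * len(sorted_xs)))
    return sorted_xs[index]

# Correction factor such that roughly 5/6 of past tasks finished within it.
factor = quantile(ratios, 5 / 6)

raw_estimate = 10  # hours, the developer's gut figure
print(f"quote {raw_estimate * factor:.1f} hours to be late only ~1/6 of the time")
```

Feeding real ratios back into a table like this is exactly the kind of feedback and incentive loop the comment above imagines.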
That is hard to say and it would be a book worthy to explain what could come next. All I know is that it is unsustainable.
The complexity is simply too high and it's not getting better. One of the reasons I think why you see the fail fast movement be so successful.
Once you accept that failure is part of the process, once you abandon the "zero mistake" policy that many large organizations instill internally and externally you will begin to approach projects differently.
The truth is that "zero mistake" organizations make as many mistakes as everyone else, they just have the financial strength to ignore them as long as economy of scale works in their favor.
I could write forever about projects that went wrong not because the developers were bad but because the premise that fuels product development is broken.
I blame primarily business schools and large parts of academia for this. But it could extend all the way into the way the stock market is structured.
If you buy my premise that the post-industrial age is different from the industrial age, and that project definition is primary and time secondary today, then it puts some doubt, at least in my mind, on whether the stock market's focus on growth and quarterly results is sustainable.
Nature seems to do a good job of pacing various processes. It takes nine months to give birth to a child. One cell at a time. But the process is ongoing.
Nature is the ultimate continuous deployment strategy.
It sounds like you've thought deeply about this - have you written more on your ideas here elsewhere? Would be interested to hear how this would work practically as well as your ideas on how the focus on perpetual growth hinders successful project delivery.
I'd love to see you write more on this as well. You've clearly thought about it prior to this conversation in a way that I don't think a lot of us have.
I think you will find reading Hayek and Coase to be rewarding.
Hayek introduced the idea that no one individual planner can make flawless plans because the information to do so is widely dispersed; instead success or failure in the market gives rapid feedback about how to allocate resources. Indeed the market is a discovery process that unearths this information in a way that a planner never could[1].
Coase asked the question: if this is so, how do large firms emerge? He identified the cost of transactions to be the key. The higher the transaction costs, the higher the cost of loosely coupled economies. The size of the firm is thus based on the relative costs of transactions (searching, identifying, validating, negotiating etc) vs the inevitable waste caused by planning.
In essence, large companies are islands of command economics in the sea of the market.
[1] In a related argument, Mises said that even if a planner could know all the variables the resulting problem would simply be too large to solve rationally.
So things like prototyping, UML, use cases, and Agile were not a "serious attempt" to answer that? I think it's also unwise to tell your manager "it'll be done when it's done."
Largely those are things done by developers, not by the business. Perhaps in cases where the business is on board with them (eg. Agile and rapid iterations), but mostly they'd rather handwave and hope for the best.
Instead of replacing estimation, I would say that you should embrace the problem in estimates - uncertainty.
Convey the uncertainty you have about a task to those you work with, and then you can start to factor in the risks of uncertain tasks.
I build project management software for a living at LiquidPlanner. Everything we do, be it development, design, or marketing, is based on ranged estimates. We don't always get the estimate right, but that's the beauty of a range: it takes into account the fact that you will miss some of the time.
The other thing that can replace an estimate is... lots of estimates. If you're planning something, update your estimates as you gain more knowledge about the problem.
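A minimal sketch of that "lots of estimates" idea: re-project the total from the pace observed so far. All of the figures (and the honesty of `fraction_done`) are invented:

```python
# Refresh an estimate from observed pace (all figures invented).
original_estimate = 40.0  # hours originally quoted for the whole feature
hours_spent = 12.0
fraction_done = 0.25      # honest guess at how much is actually finished

# Project the pace so far over the remaining work.
projected_total = hours_spent / fraction_done
remaining = projected_total - hours_spent

print(f"was {original_estimate:.0f}h; revised total {projected_total:.0f}h, {remaining:.0f}h to go")
```

The hard part, of course, is the honest `fraction_done`; the arithmetic is trivial once you have it.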
I used to work for big oil, and they would have 100-200 million dollar projects. Not software projects, actually building plants. Similar to software development, these were basically prototypes. Yes a lot of the technology was known, but they weren't exactly building cookie-cutter houses.
So the first round of estimates would be something crazy like +/- 80%, both money and timeframe.
If that seemed good, then they'd fork over the money to get more specifications and more details, and come up with a new estimate. Maybe +/- 50%. Then they'd re-evaluate the viability of the project. And do that a few more times, spending more money and time each iteration, before they actually committed to the project or dumped it. At that point, the estimates were quite accurate.
So basically, the software version. How long will that feature take?
An experienced developer can spitball an answer. About a week. Then they start working on it. After a day-and-a-half, they're going to have a better idea if that's actually a three day project or a two-weeker or if it ain't gonna happen.
Or maybe they need a day-or-two or a week to do a spike to be able to provide that estimate. All well and good if you let them do the spike. Not so good if you demand an estimate when the developer has clearly said he has no idea.
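The gate pattern from the oil-plant example can be sketched in a few lines: each study phase buys a tighter range before anyone commits. The base figure and uncertainty levels are illustrative, not real project data:

```python
# Phase-gated estimation: re-estimate with narrower uncertainty at each gate.
base_cost = 150  # current best guess (in $M for a plant, or days for software)
gates = [0.80, 0.50, 0.20]  # +/- uncertainty remaining after each study phase

for phase, uncertainty in enumerate(gates, start=1):
    low = base_cost * (1 - uncertainty)
    high = base_cost * (1 + uncertainty)
    print(f"gate {phase}: {low:.0f} to {high:.0f}")
```

A day-and-a-half of work or a spike is, in effect, paying for passage through the first gate.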
I absolutely agree that interval arithmetic is the best way to deal with this, but accountants and their budget systems always require estimates to be reduced to fixed figures.
Actual effort to complete can be quite surprising, but effort spent per day also varies tremendously. For example, when things are going well, there may be a burst of joyful effort.
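For estimates, a minimal version of the interval arithmetic mentioned above only needs addition and scaling. The task names and figures below are invented:

```python
# Minimal interval arithmetic for estimates: only addition and scaling.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, k):
        # Scale by a non-negative fudge factor.
        return Interval(self.lo * k, self.hi * k)

coding = Interval(8, 16)       # hours
integration = Interval(4, 24)  # "nothing works"
testing = Interval(6, 20)      # "everything is broken"

total = (coding + integration + testing) * 1.2  # 20% overhead factor
print(f"{total.lo:.0f} to {total.hi:.0f} hours, not a fixed figure")
```

Naively summing worst cases does overstate the total when tasks are independent, but for communicating uncertainty to a budget system that's arguably a feature.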
A big part of the problem is that management often isn't interested in honest estimates.
You can certainly find better practices in some organizations, but in many organizations, a certain team gets tasked with completing a certain assignment by an arbitrary deadline and no attempt is made to reconcile the triple constraints with reality.
In a context like that, developers may not be asked to make estimates, and if they do make estimates, they don't believe they'll be taken seriously. So they never get the chance to learn how to estimate.
The accuracy of estimation also depends on the context.
If, for some reason, I had to fix a difficult-to-reproduce bug in a large system written in COBOL that runs on a mainframe, I'd have very little idea of how long it would take me to learn COBOL, understand the codebase, figure out the tools, and track the problem down. A COBOL jock would obviously do better.
On the other hand, I've worked on greenfield well-specified business CRUD apps based on a correctly aligned framework where I could estimate that something would take 21.25 hours and I'd really get it done in 19.75 or 22.50.
I've met developers who will absolutely refuse to estimate anything but I think more often developers don't believe their estimates will be listened to.
The good news for both developers and managers is that estimation is a learnable skill. If you get in the habit of making estimates and testing them, you'll get better amazingly fast. See the classic book
> The good news for both developers and managers is that estimation is a learnable skill. If you get in the habit of making estimates and testing them, you'll get better amazingly fast.
To an extent, though, the better you are at programming, the worse you'll be at estimating.
A bad programmer, given a one-week task similar to one he's done before (when it took him two and a half weeks), will estimate it at two weeks and do it in two weeks, plus or minus a bit.
A good programmer given the same task will try to apply the previous solution, or use a library that solves the problem, and usually get it done in two to four hours. But some of the time that won't work, and then they will spend three days doing what would have taken the bad programmer two weeks, and a fourth day generalizing it so that when they have to do a similar task the third time, they can do it in two to four hours.
The good programmer's estimate will, therefore, frequently be off by a factor of ten or more.
In short, the better you are at automating what you've previously done, the more time you spend doing things you haven't done before, and so the more uncertain your estimates are.
The central limit theorem says that if you add up enough independent random variables from whatever distribution, their sum will eventually start to look like a Gaussian normal distribution. Unfortunately, this isn't as useful as you might think for software project estimation, both because different tasks aren't independent and because when the underlying distribution is heavy-tailed (e.g. lognormal, exponential, etc.) "enough variables" can be much larger than the number of tasks in your project: in the hundreds or thousands.
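A quick Monte Carlo (stdlib only) illustrates the point: with heavy-tailed lognormal task times, even a 20-task project total is still visibly right-skewed rather than Gaussian. The parameters are arbitrary:

```python
# Monte Carlo: sums of heavy-tailed (lognormal) task times stay right-skewed.
import random
import statistics

random.seed(0)
N_TASKS = 20          # tasks per project
N_PROJECTS = 10_000   # simulated projects

totals = [
    sum(random.lognormvariate(mu=1.0, sigma=1.0) for _ in range(N_TASKS))
    for _ in range(N_PROJECTS)
]

mean = statistics.mean(totals)
median = statistics.median(totals)
# For a Gaussian, mean and median coincide; a gap signals a lingering right tail.
print(f"mean {mean:.1f} vs median {median:.1f}: still right-skewed")
```

In practical terms: a project built from tasks like these blows through its "typical" total far more often than a Gaussian intuition would predict.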
> The good news for both developers and managers is that estimation is a learnable skill. If you get in the habit of making estimates and testing them, you'll get better amazingly fast. See the classic book
It's not hard to learn to make reasonable estimates. Even for unknown problems, I can often say that X has usually taken about Y time in the past and be right most of the time.
The real problem comes when the estimate you give is not the one management wants to hear.
Or when you're tasked to do something you've never done before, and your initial thoughts of "I bet y is like x" turn out to be terribly wrong.
Something so seemingly trivial as a different API can make two similar tasks take vastly different amounts of time, especially if you're approaching the second problem with the mindset induced by the first API.
"If you ask Fred, “When will you be done?” have you asked for an estimate or a commitment? What does Fred think you asked for? If Fred says, “Two weeks from today,” has he given an estimate or a commitment? What might happen if you want a commitment and Fred thinks you want an estimate? What might happen if you want an estimate and Fred thinks you want a commitment? How could you make it crystal clear whether you’re asking for an estimate or for a commitment? How could you make it crystal clear whether Fred is giving an estimate or a commitment?"
I like to think of myself as someone who has a lot of common sense, but reading McConnell's "Software Estimation" made me feel like an idiot. He lays out the most important stuff right at the start of the book: an estimate is not a commitment is not a plan to meet a target. Reading this, I realized that where I work, the term "estimate" is often used to mean any one of those three things at different times, but most often is used to mean "commitment." Not knowing any better, I fell right in line.
McConnell's basic strategy for delivering an estimate (give a minimum, a maximum, and an "expected time") is great for so many reasons, but the reason I like it the best is for its communication value - by giving three numbers, none of them being an "I can have it done by..." number, you reinforce the idea that you are delivering an estimate and that the point of delivering that estimate is to help the project manager control the project. If it turns out that the PM is lazy and they really do just want a commitment ("sure, sure, whatever, when will it be done?!"), you can adjust your communication strategy accordingly.
I'd say that people can't estimate time because it's an acquired skill and most people don't try to acquire it.
When I managed a team, I had people put their initial estimates in the tracking tool, and when closing out the item they'd also fill in how long it actually took. Then, some of the report generation tools would let them see how accurate they were. There were no review metrics associated with being accurate, but I found that within a couple of months in a new area, individuals started getting much more accurate without just lamely padding out schedules (their teammates would have called them out, and even I wasn't pointy-haired enough to be fooled that easily). An unexpected bonus is that people also got a lot better at describing their work and investigating the risky bits before throwing something onto the schedule for the sprint.
But, we didn't do this for purely investigative or experimental work (i.e. "try out a new immutable text region design in the editor"). We'd just timebox work like that and evaluate progress to decide whether to keep going or not.
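The feedback loop described above can be sketched roughly: record estimate vs. actual per closed item and report each person's ratio. The names and hours below are invented:

```python
# Log estimate vs. actual per closed item, then report accuracy per person.
from collections import defaultdict

closed_items = [
    # (person, estimated hours, actual hours)
    ("ana", 8, 12),
    ("ana", 4, 4),
    ("ben", 16, 40),
    ("ben", 2, 3),
]

by_person = defaultdict(list)
for person, estimated, actual in closed_items:
    by_person[person].append(actual / estimated)

for person, ratios in sorted(by_person.items()):
    average = sum(ratios) / len(ratios)
    print(f"{person}: actual/estimate averages {average:.2f}x over {len(ratios)} items")
```

Even this crude report makes padding visible: a consistently low ratio is as telling as a consistently high one.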
I prefer to take my advice from someone who has demonstrated the ability to repeatedly run projects and estimate them in advance to surprisingly high accuracy. I therefore recommend Software Estimation by Steve McConnell, available at http://www.amazon.com/Software-Estimation-Demystifying-Pract....
If you like Steve McConnell's work, you should take a look at http://www.LiquidPlanner.com, we built our entire product around the notion of ranged estimation so that your schedule actually captures some of uncertainty involved in estimation.
If you give an estimate that's too far out, it'll be outside your client's planning horizon. That means it's so far out she doesn't know what will happen by then, so she won't do the project. In my experience, this horizon is roughly 9-12 months, unless you're dealing with a MegaCorp.
So most estimates end up at 6-9 months, which then slip into 9-12 territory, and hopefully not much further =)
In my experience, coding time is usually easy to estimate, and that's what most estimates coming from developers should be taken to estimate. But then you have integration (nothing works), testing (everything is broken), and then realizing what the client meant when she said X (solved the wrong problem, optimized for wrong features), those are the tough ones.
You should take the estimate for coding time (coming from the developer), and multiply that at least by 2x. Of course, going back to my original point, that estimate may be outside the "human timescale", so you may not be able to tell the client that...
So, to answer your original question, developers can estimate primary development time up-front pretty well, but they suck at estimating integration, testing and shipping time.
A related thing I noticed is that if you ask a developer during the late(r) stages of the project when she'll be finished with the "next stable version", the answer usually hovers around "in 2 weeks", even if the actual answer is more like 3 months. The developer isn't lying, she genuinely believes that it can be done in 2 weeks.
That's why you want experienced engineers and not fresh college grads leading teams and giving estimates. They've gone through these experiences, have a lot of soft data metrics in their heads, and can come up with good multiplicative factors. I'd be curious if orgs like Google have a database of project data to aid planning.
> she genuinely believes that it can be done in 2 weeks
This is where having good historical records is useful. If I have done 2 similar tasks before and they took a month each, then a 2-week estimate has no validity.
As far as general code estimation goes, lines of code and complexity taken together form a pretty good metric. If I estimate I have to write 50 lines of high-performance multithreading code, I might say it will take 8 hours. If it's 50 lines of fizzbuzz, estimate drops to 1 hour.
But like you say, that's just coding (and unit tests). The full task probably requires (1) getting or understanding the requirements, (2) at least some cursory thought about design, and (2a) a hallway review with someone if you're not sure. Then, once the code is done, there's (3) unit testing (don't forget to estimate how long it will take to create, debug and run the tests) and code review (how many reviewers do you need, how long will it take them to get around to inspecting your code, etc.). Someone also has to estimate (4) how long the test team will take to write test cases and do the system test of the new feature, (5) how long tech docs will need to update the user manuals, and (6) how long the sysadmin or Release Master thinks deployment will take.
In short, at least in the environment I work in, that 50 lines of code that took eight hours to write, will easily require an entire week before any customer can see it. In my experience, it's often not so much that the developer didn't estimate accurately how long coding would take, it's that the developer never thought about all the ancillary tasks that have to be done (code review, unit test, documenting the design changes, etc), so they were not in the estimate.
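A back-of-envelope roll-up of the lifecycle steps listed above shows how 8 hours of coding turns into roughly a week; all of the ancillary figures are invented for illustration:

```python
# Roll up the ancillary lifecycle steps around 8 hours of coding.
coding_hours = 8
ancillary = {
    "requirements review": 2,
    "design + hallway review": 3,
    "unit tests": 4,
    "code review (incl. waiting)": 6,
    "system test cases": 8,
    "doc updates": 3,
    "deployment": 2,
}

total = coding_hours + sum(ancillary.values())
print(f"{coding_hours}h of coding -> {total}h end to end ({total / 8:.1f} working days)")
```

Even with generous rounding, the coding itself is a minority of the elapsed effort, which is the whole point.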
The author makes the point nicely at the front: "because the nature of the work is creating."
I'm not a developer, but I can confidently say that it's not just developers: NO ONE can estimate time when it comes to creative work. I use the term creative loosely; it could be anything from developing a business strategy, to crafting a story, to art directing an original visual, to writing code for a novel problem.
The problem is the same one that makes it difficult to say "when will we have a cure for Parkinson's," "when will we have fusion power," or "when will we discover a viable alternative energy." You often see estimates like 20 years, 40 years, 50 years, but properly interpreted these aren't actual estimates of the amount of time it will take; they're a proxy for a much more probabilistic, uncertainty-oriented notion.
Hi hammock, thanks for the comment (I'm the author). For some reason you've reminded me of the motivation factors Dan Pink talks about. I wonder if trying to commit to estimates makes our creativity degrade in the same way as paying cash bonuses?
It is always a mistake to ask: "how long will this take you?"
I've spent my career as a lawyer dealing with this problem.
The solution is to choose a price that represents the value proposition of the output (whether it's code or contracts). As part of that price, you will bring a comprehensive process to the table – one that is very clear about what the client is expected to provide.
I have gathered a huge amount of timekeeping data from lawyers, and crunched it up, down, left, right and center over the years. Here is what I learned:
1. If a task can be defined, it has a predictable price.
2. Tasks should not be measured by elapsed start-to-finish time, but by the amount of working time actually required to complete them.
3. A clearly defined task always takes just about the same amount of time. The start to finish time is variable because:
(a) the client is faster or slower at providing resources or information; and
(b) your other client work is causing bottlenecks in your own availability.
4. You can solve for (a) and (b) above. I'll leave it to your imagination how. That's my secret. ;)
There are two sides to this. One is detailed in most comments below, and I see it in a lot of the devs I work with, mainly the less experienced ones: the eternal optimism issue. That's been covered a lot already, so no point in belaboring it.
The OTHER side is that, at least given my experience in the financial software domain, managers don't WANT accurate estimates. They absolutely abhor them. And developers are punished for giving them. So they aren't given.
An accurate estimate, such that one can be made, is usually around a 70%-80% confidence interval. It includes many specifically unforeseen, but generally known issues such as problems with the environment, lacking specifications and time required to get them, including "making $#@! up" fudge factors, technical hurdles, cogitation and exploration time, etc.
But managers can't hear that. All they want to hear is something they can sell to their superiors, which is often the customer. An accurate estimate is almost always going to be larger than that, so they won't accept it. So, the development staff is forced to skimp on quality or features to make an artificial date. But that's ok! Why?
Well, it's partially an organization's willingness to accept "there's never enough time to do it right, but there's always enough time to do it twice" (or more), but that's not even half of the issue. The larger part is that NO ONE WANTS IT RIGHT THE FIRST TIME. I've seen this time after time.
It's a win/win to provide sub-par product. Why? It makes a manager look great to give low estimates (win), hold the developers to that, and deliver a product with less-than-promised quality, or scope, or both. That provides customers something to gripe about (which EVERY customer wants; it makes them appear "tough" or "thorough"; win), and it gives the managers something to "fix" and appear reactive to the customers' needs (win).
Who's holding the bag for all this win? The devs, and support. There are inevitable promises of being able to go back and fix things, but that never happens; once something is in production, no matter how crappy, it has the almighty momentum. If it's "working", almost no matter how fragile it is, there's no appetite to change something that works so that it can work better. That provides no revenue, and it imparts considerable risk (with limited or no QA, having moved on to the next thing) and/or cost (keeping QA around to regression-test the fixes).
Although that mindset infuriates me, I don't honestly know that it's not the best way. It seems to have evolved, and companies that do it seem to do ok, so maybe it's the Darwinian process at work.
I think that one of the major problems people have when giving estimates is that developers realize there is a lot of uncertainty and add in all of the "'making $#@! up' fudge factors, technical hurdles, cogitation and exploration time, etc.", and then managers mentally strip it all out.
The frustrating thing is that the uncertainty gets lost and people just end up with a number, which managers may treat as a fact, not an estimate. That's why I think it's a really good idea to give a range with both a low and a high estimate, so you can let people know "if everything goes right, it might take two hours" but warn them that "it could take four days." Once people have that kind of information, they can make much more resilient plans.
That's why we always tell our CEO it will take 2-4 hours, or 4 hours to 5 days, and let me tell you, he appreciates knowing when we don't know, because a layer of false certainty is removed.
Actually, no. It's a win in that case too, as I mentioned. The customer gets something to complain about, so they win by appearing thorough and tough on the vendor, and the manager gets something to fix, and wins by appearing reactive to customer needs/wishes. I've seen this scenario so many times over my last 25 years of doing this. To date, I've only had 1 manager admit that's what's actually taking place.
Well, yes, but too often "soft" requirements get lost, like performance or the ability to be supported by anyone who isn't the author. I'm not IN support, but I've been there, and man, I feel for those guys: getting crap dumped on them with no documentation, architectural designs pulled from someone's backside in the heat of the moment of getting it done quickly, inconsistent standards, wheel reinvention, etc.
On the bright side, budgets for support are much higher, because the application is in production and generates real money. So even though it's hard to support messy code, there is no risk that all that effort will be wasted on something useless.
Prototype is really a dirty word in programming, but there's no reason you can't produce a small, well written app which does some of what's needed and build from there.
As opposed to a crappily written ball of mud which does most of what's needed, but poorly.
> but there's no reason you can't produce a small, well written app which does some of what's needed and build from there.
Depends on where you work, I guess. Part of my personal hell right now is our architecture (which was written "elsewhere" and is given to us from on high) prevents such luxuries. It's an all-in affair, and deviation is met with retribution. Unit testing is nigh impossible with it, so I try to unit test the best I can of MY stuff that gloms onto its bulbous, massive exterior.
I agree with your point; don't get me wrong; but sometimes there are external forces at play that make the "right way to develop" hard to impossible.
The 'right' answer there is to system test first (eg. set up a dev system and run things like selenium against it) and then you can start to pull stuff out into libraries and unit test it.
The standard way around the 'luxury' attitude is to start with fixing downtime, and grow from there.
It's because developers are eternally optimistic. How else can you explain someone hitting a "compile" button a hundred times a day, hoping each time to get no errors?
This reminds me of assembly language in college... Each assignment we were told to end the program on a different type of ABEND ("for this assignment I want you to generate a SOC8, integer overflow" etc, etc...)
Nicely complicated, and useful if the boss or client wants verbiage. If asked directly, though, I usually say: "If I've done it before, I know how long it takes; if not, then not."
Pure coding is easy to estimate accurately. If you asked a developer to estimate how long it would take to hand write a class for every HTTP status code in v1.1 of the standard (and gave them a copy of the standard), they could probably give you an accurate figure.
Of course, most development projects are not like that. The pure coding part of it is interspersed with countless decisions about how the app should work, with each decision requiring input from lots of people, plus a bit of experimentation and rework. The process of getting to a decision is hard to estimate and the amount of development that each decision will create is impossible to know until the decision has been made.
Estimating a software project is like estimating the length of time it takes to get a law passed. It's not a computer science or even an engineering matter, its a political/ psychological one.
> If you asked a developer to estimate how long it would take to hand write a class for every HTTP status code in v1.1 of the standard (and gave them a copy of the standard), they could probably give you an accurate figure.
Yes, but that's explicitly demanding that they write the software incompetently. Writing software incompetently, by mechanically carrying out actions that ought to be automated, is easy to estimate. But if I thought I needed a Java class for each HTTP/1.1 status code, I'd write a Perl script to generate them, be done in a tiny fraction of the time, and have a vastly more maintainable codebase.
Your other point about multiple stakeholders is well taken, of course. But even when it comes to solving a well-defined problem, it's easy to misestimate by orders of magnitude.
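The generation approach described above can be sketched quickly. This is a minimal illustration (in Python rather than Perl, with the status table abbreviated and the Java template invented for the example):

```python
# Sketch of generating a class per HTTP status code from a table,
# rather than hand-writing each one. Status list abbreviated; the
# emitted Java template is illustrative, not from any real codebase.
STATUS_CODES = {
    200: "OK",
    301: "MovedPermanently",
    404: "NotFound",
    500: "InternalServerError",
}

TEMPLATE = """public final class Status{name} {{
    public static final int CODE = {code};
}}
"""

def generate_classes(codes):
    """Return a mapping of filename -> generated Java source."""
    return {
        f"Status{name}.java": TEMPLATE.format(name=name, code=code)
        for code, name in codes.items()
    }

sources = generate_classes(STATUS_CODES)
```

The point stands either way: the mechanical version is easy to estimate precisely because it requires no decisions.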
Almost all bad estimates are due to a lack of information. In software development, the information that tends to be missing is the particulars of the task being estimated (which code needs to be modified, in what way). This missing information is highly specific and doesn't generally carry over from one programming assignment to the next. Additional information about the process is usually not the problem. Things like CMM tend to focus on process information rather than the particulars of the coding problem at hand, which leads many developers to think: "You're doing it wrong."
Speaking from experience on a large multi-year project at a BigCo, I am quite familiar with the failures of estimation. Some information is absent because some things are simply unknown, due to inexperience or inability to predict future changes or mistakes (reasonable). However, other information is lacking because there is no motivation or demand to produce it. For example, waterfall processes generally put the software development lifecycle stages on a schedule for a system or subsystem: requirements, design, implementation, test. If the lifecycle stages are not divided into tasks upfront (which should then be further divided), then the complexity of each stage is basically not being factored into the estimates for each stage. This sets the stage for failure unless it is a project that has been done repeatedly over many years in the organization.
Furthermore, continuous process improvement and monitoring is not done enough even in organizations that declare achievement of higher levels of CMMI. Our team had subsystems that overran their initial estimates by over 100%, causing nearly a year in delay. However, there were basically no penalties or major process changes despite the fact that multiple subsystems overran estimates multiple times. Process professionals, engineers, and managers are simply not aggressive in tackling these problems.
In addition, estimates tend to come from individuals. I imagine that a more collaborative team-based estimation approach would be better, factoring different levels of experience and sharing the burden of making estimations realistic. Also, recorded estimates need to be coupled with recorded justification.
Agreed. We don't have a language to define the problem in the first place.
I am great at estimating; unfortunately I often go over, because what I heard being asked for is subtly different from what the client thought they were asking for.
I estimate by considering an optimistic, pessimistic and normal value for each task. I then have a spreadsheet that does a modicum of statistical analysis to give times against the probability that the task will be finished by then.
Some of the benefits I find are:
1. Getting into the pessimistic mindset helps produce better estimates
2. Tasks with a large spread are things you aren't really clear about, and are an obvious target to be attempted first
3. Giving a range of values to the person asking for the estimate seems to help them remember it is an estimate.
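The three-point scheme above can be sketched with the standard PERT approximation. The weighting and the "variances add" combination step are my assumptions; the commenter's spreadsheet may well do something different:

```python
# Three-point (optimistic / normal / pessimistic) estimation sketch
# using the PERT approximation; combining rule assumes independent tasks.
import math

def pert(optimistic, normal, pessimistic):
    """Return (expected time, standard deviation) for one task."""
    expected = (optimistic + 4 * normal + pessimistic) / 6
    stddev = (pessimistic - optimistic) / 6
    return expected, stddev

def project_estimate(tasks):
    """Sum expected times; variances add under independence."""
    expected = sum(pert(*t)[0] for t in tasks)
    stddev = math.sqrt(sum(pert(*t)[1] ** 2 for t in tasks))
    return expected, stddev

# e.g. two tasks, hours as (optimistic, normal, pessimistic)
e, s = project_estimate([(2, 4, 12), (1, 2, 3)])
```

A wide standard deviation on a single task is exactly the "large spread" signal mentioned in point 2: it flags the tasks you don't yet understand.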
Need to include "tasks which will take significant time but I can't tell you what they are because we won't know until we get there". Managers can't seem to deal with the notion that 10-30% of a project is completely unknown when writing initial schedules.
I am asked to give estimates all the time, sometimes based on something like "A guy down the hall says it crashed, how long will it take you to fix." or "Bob wants a portal site". So I've come up with a method that works for me. Add together hours based on what things I will need to fiddle with to get it to work.
•Involves strange business practice I will have to figure out or glean from area experts +5h
•I didn't write the code +5h
•Involves something like wcf or msmq +3h
•It is a "it is too slow" bug +3h
•Involves stored procedures +2h
•Involves 3rd party UI component +10h
and so on, and so on.
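The additive method above is simple enough to sketch directly. The factors and hours mirror the commenter's list; the base figure and the key names are illustrative:

```python
# Sketch of the additive "fiddle factor" method: each risk attribute
# that applies contributes a fixed number of hours to the estimate.
RISK_HOURS = {
    "strange_business_practice": 5,  # glean from area experts
    "unfamiliar_code": 5,            # I didn't write the code
    "wcf_or_msmq": 3,
    "too_slow_bug": 3,
    "stored_procedures": 2,
    "third_party_ui_component": 10,
}

def estimate(risks, base_hours=1):
    """Sum a base figure plus the hours for each risk that applies."""
    return base_hours + sum(RISK_HOURS[r] for r in risks)

# e.g. a bug in someone else's code that touches stored procedures
hours = estimate(["unfamiliar_code", "stored_procedures"])
```

The appeal of this scheme is that the per-factor hours are calibrated against your own past experience, not pulled from thin air each time.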
I find that only bad managers ask "how long will this take you?", and an idiot manager will take whatever time you give them and try to cut it short, or judge the time for the developers or engineers, like "this should be easy and will only take a few days".
A good manager who has "developed" before will ask "how much work is left?" and "what are some of the outstanding problems that need to be resolved?"
You're right that managers who understand your job can make life a lot easier. But this goes both ways. I find that understanding what constraints my management is under can help me to help them plan.
When you understand what their problem is (e.g. we promised X to the client and we have to make it work somehow), you might be able to rearrange things to make that possible. Of course, an ounce of prevention is worth a pound of cure: if you can help them avoid promising the impossible, you'll be a lot better off. Otherwise, you end up suffering due to arbitrary deadlines created for the sake of looking good. If you can find better ways for your boss to look good than impossible deadlines, you'll be way ahead. I grant, though, much of this is easier said than done.
Maybe it's because software development is a creative process; the same reason why artists won't tell you how long they will take to finish a painting, or authors repeatedly miss publisher deadlines for books.
In my opinion, software development is exactly a creative process. Many times, you have to come up with a creative solution that fits within certain constraints. Also, in the beginning usually only a vague image is known. The artist (developer) has to fill this in and make sure it is what the client wants.
Also, many creative works are collaborative efforts as well these days. I'm not sure how that changes anything. If anything, it makes estimates harder.
You could equally say "Just because math is involved doesn't make it engineering".
But I suppose it's something in between art and engineering.
With regard to "sell weird color splotches for millions" - Isn't that the same with code? You can sell a simple game or app to millions of people and become rich. The amount of money you earn doesn't have any connection to complexity.
You are right, there is a sub-genre of entertainment software that does have a similarity to art. But even there, you'll never buy a product because it crashes, while a work of art can be valuable because it is ugly.
On the other hand, in code styles, languages, idioms, a lot depends on taste. One person might find Lisp code ugly, another one might love it. Some person might find some library useful as a modular object oriented system, while someone else might find it a hellish heap of macaroni code.
Artists might like to experiment with different styles, so might coders.
Also, it's not like most artists try to sell people ugly things. That's also a very small niche. Most of them simply work for companies and design the things they are told to. Just like coders.
We could debate about this for ages, but it seems very clear that there is both art and engineering involved.
No one will buy a skyscraper design that has no doors or windows and crumbles under a breeze. Architecture and industrial design are two examples of things that need to cater to practical needs yet are usually regarded as forms of art.
Of course there are also people who think that art is only art if it doesn't do anything and isn't "needed". This also creates problems (for example, can a movie be art? If yes, which of the "jobs" needed to create the movie are art? Who are the artists?)
That's why we were using complexity estimates in points (say 1 to 5) instead of actual time estimates in my last startup. Once you have some worklog, it's easy to calculate how many points the team can get done in a week (or whatever your estimation cycle is). This little abstraction made estimates much more reliable.
I guess some "agile" methodologies propose a similar approach, but we didn't really follow any specific one; we just figured out what works best as we moved along and kept the process as lean as possible.
Just to clarify what k7d is saying when he quotes "agile" methodologies:
Any prescriptive agile methodologies (first, do X. then, do Y) are really antithetical to agile. True agile processes are exactly what his own startup did. They used what worked. They implemented a practice (complexity points) that was relevant to their principles (accurate estimation). Of course, you should also be tweaking your process as you go along.
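The points-and-velocity approach k7d describes can be sketched as follows. This is a minimal illustration under the stated scheme (points per task, velocity from the work log); the function names and the weekly cycle are assumptions:

```python
# Sketch of complexity-point estimation: log completed points per
# cycle, derive the team's velocity, and convert a backlog of point
# estimates into a forecast in cycles (e.g. weeks).
import math

def velocity(points_per_cycle):
    """Average points completed per cycle, taken from the work log."""
    return sum(points_per_cycle) / len(points_per_cycle)

def forecast_cycles(backlog_points, points_per_cycle):
    """Whole cycles needed to burn down the backlog at current velocity."""
    v = velocity(points_per_cycle)
    return math.ceil(sum(backlog_points) / v)

# e.g. last four weeks: 12, 9, 11, 8 points done;
# backlog of per-task point estimates (scale 1-5)
cycles = forecast_cycles([3, 5, 2, 5, 1, 4], [12, 9, 11, 8])
```

The abstraction works because developers only rank relative complexity, which they are reasonably good at; the conversion to calendar time comes from measured history rather than optimism.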
There's another way as well; specify a range of time within which you expect the task to be completed. This also communicates your uncertainty about the estimate at the same time.
Accurate estimation is certainly possible and is practiced by professionals, I do it myself. It's a skill though and it requires ground work in advance. Those interested may find discussion of some methods in McConnell's "Software Estimation: Demystifying the Black Art". Humphrey's PSP materials are an earlier but useful source of information.
The skill is irrelevant in most workplaces though. In most cases time to do an estimation is forbidden, and in the rare cases where an accurate estimation is produced, it is replaced with management's wishful-thinking estimate instead, which insecure developers are often strong-armed into "agreeing" to.
Replacing an accurate estimate with wishful thinking and whipping the slaves to go faster does not mean accurate estimates are impossible. It means that there is a problem with endemic management incompetence and unprofessionalism throughout this industry. If you have not established a track record of accurate estimates, you shouldn't be managing in any capacity at all. I don't expect this to happen though. I fully expect incompetent unprofessional management to continue to be the rule throughout most of the industry because there is little sincere interest in fixing things compared to maintaining empires ruled by fools. The preferred system is of blaming developers for bad estimates that were forced upon them by management, or that were produced by people who have no idea how to estimate and who are given no tools or training in how to do so.
What a relief it is not to be working for such incompetent management and be on my own. To those whose bosses are bullying them into giving bogus estimates and then blamed when things go wrong, you have my sympathies.
“Hofstadter’s Law: it always takes longer than you expect, even when you take into account Hofstadter’s Law.” — Douglas Hofstadter, cognitive scientist and Pulitzer Prize–winning author of Gödel, Escher, Bach: An Eternal Golden Braid
When I needed to estimate a new project at my previous job, I used to pretty much always say "3 months". I made sure clients knew I was talking about an MVP, even if they were thinking about making something like Microsoft Excel in-house. Then we would just build on top of that, one block at a time, never more than 3 months for each block.
Developer estimation improves when the developer is estimating the time for a task that is repeatable and they have experience or data on how much time the task took in the past. Substitute any title for "developer" in the previous statement and it remains true: civil engineer, drywall installer, plumber, etc.
The challenge is that relatively few developer tasks are highly repeatable, and very few developers are disciplined about collecting data on how long similar tasks required in the past. Also high performing developers get bored easily and frequently want to chase the new shiny tool/framework, where they have no past experience or data to rely on for estimating.
A first question in asking a developer about their estimate is "have you done this type of task before with the same technology you are planning to use". If no, disregard any estimates.
You can get good estimates for high-volume task teams, like maintenance or break/fix teams, where the technology remains relatively static from one work iteration to the next, and the work items can be bucketed into meaningful and consistent sizes (T-shirt sizing or what have you). Those are typically where you have your junior (or offshore) developers staffed as they pay their dues, and they can benefit greatly from having discipline around collecting data on their actual experience and using that data to reduce variance in estimates vs actuals in the future. This is also a great space for a new dev lead or manager to cut their teeth on managing the consistent collection and application of this data to reduce variance to plan.
For the really fun work though, where your high performing developers want to be operating so they can thrive and experience self-actualization (see Maslow), forget about the estimating angst and just timebox them or go agile -- build something, release it, refactor it, repeat. Don't waste your time creating the illusion of certainty about the future when you have nothing but optimistic gut feel as input to the estimating machine.
Good post Ash (good to see the discussion being consolidated)
IMHO - it's usually due to the lack of "up front planning" by Managers / Analysts.
If you compare building software to more general "building endeavors" (for example building the Olympic stadium) then, IMHO, developers are really the "construction workers" (the people actually doing the work). However no intelligent Olympic committee would countenance starting work without (for example) an architect making an incredibly detailed plan (and even a working replica model).
IMHO the problem we suffer from as developers is most often diving in too fast because:
a) we're the construction workers
b) when we do use architects, they are "construction workers" too (as opposed to being trained with different but complementary skills; it would be hard to imagine Sir Norman Foster laying bricks)
I think the main thrust of the article is factually correct (you can't estimate the creation of something new), but I don't think that's a very satisfying answer because you can't do anything about it.
In my opinion, the single biggest problem with developers doing estimates is that they think they're being asked for a commitment, or they are being asked for a commitment but the asker is calling it an estimate. This creates a large number of problems in expectations and communication, and it causes the whole project to break down.
There are some things that can be done on a project to help improve communication about estimates, commitments and targets:
- Ensure that estimates, commitments, and goals are understood to be three separate things.
- Ensure that everyone knows the benefits and uses of estimates, commitments and goals, as well as the non-uses. This is doubly true for estimates: many people treat them as commitments, but in reality they are supposed to be indicators that can be used to help control a project and make it successful.
- Encourage people to put effort into their estimates. Developers often don't like working on estimates because every minute they spend working on them (or on other stuff like timesheets and status reports) is a minute they could have spent thinking, designing and coding. Ensure that everyone knows that spending time on estimates and other activities is truly valuable to the project and the group, and helps ensure success.
- Do everything possible to remove stress and pressure around estimates. Reinforce the idea that estimates are used to help steer the project to success, not to hold people's feet to the fire (if a PM is trying to shrink estimates, or pointing to estimates after a project has started and saying "tasks x, y and z are late," they're doing it wrong).
- Make it known that estimates are supposed to have some uncertainty. Don't accept or expect single-point estimates, and make sure the people delivering the estimates aren't pressured to do so.
It's because programming is as much an art as it is a science.
You don't ask a painter how many hours it will take to complete a masterpiece. Nor do you ask a composer how long it will take to write the next great hit. Or an author how long it will take to finish a great book.
Sure, we can ask an artist to be done by "next Tuesday", or in 20 hours. And the artist will say, no problem, but there is no guarantee of quality.
Programming is no different. Yes, there is a method, a science, to getting things done, but when it comes to creating a finished work of art, it is not a time oriented task.
You don't know the problems/limitations you will run into until you begin working on your project. We have immersed ourselves in a culture where it's expected to estimate the unknown.
Want to hire the most experienced programmer out of a group of candidates? Ask them to estimate a programming task, then pick the one that gives the longest estimate.
I'd say hire the one that gives you the best justification for their estimate. It's important for people to understand what is and is not involved in an estimate as well as why they think something is hard.
Once you communicate that, you can uncover a lot of potential problems and misunderstandings. For instance just the other day my co-worker was asked to estimate a task that seemed to our boss to be pretty small. He estimated it as being 2-8 days, it wasn't until they actually started talking about _why_, that they both agreed on the requirements. The resulting task will probably take 1-3 hours.
So sure, as a developer learns more they may estimate tasks as being longer, but communicating your assumptions is more important than throwing out huge estimates to cover your ass ;)
Wait a second. In the right environment, good developers can estimate time fairly well on most things, even for tasks they haven't done before. I'm as prey to the "eternal optimism" factor as anybody else, but it's easy enough to work around: start by doubling your own estimate, and over time learn what multiple is right for you.
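The calibration idea above ("learn what multiple is right for you") can be sketched in a few lines. The simple average-ratio rule is my assumption; any curve-fitting against your own history would do:

```python
# Sketch of personal estimate calibration: track raw estimates against
# actuals and learn the multiplier that fits your own track record.
def personal_multiplier(history):
    """history: list of (estimated, actual) pairs.
    Returns the average actual/estimated ratio, to apply to
    future raw estimates."""
    ratios = [actual / estimated for estimated, actual in history]
    return sum(ratios) / len(ratios)

def calibrated(raw_estimate, history):
    """Scale a fresh gut-feel estimate by the learned multiplier."""
    return raw_estimate * personal_multiplier(history)

# e.g. tasks consistently took about twice the raw guess
history = [(4, 8), (2, 4), (5, 10)]
m = personal_multiplier(history)
```

The point is that "eternal optimism" is a stable bias, and stable biases can be measured and corrected for, which is exactly what a fixed doubling rule crudely approximates.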
Because they are doing something that has never been done before. The unknown is exactly that - unknown. You can make guesses, but they will never be anything but guesses until the problem has been solved a few times.
However, once a problem is solved, it does not need to be re-solved, and so the attention of the company turns to a new unknown problem.
> It's not just developers that are bad with estimates either. Everyone at some point is just winging it because it's something they've never done before and won't be able to successfully make a judgement until they have.
When my manager asks, "How long will this take," about half the time my honest answer is, "I don't know."
But from your manager's perspective, you probably do know. You know if it will take more than or less than a year. You know if it will take more than or less than a month. You have some idea of what needs to be done. You have some idea of what parts of it you don't know, etc.
One of the most important things I've learned about estimating is first asking how accurate an estimate is needed. Often the person just wants to know "can I have it tomorrow, or will it take 3 months?"
Depending on the task, it might not actually even be possible. Or, it's doable but not without rewriting major libraries or buying hundreds of servers.
I can't stand the rubbish advice to 'just double the estimate'. It just means you'll get an estimate 2^n times the size of the original guess from the guy n steps away from you.
Re "environmental contributions", that was part of the discussion, but thinking about it - estimating total effort and elapsed calendar time are very different!
Time is not a fixed unit of measure. I may be going closer to the speed of light than a project manager, so one week to me, might be like 5 years to them.
Once a developer has done a project, he can give a good estimate of how long it would take for him to do that project again!
Really, it's easy for developers to estimate time much the same way plumbers, carpenters, auto body shop workers, general contractors, dentists, etc. do: When a developer has done a very similar project more than 10 times recently, they can give a good estimate for how long it will take to do the project once more.
This situation is quite general in engineering and other work of wide variety.
Similarly, in general, when some project is being done for the first time, with no prior experience with comparable projects, then time estimates are tough to give.
In particular, for 'developers', now they work heavily with 'parts, pieces, and tools' from others, and working with these for the first time encounters standard problems: (A) How to use the parts, etc., needs good documentation rarely available. (B) Newer parts, etc. commonly have strange behavior or actual bugs that have to be encountered, diagnosed, and worked around. So a developer does not know how much time will be spent mud wrestling with (A) and (B).
Except that no two projects are ever really the same. Even with a very similar project you'll have different people, different systems to interface with and different scope.
Plus, any developer worth their salt will have built libraries as they went (or documented the existing ones), so the next similar project will take much, much less time.
Classic line... the better question is: "Why can't developers see into the future?"
Programming is one of the worst jobs for dealing with people. Managers assume you have no life and work you 12 hours a day for a software product, telling you that it might save your job. It's all a ruse; once they get what they want, why pay for an expensive developer to support your new site or application?
If anyone asks me if they should be a professional developer I tell them 'Run away!'. I don't wish the experience on anyone.