Velocity and Why You Shouldn't Watch It

(Updated June 9, 2011 to clarify relation between points and business value - thanks Mishkin!)

Back in the early-to-mid '90s, when I still had some dark hair, I started saving for retirement using the typical approach of mutual funds and some stocks. My advisor was my Dad's business partner in an insurance and estate planning company they founded in 1981, so I figured he wasn't going to steer me wrong. One thing he said really struck me:
Whatever you do, DON'T look at the stock and mutual fund markets every day - you'll drive yourself crazy and make bad decisions!  You have to take a long-term view of your investments, and then you'll be fine.
Now, he wasn't saying don't look at your investments, but rather don't make rash decisions based on small sample sizes.

This same attitude should apply to how we view Velocity in Agile processes.  Velocity is very interesting - it shows value delivered in a certain time period.  Maximizing velocity is also interesting, since more points completed indicates more value delivered to stakeholders (although it's not a direct correlation), and that's a good thing, right?  Sure, but you have to take a medium to long-term view of a team's velocity rather than making decisions based on the velocity of an iteration or two.

Consider this velocity bar chart showing 17 iterations:

This chart shows constant fluctuations in velocity per sprint, which may lead people inside or outside the team to spend time & effort determining why they only achieved 8 points in iterations 1 & 6 and only 7 points in iteration 7!  Similarly, seeing 16 points in iteration 5 could lead to optimistic planning with the expectation of completing more in a release than a team is capable of delivering.

Contrast that view with this chart showing the same data in a line chart format:

You can still see small "ripples" in the line, but this long-term view of how a team is progressing towards a release doesn't show the large fluctuations of the bar chart.  In other words, the bar chart shows a short-term view while the line chart shows the long-term one.  If you were a decision maker for this project, which view would allow you to make better informed decisions about the overall direction?
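If you'd like to reproduce the comparison with your own numbers, here is a minimal sketch that renders one velocity series both ways: per-iteration bars and a cumulative burn-up line. Only the values called out above (8 points in iterations 1 and 6, 7 in iteration 7, 16 in iteration 5) come from the post; the rest are placeholders, not the actual data behind the charts.

```python
import matplotlib.pyplot as plt

# Per-iteration velocities; only iterations 1, 5, 6 and 7 match the post's
# description - the other values are placeholders.
velocities = [8, 12, 11, 13, 16, 8, 7, 12, 11, 13, 12, 10, 14, 11, 12, 13, 11]
iterations = list(range(1, len(velocities) + 1))

# Cumulative points completed - the "burn-up" view of the same data.
cumulative = []
total = 0
for v in velocities:
    total += v
    cumulative.append(total)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Short-term view: per-iteration bars exaggerate iteration-to-iteration noise.
ax1.bar(iterations, velocities)
ax1.set_title("Velocity per iteration (bar)")
ax1.set_xlabel("Iteration")
ax1.set_ylabel("Points")

# Long-term view: the cumulative line turns the same data into a trend.
ax2.plot(iterations, cumulative, marker="o")
ax2.set_title("Cumulative points (burn-up line)")
ax2.set_xlabel("Iteration")
ax2.set_ylabel("Points completed to date")

plt.tight_layout()
plt.show()
```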

The work that a team completes represents the investments of a project's stakeholders. Watching those investments within a very small window of time will "drive them crazy" and can cause them to make bad decisions. Velocity provides an indication of the performance of those investments, but good investment decisions only come from looking at the medium to long-term view of their performance.

Comments

Jon Archer said…
I like it Dave. A topic that's been in my head this last week or so too.
Jason Yip said…
This is pretty much about understanding the concept of normal variation.
Alexei Zheglov said…
The line on the burn-up chart is very similar to the bottom, "done" line on cumulative flow charts in Kanban. (Although in Kanban, those charts are done daily rather than once in an iteration, so the line is not as smooth.) The slope of the "done" line equals throughput. Consistent/increasing/decreasing slope means the same about the throughput.

Thank you for demonstrating how two seemingly different methods are, in a way, very similar.
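A minimal sketch of the slope calculation Alexei describes, assuming the cumulative "done" count is sampled once per iteration (the numbers here are illustrative):

```python
import numpy as np

# Cumulative count of completed points, sampled once per iteration (illustrative).
done = np.array([8, 20, 31, 44, 60, 68, 75, 87, 98, 111])
iterations = np.arange(1, len(done) + 1)

# Slope of the "done" line = average throughput (points per iteration).
throughput, intercept = np.polyfit(iterations, done, 1)
print(f"Average throughput: {throughput:.1f} points per iteration")
```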
Dave Rooney said…
@Alexei,

Yes - it's just the simplification of work being either done or not done with no WIP.
Dave Rooney said…
@Jason,

Yep, that pretty much nails it. While reducing variation as much as possible is good, you have to understand at what level to reduce it.
Anonymous said…
I agree, although to some extent this suggests that the period over which velocity is being measured is too small, or the pieces of work are too big (maybe).

In one of my recent posts, I attempted to make use of the variation in velocity to calculate risk envelopes. You can see that here: http://www.casualmiracles.com/blog/2011/05/16/burndown-prediction-confidence-and-risk/
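As a rough sketch of the general idea of a risk envelope built from velocity variation (a generic mean-and-standard-deviation approach, not necessarily the method in the linked post; the numbers are hypothetical):

```python
import statistics

# Hypothetical past velocities and remaining scope.
velocities = [8, 12, 11, 13, 16, 8, 7, 12]
remaining_points = 120

mean_v = statistics.mean(velocities)
stdev_v = statistics.stdev(velocities)

# Simple envelope: forecast against the mean +/- one standard deviation.
optimistic = remaining_points / (mean_v + stdev_v)
expected = remaining_points / mean_v
pessimistic = remaining_points / (mean_v - stdev_v)

print(f"Iterations to finish: {optimistic:.1f} (best) / "
      f"{expected:.1f} (expected) / {pessimistic:.1f} (worst)")
```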
Dave Rooney said…
@Lance,

Thanks for the comment. I dunno... my experience has been that most teams' velocity will stabilize during iterations 2 to 4 or possibly 2 to 5. If you take the average of those iterations, you will have a "pretty good" prediction of long-term velocity, team changes notwithstanding.

Of course, I'm also the guy who says that you should use Yesterday's Weather to plan an iteration (http://practicalagility.blogspot.com/2011/05/simplicity-planning-and-weather.html). :)
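A minimal sketch of both planning heuristics, assuming hypothetical per-iteration numbers:

```python
# Hypothetical completed points per iteration.
velocities = [8, 12, 11, 13, 16, 8, 7, 12]

# Heuristic 1: average iterations 2 to 5 as a long-term velocity estimate.
long_term_estimate = sum(velocities[1:5]) / 4
print(f"Estimated long-term velocity: {long_term_estimate:.1f} points/iteration")

# Heuristic 2: "Yesterday's Weather" - plan the next iteration at the
# previous iteration's actual completed points.
next_iteration_plan = velocities[-1]
print(f"Points to plan for the next iteration: {next_iteration_plan}")
```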
Vin D'Amico said…
Tracking the performance of a software development team is no different, statistically, than tracking the performance of an investment or a sports team. Day to day or week to week variances are not important. It is the longer term trend that matters.

Sure, consistency is a desirable trait but reality gets in the way. Some variation is expected. Wild fluctuations suggest a problem.

Nice post, Dave!
Mishkin Berteig said…
Hi Dave... I have a question. You wrote: "Velocity is very interesting - it shows valued delivered in a certain time period.". Can you explain to me how velocity shows value delivered?
Dave Rooney said…
@Mishkin,

First, thanks for showing a typo that I had missed! :)

Second, you can't claim points towards velocity unless a story is completed and accepted. That's business value delivered, at least from the perspective of the person who gives the final acceptance.
Mishkin Berteig said…
@ Dave - no prob on the typo - even I didn't notice it :-)

Re: velocity and value. If you complete a story and you claim its points, what is the relationship of those points to the value of the story?
Dave Rooney said…
@Mishkin,

Nothing... except that points completed "indicates" that value has been delivered.

If I hinted that there was a direct correlation, I apologize. It wasn't intentional.
Mishkin Berteig said…
@ Dave - you did imply there was a direct correlation: "Maximizing velocity is also interesting, since more value delivered to stakeholders is a good thing, right? "

This is a common language error in the agile community. Product owners should not be looking at velocity (alone) to determine value delivered. In fact, it is possible that a very high relative velocity sprint delivers zero value. This could happen if the stories delivered all end up being removed later on before the software is released, which could happen simply because the product owner made a mistake.

Velocity is useful for two things: prioritization and long-term planning. It is not useful for measuring value.

The charts you show with the average and the slope are good for understanding when things will be delivered. This is useful to stakeholders for setting expectations about time, but not about value.

For value, the product owner needs to put estimates of "expected return" on each of the stories. If you burn this up it should look like a graph of a log function (high slope at the start, gradually levelling out). This would reflect an effective prioritization of work by ROI (expected return / effort). Points represent effort or cost.
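A small sketch of that prioritization, assuming a hypothetical backlog with "expected return" estimates attached to each story; sorting by ROI and accumulating expected return in that order is what produces the front-loaded, log-like value curve described above:

```python
# Hypothetical backlog: (story, expected return in $, effort in points).
backlog = [
    ("Checkout flow", 50000, 8),
    ("Reporting dashboard", 20000, 13),
    ("Login with SSO", 30000, 5),
    ("Admin audit log", 5000, 8),
]

# Prioritize by ROI = expected return / effort (points represent effort or cost).
prioritized = sorted(backlog, key=lambda s: s[1] / s[2], reverse=True)

cumulative_value = 0
for name, value, points in prioritized:
    cumulative_value += value
    print(f"{name}: ROI {value / points:.0f} $/point, "
          f"cumulative expected return ${cumulative_value}")
```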
Dave Rooney said…
@Mishkin,

Thanks - I've updated that section of the post. Is it more clear now?

I absolutely agree with the rest of your comment. I coach teams to, as much as possible, front-load the release with the "most important" stories. "Most important" can mean high business value, high business risk or high technical risk. Some groups assign a relative number like points to business value, but I believe that measurement leaves out the two risk categories.
A comment says "Tracking the performance of a software development team is no different, statistically, than tracking the performance of an investment or a sports team."

Well, I may disagree. As a former statistical process control engineer and also a trader, I do think it's much easier to track team velocity :)