Agile Metrics — The Good, the Bad, and the Ugly

TL;DR: Agile Metrics

Suitable agile metrics reflect either a team’s progress in becoming agile or your organization’s progress in becoming a learning organization.

At the team level, qualitative agile metrics often work better than quantitative metrics. At the organizational level, this is reversed: quantitative agile metrics provide better insights than qualitative ones.


Update 2020-08-24: I added more graphics and examples and fixed content debt.



Good Agile Metrics

Generally speaking, metrics help us understand the current situation better and gain insight into change over time. Without metrics, assessing any effort or development is left to gut feeling and biased interpretation.

A metric should, therefore, be a leading indicator of a pattern change, providing an opportunity to analyze the cause in time. The following three general rules for agile metrics have proven useful:

  1. The first rule of meaningful metrics is to track only those that apply to the team. Ignore those that measure the individual.
  2. The second rule of meaningful metrics is not to measure parameters just because they are easy to track. This practice is often a consequence of using agile project management tools that offer out-of-the-box reports.
  3. The third rule of meaningful metrics is to record context as well. Data without context, for example, the number of available team members or the severity of incidents during a Sprint, may turn out to be nothing more than noise.

For example, if the (average) sentiment on the technical debt metric (see below) is slowly but steadily decreasing, it may indicate that:

  • A team may have started sacrificing code quality to meet (arbitrary) deadlines, or
  • A team may have deliberately built some temporary solutions to speed up experimentation.

While the latter probably is a good thing, the first interpretation is worrying. (You would need to analyze this with the team during a Sprint Retrospective.)


Good Qualitative Team Metrics: Self-Assessment Tests

If you want to track a team’s progress in adopting agile techniques and processes, self-assessment tests are well-suited for that purpose. For example, I like to use the Scrum Checklist by Henrik Kniberg.

All you have to do is run the questionnaire every four to six weeks during a retrospective, record the results, and aggregate them:

Age of Product: The good, the bad, and the ugly –  Scrum self-assessment

In this example, we used a kind of estimation poker to answer each question with one of three values: green, orange, or red. The colors were coded as follows:

  • Green: It works for the team.
  • Orange: It works for the team, but there is room for improvement.
  • Red: It either does not apply, for example, the team is not using burndown charts, or the practice is still failing.

If the resulting Scrum practices map is getting greener over time, the Scrum Team is on the right track. Otherwise, you have to dig deeper to understand the reasons why there is no continuous improvement and adapt accordingly. This form of data is valuable input for Sprint Retrospectives and discussions with the management, for example, to demonstrate the need for further training.
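To see whether the practices map is "getting greener," it helps to turn the colors into numbers and compare sessions. The following sketch uses made-up session data and a hypothetical green/orange/red scoring; the color weights are an assumption, not part of Kniberg's checklist:

```python
# Hypothetical aggregation of self-assessment answers recorded per session.
# The color-to-score mapping (green=1.0, orange=0.5, red=0.0) is an assumption.
SCORES = {"green": 1.0, "orange": 0.5, "red": 0.0}

def session_score(answers):
    """Average score of one session: 0.0 means all red, 1.0 means all green."""
    return sum(SCORES[a] for a in answers) / len(answers)

# Made-up answers from three retrospective sessions, five questions each.
sessions = {
    "2020-05": ["red", "orange", "orange", "green", "red"],
    "2020-06": ["orange", "orange", "green", "green", "red"],
    "2020-07": ["orange", "green", "green", "green", "orange"],
}

trend = {month: session_score(answers) for month, answers in sessions.items()}
```

A rising score over consecutive sessions corresponds to the map getting greener; a falling one is the signal to dig deeper in the next retrospective.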

In addition to this exercise, I also like to run an anonymous poll at the end of every Sprint. The poll comprises four questions, each answered on a scale from 1 to 10:

  1. What value did the team deliver in the last Sprint? (1: we didn’t deliver any value; 10: we delivered the maximum value possible.)
  2. How has the level of technical debt developed during the last Sprint? (1: rewrite the application from scratch; 10: there is no technical debt.)
  3. Would you recommend a job opportunity in this organization to a friend with an agile mindset? (1: No, I value the friendship more than this organization; 10: Without hesitation.)
  4. Are you happy working with your teammates? (1: No, I am already looking for a new job; 10: Yes, I cannot wait to get back to the office on Monday mornings.)

The poll takes less than 60 seconds of each team member’s time, and the results are, of course, available to everyone. Tracking the development of these four qualitative metrics provides insight into trends that might otherwise go unnoticed, and the data is good input for a Sprint Retrospective.
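Aggregating the poll is a one-liner per Sprint: average each question's answers and compare the averages across Sprints. A minimal sketch with invented answers from a five-person team (the question keys are my own shorthand for the four questions above):

```python
from statistics import mean

# Hypothetical anonymous answers from one Sprint: one list of 1-10 values
# per question, one value per team member.
poll = {
    "value_delivered": [7, 8, 6, 9, 7],
    "technical_debt": [5, 4, 6, 5, 4],
    "recommend_employer": [8, 9, 7, 8, 9],
    "team_happiness": [9, 8, 9, 7, 8],
}

# Per-question average for this Sprint; store one such dict per Sprint
# and the trend across Sprints becomes visible.
averages = {question: round(mean(answers), 1) for question, answers in poll.items()}
```

Plotting each question's average per Sprint as a simple line chart is usually enough to spot a slowly eroding technical-debt sentiment before it becomes a crisis.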

Good Qualitative Team Metrics: The State of Scrum Values

Another good use of a regular anonymous poll is gauging the state of the Scrum values within a Scrum Team. You create a poll of five questions, covering commitment, courage, focus, openness, and respect, using a Likert scale. (“1” might represent “the Scrum value is not practiced at all,” and “5” might represent “the Scrum Team is perfect at [Scrum value].”)

If you run the poll several times and visualize the data with a spider diagram, you can easily track the Scrum Team’s progress regarding Scrum values:

Team Metrics: The State of Scrum Values — Age-of-Product.com


Good Quantitative Agile Metrics: Lead Time and Cycle Time

Ultimately, the purpose of any agile transition is to become a learning organization, thus gaining a competitive advantage over your rivals. The following metrics apply to the (software) product delivery process but can be adapted to various other processes accordingly.

In the long run, this will not only require restructuring the organization from functional silos into cross-functional, self-organizing teams, where applicable. It will also require analyzing the system itself, for example, figuring out where unnecessary queues impede value creation.

To identify the existing queues in the product delivery process, start by recording five dates:

  1. The date when a previously validated idea, for example, a user story for a new feature, becomes a product backlog item.
  2. The date when this product backlog item becomes a sprint backlog item.
  3. The date when development starts on this sprint backlog item.
  4. The date when the sprint backlog item meets the team’s ‘Definition of Done’.
  5. The date when the sprint backlog item is released to customers.

Agile Metrics: Lead time and cycle time – Age of Product

The lead time is the time elapsed between the first and the fifth date; the cycle time is the time elapsed between the third and the fourth date.
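In spreadsheet or script form, the calculation is just two date subtractions. A minimal sketch with made-up dates for a single ticket (the field names are my own labels for the five dates above):

```python
from datetime import date

# Hypothetical ticket with the five recorded dates.
ticket = {
    "backlog": date(2020, 3, 2),         # 1st: idea becomes a product backlog item
    "sprint_backlog": date(2020, 4, 6),  # 2nd: item enters a sprint backlog
    "dev_start": date(2020, 4, 8),       # 3rd: development starts
    "done": date(2020, 4, 15),           # 4th: meets the Definition of Done
    "released": date(2020, 4, 17),       # 5th: released to customers
}

lead_time = (ticket["released"] - ticket["backlog"]).days   # 1st to 5th date
cycle_time = (ticket["done"] - ticket["dev_start"]).days    # 3rd to 4th date
```

In this example, the ticket sat in the product backlog for over a month before development even started, which is exactly the kind of queue this exercise is meant to surface.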

The objective is to reduce both lead time and cycle time to improve the organization’s capability to deliver value to customers. This is accomplished by eliminating dependencies and hand-overs between teams within the product delivery process.

Helpful practices in this respect are:

  • Creating cross-functional teams
  • Having feature teams instead of component teams
  • Furthering a holistic, whole-product perspective and systems thinking among all team members

Measuring lead time and cycle time does not require a fancy agile tool or business intelligence software. A simple spreadsheet will do if all teams stick to a simple rule: note the date whenever a ticket moves. The method even works with index cards.

The following graphic compares median values of lead time and cycle time of three Scrum teams:

Agile metrics: How to master lead time cycle time – by Age of Product

The values were derived from analyzing tickets—both user stories as well as bug tickets—from a period of three months. The Sprint length was two weeks.
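Medians are preferable to averages here because a few outlier tickets (a forgotten bug, a blocked story) would otherwise dominate the comparison. A sketch of the per-team median with invented lead times in days; the team names and values are made up, not the data behind the graphic:

```python
from statistics import median

# Hypothetical lead times (in days) per ticket, collected over three months.
lead_times = {
    "Team A": [12, 18, 25, 14, 30, 21],
    "Team B": [35, 40, 28, 52, 33],
}

# Median per team; robust against a single outlier ticket.
median_lead = {team: median(days) for team, days in lead_times.items()}
```

The same computation applied to the cycle-time column of the spreadsheet yields the second bar of the comparison.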

Other Good Agile Metrics

Esther Derby suggests also measuring the ratio of fixing work to feature work and the number of defects escaping to production.

Another good source for actionable agile metrics is the State of DevOps Report.

Bad Agile Metrics

A controversial yet traditional agile metric is team velocity. Velocity is a notoriously volatile metric, and hence really only usable by the experienced team itself.

Some of the many factors that make even intra-team sprint comparisons so difficult are:

  • The team onboards new members,
  • Veteran team members leave,
  • Seniority levels of team members change,
  • The team is working in uncharted territory,
  • The team is working on legacy code,
  • The team is running into unexpected technical debt,
  • Holiday & sick leave reduce capacity during the sprint,
  • The canteen serves bad food,
  • The team had to deal with serious bugs.

You would need to normalize a team’s velocity each Sprint, for example, by the team’s available capacity, to derive a value that is at least somewhat comparable. (This is usually not done.)
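To make the normalization concrete: dividing the raw velocity by the Sprint's available person-days removes at least the capacity factor (holidays, sick leave) from the comparison. A minimal sketch with invented numbers; normalizing by person-days is one possible approach, not a standard formula:

```python
# Hypothetical Sprint records: raw velocity plus the person-days
# the team actually had available (after holidays and sick leave).
sprints = [
    {"velocity": 40, "person_days": 50},  # full team available
    {"velocity": 28, "person_days": 35},  # reduced capacity during holidays
]

# Story points delivered per available person-day.
normalized = [s["velocity"] / s["person_days"] for s in sprints]
```

Here the raw velocities (40 vs. 28) look like a slump, while the normalized values are identical; capacity, not performance, changed. Note that this still corrects for only one of the many factors listed above.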

Additionally, velocity is a metric that can be easily manipulated. I usually include an exercise on how to cook the “agile books” when coaching new teams. And I have never worked with a team that did not manage to come up with suitable ideas on how to make sure that it would meet any reporting requirements based on its velocity. You should not be surprised by this—it is referred to as the Hawthorne effect:

The Hawthorne effect (also referred to as the observer effect) is a type of reactivity in which individuals modify or improve an aspect of their behavior in response to their awareness of being observed.

To make things worse, you cannot compare velocities between different teams since every team estimates differently. That is acceptable, of course, as estimates are usually not pursued merely for reporting purposes. Estimates are no more than a side-effect of the attempt to create a shared understanding among team members of the why, how, and what of a work item.

Read more on velocity in Scrum: The Obsession with Commitment Matching Velocity and in Faking Agile Metrics or Cooking the Agile Books.

Ugly Agile Metrics

The ugliest agile metric I have encountered so far is ‘story points per developer per time interval.’ This metric is the agile equivalent of ‘lines of code’ or ‘hours spent’ from traditional project reporting. It is completely useless, as it doesn’t provide any context for interpretation or comparison.

Equally useless “agile metrics” are, for example, the number of certified team members or the number of team members who have completed workshops on agile practices.


Conclusion

If you can only record a few data points, go with start and end dates to measure lead time and cycle time. If you have just started your agile journey, you may also consider tracking an individual team’s adoption rate by measuring qualitative signals, for example, based on self-assessment tests like Henrik Kniberg’s Scrum Checklist.

What do you measure to track your progress as a team? Please share your agile metrics with us in the comments, or join our Slack team “Hands-on Agile”—we have a channel for agile metrics.


