Other Considerations

While using Agile-specific measures, leaders of Agile teams should keep in mind a few general keys to successful software measurement.

Set expectations about measurement

Be transparent about why you’re measuring and how you intend to use the measures. Software teams are concerned that measures might be used unfairly or incorrectly, and many organizations’ track records make that a valid concern. Be clear that the measures are intended to support each team’s self-improvement—this will help adoption of the measures.

What gets measured, gets done

If you measure only one thing, people naturally optimize for that one thing, and you can experience unintended consequences. If you measure only velocity, teams can cut retrospectives, skip daily scrums, relax their Definitions of Done, and increase technical debt in the attempt to improve velocity.

Be sure to include a balanced set of measures, including quality and customer satisfaction, so that teams don't optimize for velocity at the expense of other objectives that matter as much or more.
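
As an illustration of what a balanced set might look like in practice (the measure names, fields, and thresholds below are assumptions for the sketch, not recommendations), a simple review script can flag sprints where velocity improved while a balancing measure slipped:

```python
from dataclasses import dataclass

@dataclass
class SprintMeasures:
    """One sprint's balanced measures. All fields are illustrative;
    substitute whatever dimensions matter in your organization."""
    velocity: float            # story points completed
    escaped_defects: int       # defects found after release
    csat: float                # customer satisfaction, 1-5 survey scale
    done_compliance: float     # fraction of stories meeting the Definition of Done

def review(current: SprintMeasures, baseline: SprintMeasures) -> list[str]:
    """Flag sprints where velocity rose while a balancing measure slipped."""
    warnings = []
    if current.velocity > baseline.velocity:
        if current.escaped_defects > baseline.escaped_defects:
            warnings.append("Velocity up, but escaped defects up too.")
        if current.csat < baseline.csat:
            warnings.append("Velocity up, but customer satisfaction down.")
        if current.done_compliance < baseline.done_compliance:
            warnings.append("Velocity up, but Definition of Done compliance down.")
    return warnings

# Velocity rose from 35 to 42, but every balancing measure got worse.
print(review(SprintMeasures(42, 7, 3.9, 0.85), SprintMeasures(35, 3, 4.4, 0.97)))
```

The point is not these particular measures or thresholds; it's that velocity is never evaluated in isolation.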

Similarly, it’s important to measure what’s most important, not just what’s easiest to measure. If you could have a team deliver half as many story points but twice as much business value, that would be an easy choice, wouldn’t it? So be sure that measuring story points doesn’t inadvertently undermine your team’s focus on delivering business value.
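
To make that arithmetic concrete (the figures are invented for illustration), compare value delivered per point rather than raw points:

```python
# Hypothetical teams: A delivers more story points, B more business value.
team_a_points, team_a_value = 60, 100_000
team_b_points, team_b_value = 30, 200_000

print(f"Team A: {team_a_value / team_a_points:,.0f} value per point")  # 1,667
print(f"Team B: {team_b_value / team_b_points:,.0f} value per point")  # 6,667
```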

Use data from tools with caution

Organizations invest in tools, and technical staff enter defect data, time-accounting data, and story-point data. The organization naturally believes the data collected by the tools is valid. Often it is not.

We worked with one company that was certain it had accurate time-accounting data because it had been requiring staff to enter their time for years. When we reviewed the data, we found numerous anomalies. Two projects that should have had similar amounts of effort varied by a factor of 100 in the hours entered for them. We found that staff didn’t understand why the data was being collected and viewed it as bureaucratic. One staff member had written a script to enter time-accounting data, and that script was being widely used—unaltered—so everyone was entering the same data! Other staff members were entering no time-accounting data at all. The data was meaningless.
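
Simple consistency checks would have exposed each of these anomalies before anyone trusted the data. Here is a minimal sketch, assuming the tool's time entries can be exported as (person, project, hours) records; the field layout and numbers are hypothetical:

```python
from collections import Counter

# Hypothetical weekly export: (person, project, hours). The records
# and values are invented for illustration.
entries = [
    ("ana", "proj-x", 40.0), ("ben", "proj-x", 40.0),
    ("cal", "proj-x", 40.0), ("dee", "proj-y", 0.4),
    ("eve", "proj-y", 0.0),
]

# Identical values across many people can indicate scripted entry.
value_counts = Counter(hours for _, _, hours in entries)
for hours, n in value_counts.items():
    if n >= 3 and hours > 0:
        print(f"{n} people logged exactly {hours} hours -- scripted entry?")

# Zero entries can indicate staff not recording time at all.
for person, project, hours in entries:
    if hours == 0:
        print(f"{person} logged no time on {project}")

# Projects expected to take similar effort shouldn't differ by 100x.
totals = Counter()
for _, project, hours in entries:
    totals[project] += hours
if totals and min(totals.values()) > 0:
    ratio = max(totals.values()) / min(totals.values())
    if ratio > 10:
        print(f"Largest/smallest project effort ratio: {ratio:.0f}x -- investigate")
```

Checks this simple won't prove the data is valid, but they will surface the kinds of anomalies that make it meaningless.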
