A health cooperative had been experiencing problems with the software package it used to manage its operations, causing operational disruption and damage to its image. Its management decided to set up a subsidiary company with the mission of developing an in-house software solution better suited to its needs.
The new solution was developed using Scrum. However, the new company's accountability needs and the transparency requirements of the process were incompatible with technical metrics based on team backlog items.
The team backlog consists of three types of items: small user stories; architecture activities; and epics that cannot be measured with the same techniques used for stories.
How could the cooperative be assured that its resources were being used efficiently, without compromising the gains of agile development?
The productivity target was set at a level 19% higher than the baseline.
The target was established by measuring delivered releases in function points and collecting the effort invested, then reconciling that data with the scope of activities against benchmarking references and improvement objectives.
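The arithmetic behind such a target can be sketched as follows. This is a minimal illustration, not the case's actual data: the figures and function names are hypothetical, and the real baseline would come from the measured releases described above.

```python
# Hypothetical sketch: deriving a function-point productivity baseline
# and a 19%-higher target. All figures are illustrative, not case data.

def productivity(function_points: float, effort_hours: float) -> float:
    """Delivered function points per hour of invested effort."""
    return function_points / effort_hours

# Illustrative baseline release: 320 FP delivered with 2000 hours of effort.
baseline = productivity(320, 2000)   # 0.16 FP/hour
target = baseline * 1.19             # target set 19% above the baseline

print(f"baseline = {baseline:.3f} FP/h, target = {target:.4f} FP/h")
```

Measuring output in function points rather than story points is what makes the baseline comparable across teams and against external benchmarks.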
The improvement goals arose from the discovery of several productivity offenders that had until then been hidden, because no metric isolated the measurement of production from the way the team organizes its sprints and backlog items.
The squads continue to self-manage, from prioritizing the team backlog and sprint backlog to using their own methods to estimate within each sprint (or not estimate, as the case may be). Roughly every three sprints, a new release is delivered.
At each release, overall productivity is calculated and evaluated against the target. Between releases, the team can assess its progress from a higher-level perspective using the story-point-to-function-point conversion factor, which is updated with each release. This review becomes a forum for discovering new improvement opportunities, obstacles encountered during release development, and alternative solutions.
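The conversion factor mentioned above can be sketched as a simple ratio, recalibrated at each release. The numbers and function names below are illustrative assumptions, not measurements from the case:

```python
# Hypothetical sketch of the story-point-to-function-point conversion
# factor, recalibrated whenever a release is measured.

def conversion_factor(fp_measured: float, story_points: float) -> float:
    """Function points delivered per story point completed in a release."""
    return fp_measured / story_points

def forecast_fp(story_points_done: float, factor: float) -> float:
    """Mid-release progress estimate, expressed in function points."""
    return story_points_done * factor

# Release N measured at 280 FP against 140 completed story points.
factor = conversion_factor(280, 140)   # 2.0 FP per story point
# Mid-way through release N+1, 60 story points are done:
print(f"estimated progress: {forecast_fp(60, factor):.0f} FP")
```

Because the factor is refreshed with every measured release, the squads keep estimating in their own units while management still sees progress in function points.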
There is greater transparency about how resources are applied, and discussions of possible operational-efficiency problems allow early identification and resolution. Rework is reduced to what is necessary for a cycle of experimentation, feedback, and learning, and the backlog is organized with this in mind.
How the parts have been integrated
Our Budget Center developed a system that gives the controller external visibility by setting productivity targets in function points, used for planning and monitoring development at the roadmap level.
We applied Analytics & Machine Learning techniques to establish statistical relationships between squad-level metrics, such as story points, and function-point measurements at the roadmap level, using linear regression, analysis of variance (ANOVA), and statistical tests.
Our Metrics Factory measured the releases using IFPUG Function Point Analysis, performed by CFPS-certified professionals.
All management monitoring tools were implemented as planning, monitoring, and control spreadsheets.