Metrics & Measuring Performance in QA

I’ve always thought of metrics in QA as a bit of a tricky subject, as I find it difficult to identify and attach meaningful numbers to performance in a role based around providing information.
Technically, there is stuff we can quantify, but I’m dead against keeping track of statistics like personal bug counts, numbers of tests executed and so on. Metrics like these encourage pointless bug reports, endless raising of non-issues, and underhanded tactics all over the place, so they give no true measure of an individual’s performance in the QA field. As I mentioned, the real measure of QA’s effectiveness and value is in the information they provide to their customers – the rest of their development team, the product and business teams they work with, and indeed, anyone else who is a stakeholder in the work the team carries out.
Even when trying to compare the performance of one person against another, the nature of QA means that, due to different pressures, time constraints, the relative state of the system under test and so on, you will never see different folks running the same test in exactly the same set of circumstances. So it’s unfair to use that sort of thing as a measure or comparison of performance either.
But I do understand the need to monitor performance, particularly for new hires or new additions to a team, and there are a few things I use to measure the performance, throughput and relative value of folks in QA. While many of these metrics are geared towards the performance of new members of a team, they could easily be adapted to track the progress and performance of established team members too.
Bug Quality
The general quality of bugs raised should be spot checked, with closer attention paid to bugs raised for issues missed in testing (indicative of areas where testing and detection methods should be improved) and bugs returned as ‘Will Not Fix’ (indicative of areas where priorities and understanding of requirements / product needs / customer needs should be improved). For new hires, I’d expect the numbers of such issues to decrease over time as the QA ramps up in their new domain. Also, keep an eye open for any bug reports that fail to accurately describe incorrect system behaviour, have been assigned an inappropriately low priority, or otherwise understate the significance of a problem. These will highlight areas where coaching is required to improve understanding of the system under test.
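For the trending side of this, here’s a minimal sketch in Python – working against a hypothetical bug-tracker export, since real field names will vary by tool – counting the two ‘coaching signal’ categories per month:

```python
from collections import Counter
from datetime import date

# Hypothetical shape of a bug-tracker export; field names will vary by tool.
bugs = [
    {"id": "BUG-101", "raised": date(2019, 1, 14), "priority": "P3",
     "resolution": "Fixed", "found_in": "Test"},
    {"id": "BUG-102", "raised": date(2019, 1, 20), "priority": "P4",
     "resolution": "Will Not Fix", "found_in": "Test"},
    {"id": "BUG-103", "raised": date(2019, 2, 2), "priority": "P2",
     "resolution": "Fixed", "found_in": "Production"},
]

def coaching_signals_by_month(bugs):
    """Count the two 'coaching signal' categories per calendar month:
    bugs that escaped testing (first seen in Production) and bugs
    closed as 'Will Not Fix'. For a new hire, both counts should
    trend downwards as they ramp up in their domain."""
    signals = Counter()
    for bug in bugs:
        month = bug["raised"].strftime("%Y-%m")
        if bug["found_in"] == "Production":
            signals[(month, "missed in testing")] += 1
        if bug["resolution"] == "Will Not Fix":
            signals[(month, "will not fix")] += 1
    return signals

for (month, category), count in sorted(coaching_signals_by_month(bugs).items()):
    print(f"{month}  {category}: {count}")
```

Run something like this against a monthly export; a downward trend in both categories is the ramp-up you’re looking for.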
 
Critical Bugs in Test vs Production
Keep an eye on the ratio of critical bugs (>=P2) raised in Test vs Production. Customer satisfaction is the true north of quality, and if there is more than a handful of instances of critical bugs being identified post-sprint, this could be indicative of a coaching need.
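As a rough illustration, that ratio can be reduced to a single trackable number – a sketch assuming the same hypothetical bug records as above, with ‘priority’ and ‘found_in’ fields:

```python
def critical_escape_rate(bugs, critical_priorities=("P1", "P2")):
    """Proportion of critical bugs that were first identified in
    Production rather than in Test. Closer to zero is better; a
    rising trend is a prompt to look at coverage and priorities."""
    criticals = [b for b in bugs if b.get("priority") in critical_priorities]
    if not criticals:
        return 0.0
    escaped = sum(1 for b in criticals if b["found_in"] == "Production")
    return escaped / len(criticals)

# e.g. two of ten criticals first surfacing in Production -> 0.2 (20%)
```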
 
Test Coverage for Applications
Whenever a new hire fills a vacancy, I’d expect to see an increase in test coverage over time. Establish the current baseline as the areas the team currently covers, and track it for increases. But, importantly, you must track for increases in areas where increases are expected. Don’t forget that, particularly with automation, there are upper limits for test coverage, so don’t make the mistake of setting a coverage target without first discussing and identifying the areas it is actually possible to cover. And as well as the expected coverage levels, any timescale you set must be realistic, or you risk setting an unachievable target.
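To make the ‘realistic ceiling’ point concrete, here’s one hedged way to frame it: measure progress against the coverage level the team has agreed is actually achievable, rather than against a notional 100%. The names and numbers below are purely illustrative:

```python
def coverage_progress(baseline, current, achievable):
    """Express coverage growth as a share of the achievable gap,
    not of a notional 100%. All values are fractions from 0 to 1."""
    if not baseline <= current <= achievable <= 1.0:
        raise ValueError("expected baseline <= current <= achievable <= 1.0")
    if achievable == baseline:
        return 1.0  # no gap left to close; treat as complete
    return (current - baseline) / (achievable - baseline)

# e.g. baseline 40%, agreed ceiling 75% (some UI flows can't be automated):
print(f"{coverage_progress(0.40, 0.54, 0.75):.0%} of the realistic gap closed")
```

Framed this way, the target is always one the team has already agreed is possible.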
Load Shift
When a new QA comes on board, overall team output should increase as the new member of the team ramps up and takes on more of the testing load. This measure is a little imprecise, and not entirely dependent on the new QA, but it’s still worth monitoring as an identifier for potential issues and bottlenecks in your team’s workflow, as well as for the performance of QA.
 
Overall increase in Story Turnaround & Completion
Keep track of the team’s commitments for each sprint, and of how many of those commitments were delivered to a high standard of quality. Again, this isn’t always going to be directly in the hands of QA, but where a team has recently filled a vacancy, I’d expect month-on-month increases in the number of stories committed to, the percentage of commitments met, and the speed with which stories are completed. Take the current averages as a baseline, and monitor for the expected increases.
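A minimal sketch of tracking those three numbers per sprint, using hand-rolled illustrative records (in practice this data would come straight from your project-tracking tool):

```python
# Illustrative per-sprint records: 'committed'/'delivered' are story counts,
# 'cycle_days' are completion times for the stories that were delivered.
sprints = [
    {"name": "Sprint 12", "committed": 10, "delivered": 7,
     "cycle_days": [5, 6, 4, 7, 5, 6, 8]},
    {"name": "Sprint 13", "committed": 11, "delivered": 9,
     "cycle_days": [4, 5, 4, 6, 5, 4, 5, 6, 5]},
]

for sprint in sprints:
    met = sprint["delivered"] / sprint["committed"]
    avg_cycle = sum(sprint["cycle_days"]) / len(sprint["cycle_days"])
    print(f"{sprint['name']}: {met:.0%} of commitments met, "
          f"average cycle time {avg_cycle:.1f} days")
```

Here the trend is the thing to watch – commitment, percentage met, and cycle time all improving sprint on sprint.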
 
Engagement
Not a ‘numbers’ metric, but arguably the most important one. Are the QA team making meaningful contributions to Retrospectives? To Planning & Estimation? How are they communicating the information they find during the course of their work? For new team members, I’d look for engagement to increase as they ramp up in their new domain and adapt to the team and company culture. But as QA professionals bringing a fresh pair of eyes to the team, I’d expect some level of insight and engagement from the very beginning. I’d also expect QAs to be actively involved in the solutions to any bugs / issues they raise – conferring with developers working on fixes, discussing how a fix should be retested, and so on – to improve their knowledge of the system under test and its workings.

Regardless of how you decide to measure performance in QA, it is worth remembering that any metric should be used as an informational tool rather than as any kind of absolute measure. The reality is that there is no substitute for getting to know what your folks are doing, the problems they encounter, how they handle those problems, and how they communicate with the people around them. These are the things the team and your customers will assess their performance on, and the truest measure of success is also the simplest – ‘Is the customer happy?’

Being Seen To Be Doing

It’s fair to say that I’ve neglected this blog a little – so it’s time to rectify that, and give a bit of an update on what’s going on in my world.

First things first – The SDET & Automation Network Meetup, hosted by Testing Circle in collaboration with Hotels.com & Expedia.com is undergoing a bit of a revamp. The plan is to relaunch in March, with a wider scope to cover all of software testing rather than just the SDET & Automation area. We’re still working on some finer details, and I’ll shout when we have something concrete to share. But we will be looking for speakers on all subjects connected with software testing to come and present to our audience.

Things are changing in my role as QA Evangelist too. I spent a decent part of the last year acting as… well, the best analogy I can come up with is that I’ve been kind of a roadie for the teams I work with. Staying out of the way, working out of sight to make sure that my charges can get on with what they do best with minimal disruption.

This, of course, is not the best approach for a QA Evangelist. Mine is a role that requires active engagement, visibility, and promotion of what I’m doing, as well as providing support and sharing my knowledge and experience with my teams. While I’ve had good ideas about things I could be doing as an Evangelist, I haven’t really followed through on them, and as a result I’ve got a lot of half-finished work floating around, and concepts I haven’t fully fleshed out. So rather than concentrating solely on working behind the scenes to make my teams look good, this is to be a year of execution, and of visibility through achievement – and I’m already making moves in that direction.

First of all, I’ve established a series of ‘Lead Amigos’ meetings with the Dev Managers and Scrum Masters in each of the teams I work directly with. This is a fortnightly series of 30-minute catch-ups to discuss what’s happening in the teams, and how we can support each other in our roles as technology managers and leaders. These meetings will help keep lines of communication open with my teams, surface their achievements and milestones, and pick up information and techniques that may be relevant to the rest of the company.

In terms of communication and presentations, I’ve now resurrected our Technology Podcast. This is published directly to the company’s internal blog, and is a series of sit-down talks with folks from all areas of our Tech Org to talk about their career path, their team and its function, the good stuff they’re doing, and whatever else comes to mind. The feedback for the first couple of episodes has been good, and I’ve already recorded a few more so I have a buffer in case I can’t get guests to sit down with me at any point. Speaking to folks on the record about their areas of technology has served to increase my knowledge about other areas of the business, and putting the podcasts out on an internal platform has both shared that knowledge, and helped to raise my profile within the business.

I currently have six presentations in the development stage, covering a number of skills and disciplines within software testing. My plan here is to get them finished, and deliver them as internal coaching sessions. I can also open up these sessions for other folks who want to refine talks they’re developing, and deliver them to a soft audience. And I’ll use the feedback from those sessions to further develop my talks to take them to external meetups and prepare them as conference submissions.

Oh, yes – Conferences and meetups. I don’t just intend to raise my profile internally (as important as that is) – I also plan to get out and about more, to speak at more events, and to meet other folks who I can bring in to speak at our company. The revamped Testing Meetup I mentioned before is going to be an important part of this – stay tuned!

And of course, this blog is going to be an important channel for me to float ideas, provoke thought and discussion of our discipline, and to generally get things noted down somewhere to keep myself honest in terms of actually getting things done.

While I’ve talked about raising my profile and being more visible in what I do, there’s an important caveat here – there is a world of difference between visibility through achievement and making noise for the sake of making noise. There are few things more annoying than someone who is always jumping up and down, demanding attention and yelling about everything they’re doing just for the sake of being heard. I have two young children, so I see plenty of that at home!

This is just the first batch of things I’m planning to work on in the coming year – I’m not treating this as an exhaustive, or indeed, concrete list. As priorities change and new ideas crop up, things may get added, improved, moved around, reprioritised etc. But this selection gives me a good base of useful projects that will benefit my company, and will improve my profile within it.

Just what an Evangelist should be doing.