2020 Tech Leaders’ Report

Top Trends in Software Monitoring

Introducing the Tech Leaders’ report: How software leaders bring code closer to the customer

In today’s world where software is all around us, a poor software experience can mean the difference between winning and losing. Tech leaders need visibility into problems across mobile applications, websites, and other digital channels, making monitoring more important than ever.

But the question remains: with so much data available, how do tech leaders use it to understand how their software is performing for those who matter most, their customers? And, in today’s hyper-competitive landscape, how can they innovate quickly while ensuring that software quality is maintained?

These questions were at the heart of Raygun's Tech Leaders’ panel series, which we launched in 2019 to create an open conversation around monitoring best practices. And the response was incredible.

We had tech leaders from prominent development teams like Nike, Microsoft, and Xero discuss monitoring on live panels in front of over 500 participants, with events hosted in five major cities: Wellington, Auckland, Christchurch, Seattle, and Portland.

During our events, we surveyed our attendees to get a general understanding of how others are measuring software quality, software performance, and digital customer experience.

The surveys surfaced some great insights, and we are excited to share our findings and recommendations about monitoring. We hope you find these results useful on your journey to monitoring success.

Metrics used by tech leaders

Most tech leaders prioritize measuring the total number of bugs as an indicator of software quality

Our data shows that the number of bugs is the metric tech leaders rely on most when assessing software quality, ahead of testing, Net Promoter Score (NPS), and uptime.

There is, however, a clear divide between tech leaders measuring the number of errors and those using other measures of software quality. The ‘Other’ category contained a wide range of metrics, each representing less than 1% of total respondents, including usability and stability.

It is interesting to note that the metrics fall into two categories: proactive (pre-deployment) and reactive (customer-facing). A notable number of tech leaders measure software quality with metrics that only register once customers have already been impacted, such as NPS, uptime, support tickets, and retention.

Recommendation

Total error counts can be misleading: 10,000 errors affecting a single customer is less severe than 500 errors affecting 250 customers. The number of users affected by bugs is therefore a better measure of software quality. Track affected customers every month, with the goal of reducing that number.
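As a rough illustration of the difference, here is a minimal TypeScript sketch contrasting a raw error count with the number of unique users affected. The ErrorEvent shape and field names are illustrative assumptions, not any particular monitoring tool’s API.

    // Illustrative event shape (not a real monitoring API).
    interface ErrorEvent {
      errorType: string; // e.g. "TypeError: x is undefined"
      userId: string;    // the customer who hit the error
    }

    // Raw volume: every occurrence counts, no matter who it hits.
    function totalErrorCount(events: ErrorEvent[]): number {
      return events.length;
    }

    // Impact: how many distinct customers were affected at least once.
    function usersAffected(events: ErrorEvent[]): number {
      return new Set(events.map((e) => e.userId)).size;
    }

By this measure, 10,000 occurrences hitting one user registers as a narrow problem, while 500 occurrences spread across 250 users registers as the broader quality issue it really is.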

“Everyone expects software to work all the time at 100%. To me, at least 95% of your customers should be having the best experience.”
Kathy Lee, Software Engineer at Microsoft

Which metric do you use to measure software quality?

Speed and response time is the most popular way to measure software performance

While speed and response time is an obvious metric for software performance, we didn’t expect to see such a variety of answers. It’s clear that success in software performance is measured differently from company to company.

Customer-facing metrics like uptime and NPS still have a prominent role to play, and at the extreme, 2% reported they don’t use any metrics to measure software performance.

Recommendation

Software performance is an imperfect science and can’t be represented in a single metric. For many websites, though, improving the user experience by speeding up page load times goes hand in hand with improved conversion rates. When considering how to measure web performance, tech leaders should focus on delivering better user experiences, and in doing so, consider newer user-centric web performance metrics like first paint. By focusing on user-centric performance metrics, tech leaders can make more meaningful progress toward better user experiences.
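For teams starting out, paint timings can be collected directly in the browser with the standard PerformanceObserver API. The sketch below simply logs them to the console; where the data is sent from there is left open.

    // Minimal sketch: observe paint timings with the browser's
    // standard PerformanceObserver API (no monitoring vendor assumed).
    const paintObserver = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        // entry.name is "first-paint" or "first-contentful-paint";
        // entry.startTime is milliseconds since navigation start.
        console.log(`${entry.name}: ${Math.round(entry.startTime)} ms`);
      }
    });
    // buffered: true also reports entries recorded before the observer was created.
    paintObserver.observe({ type: "paint", buffered: true });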

“It’s all about the timeline. If you’re always putting out customer fires, then your timeline is also wrong for how you are serving the customers.”
Rory Richardson, Head of Business Development, Serverless and Application Integration at AWS

How do you measure software performance? 

NPS is the most popular way to measure customer experience

Our data shows that customer experience is almost exclusively measured using Net Promoter Score (NPS) and customer feedback. The ‘Other’ category represents answers that fell below 1% and includes respondents measuring the number of sessions and feature usage as a measure of customer experience.

Some tech teams have no formal measures of customer experience in place, likely because customer experience is difficult to quantify and report on.

Recommendation

While NPS is valuable, it can be misleading when treated as a proxy for customer experience: plenty of teams celebrate a good NPS score even though their software is slow and buggy. NPS is an excellent path to real insights, but because it only reflects the customers who respond, it can’t accurately represent your whole customer base.

Software teams can use NPS without leaning on it as the sole indicator of customer experience. While NPS might be your current best indicator of a customer’s overall experience, support it with engineering metrics like time to first paint and median load time; your reports will stand out.
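As a small sketch of the arithmetic (assuming you already collect per-page-view load times in milliseconds), the median is simply the middle value of the sorted timings:

    // Median page-load time over collected timings (milliseconds).
    // The median resists being skewed by a handful of very slow outliers,
    // which is why it complements an average in reports.
    function medianLoadTime(loadTimesMs: number[]): number {
      if (loadTimesMs.length === 0) return NaN;
      const sorted = [...loadTimesMs].sort((a, b) => a - b);
      const mid = Math.floor(sorted.length / 2);
      return sorted.length % 2 === 0
        ? (sorted[mid - 1] + sorted[mid]) / 2
        : sorted[mid];
    }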

“Start with the basic top-down question of ‘how do I monitor customer experience.’ Then, go ground-up — infrastructure, network, cloud. If someone’s having a bad experience, and it’s not performing properly, how do I get to the root cause of that quickly?”
Niamh Cahill, Solutions Architect Manager at Chef Software

How do you measure the digital customer experience?

There are still challenges to face when monitoring — resources and time being the major ones

We know that there can be many roadblocks to implementing a monitoring strategy, from a lack of company-wide buy-in to software complexity and technical debt, all of which were represented in our data. The most significant challenge for our tech leaders is limited time and resources, with internal communication coming in second.

Scalability, technical debt, and complexity feature heavily in the responses to this question, perhaps a reflection of attendees from the financial, government, and transport sectors, which tend to be held back by older architecture and less modern development practices.

Recommendation

While a lack of resources is always a problem, monitoring ultimately saves you money and time. Once monitoring is in place, you can use the data it produces to make the case for the resources you need.

“Everyone has a role to play when getting software into the hands of customers.”
Sonya Williams, Co-Founder and Director at Sharesies

What is the biggest challenge you face when monitoring what matters?

What do tech leaders wish they could measure?

A majority of tech leaders wish they could measure overall customer satisfaction with their software. Measuring software health and performance was also high on the wish list; we hope this report has highlighted where to get started.

The ‘Other’ category represents answers that were less than 1% of the total data and includes testing and percentage of technical debt.

Recommendation

While customer satisfaction remains difficult to measure, software health and performance can be measured using the methods outlined in this report.

What do you wish you could measure that you can't currently? 

Job descriptions and industries this report represents

In this report, most of the attendees surveyed held a leadership role within their company. Industry-wise, SaaS was the best represented, while the utility industry was the least represented.

Job descriptions
Industries

Methodology

Raygun conducted five surveys across five cities during 2019, with the goal of identifying trends in current monitoring practices among tech leaders. The surveys were conducted in person, and the data was compiled anonymously after each event. There were 296 total responses.

Final thoughts and thanks

If there is one conclusion we can draw from this report, it’s that monitoring is no easy task. Software changes quickly, which is all the more reason to use the data gained from monitoring to bring confidence to key leadership decisions, such as where the team should spend its time and how to communicate success to company executives.

About Raygun

Raygun gives you the data you need to put metrics behind software quality and customer experience. Whether you are hunting down a bug or trying to speed up web pages vital for conversions, you’ll have the confidence to do it faster.

“I did implement Raygun on my iOS apps. When a user hits a crash, those crashes get centralized into one location and bucketed automatically. I would get a call stack and a line number where things were crashing. It felt to me that I got a gift from my customer in the form of a bug report.”
Scott Hanselman
Partner Program Manager at Microsoft