Companies are missing opportunities to grow revenue and improve margins by approaching the data they collect too narrowly, data specialists said in a SaaStr webcast last week.
Many software-as-a-service (SaaS) companies are investing heavily in data collection, but they analyze only the most obvious signals in their metrics, such as churn. They can get a more accurate picture of performance, more quickly, by using AI to analyze a broader range of data, even when its relationship to a metric isn’t obvious, said Berit Hoffmann, vice president of product at data analysis software company Sisu.
One large business-to-business SaaS company Sisu worked with started seeing unusually heavy churn even though one of the main data points it monitored regularly showed no change.
"The typical approach for them was to look at an attribute of customers that they already knew had an important influence on churn rate, and that was what insurance provider the customer was using," Hoffmann said.
Once the company looked at other data points, including the number of employees each customer had and the product features customers were using, it discovered other patterns that could anticipate changes in churn.
"By getting into these more nuanced insights, and using all of their data, they were able to create a much more targeted and, as a result, much more effective engagement strategy," she said. "Which, for them, translated into seven figures of additional retained annual recurring revenue."
Limited data use
Despite the increasing focus on data, especially among SaaS companies, only 54% of executives use data analyses in their decisions, and 73% say they’re not getting what they expected out of their investment, Sisu CEO Peter Bailis said. "We have more data but somehow we're not putting it into action."
Limiting analysis to small, obvious slices of the data is part of the problem, but the process itself can also be an issue, he added.
He recommended integrating the data team more closely with the business functions, or at least bringing the team into the strategic process earlier so it can help shape which questions to ask.
"Sending a request to the analytics team without sharing the business context, the strategic reason behind why a question’s being asked, leads to wasted cycles," he said.
At one company Sisu worked with, executives directed the data team to look at what product features generated the most engagement, but that wasn’t helpful to the products team, he said. The products team wanted to know what engagement thresholds led to adoption.
"When they moved [the product] over to general availability, in a self-service trial, the products team wanted to know what behaviors they should incentivize people to perform," Bailis said. "Should they invite three friends, five friends? Should they sign up for three subscriptions? They didn’t get the right granularity."
Limited report formats
Other problems stem from inconsistent and cumbersome reporting formats, the specialists said.
Executives often decide against requesting follow-up reports, even when the initial reports aren’t giving them the insight they need, because getting what they want is too time-consuming and unwieldy, Bailis said.
The problem can largely solve itself if the company uses a reporting system that lets executives manipulate the data themselves in targeted ways, such as changing the reporting period from 14 days to 28 days.
"Many of the follow-ups that business leaders want to ask of their data are simple variances of the results presented," he said.
Inconsistent formats are another issue, especially when the data team is answering a question it hasn’t had to address before. If the reporting system isn’t flexible, the data team will create one-off reports that don’t work seamlessly with other reports.
"Data teams wind up spinning up ad hoc or situational analyses," Hoffmann said. "People jump into Excel, slides, Python notebooks [as they] come up with their own format to report results."
Delayed response
How quickly executives can act on their data reports is another big problem area, the specialists said. If there’s a week-long lag between generating the data and reporting on it, and then another week spent analyzing it, too much time will have elapsed to keep an at-risk customer from leaving, or otherwise solve a problem or capitalize on an opportunity.
Bailis pointed to a tech company that lost a strategically important customer to churn because it took the company two weeks to identify the problem and reach out with a solution.
"When the product leader called up the customer, that person said, 'Sorry, we already made the decision to churn out,'" he said.
With all of these issues, Bailis said, it’s not that the data isn’t helpful in decision-making; it’s that the tools for processing and reporting the data aren’t making it accessible in the most useful way.