Defining and calculating the ROI of learning and development is a complex topic, which we’ve explored at length. Like most chunky challenges, it’s easier to tackle if you break it down into smaller pieces. In terms of the L&D puzzle, one of those smaller pieces is customer education. Proving the business impact of your customer education programs is an important (and compelling) step towards telling the broader L&D ROI story.
When it comes to capturing the ROI of your customer education programs, here are three steps you won’t want to skip:
- Define business outcomes and align your customer education programs with them
- Create cross-functional teams that maximize the impact of programs
- Strategically analyze outcomes and craft compelling, data-driven stories
In a recent webinar, Adam Ballhaussen (Senior Director of Customer Education & Advocacy, Docebo) and Samantha Murray (Director of Solutions Marketing, Docebo) tackled the subject and gave participants a behind-the-scenes look at our approach to capturing customer education impact.
You can watch the full recording here, but for a five-minute recap, read on!
Customer education business outcomes
Business outcomes typically fall into two broad categories: decreasing costs and increasing revenue. The path customer education takes to reach those endpoints is what we need to map out.
The folks at Catalyst.io developed a framework (which Adam shared in the webinar) to help illustrate this: GEAR.
- Growth efficiency. Customer education programs can reduce post-sale costs or improve margins.
- Expansion. Tap into new revenue streams (e.g. education-driven upsells).
- Advocacy. Turn customers into evangelists through education to improve word of mouth.
- Retention. Customer education can mitigate churn risk.
The GEAR framework
These elements can be further broken down into more specific outcomes, for example:
- Increase brand awareness (A)
- Increase qualified leads (E, G)
- Improve conversion rates (E)
- Decrease time-to-value (G, R)
- Improve product adoption (G, R)
- Reduce reliance on support (G, R)
- Improve net promoter score (A, R)
- Increase renewal rates, upsells, and LTV (G, E, R)
- Improve loyalty (A)
- Feed user generated content into education programs (G, A)
Any or all of these outcomes can be traced back to the bottom line. This defines the path to impact, but these paths need to be built and maintained. This is where cross-functional teams come in.
The foundational role of cross-functional teams
Proving business impact for a complex system like customer education requires cooperation from departments across the organization. Cross-functional teams enable the sharing of information, data, and processes that will:
- improve the overall efficiency of a customer education program
- provide insight and data critical to telling the impact story
If the business outcomes for customer education programs are all owned by L&D, then you simply won’t be able to explore some of them (e.g. upselling and renewals are a sales function).
If the business outcomes are spread out into silos, then you may be carving out several paths to impact, but everyone will be doing it in a different way, with different priorities, and with varying levels of success. (Unsurprisingly, this siloed approach is extremely inefficient.)
“Build strong cross-functional relationships across your organization to maximize the impact of your Customer Education programs.”
Cross-functional teams bring people from different areas together under a common mandate. For example, at Docebo, our Customer Experience team consists of groups including: Support, Professional Services, Customer Success, Renewal Management, and Customer Education. The Customer Experience team works very closely with Marketing as well.
Each group within the team owns one or two “paths” and business outcomes as described above, and each understands and relies on the others to do their job.
Docebo’s cross-functional CX team
Crafting compelling, data-driven stories
With business outcomes defined and the right team structure in place to deliver them, the true work of proving business impact begins.
Metrics are the raw ingredients for your story. Obtaining key metrics or performance indicators is much, much easier when outcomes are clear and there is strong communication between teams that have access to them.
“Prioritize Customer Education programs that are aligned to shared goals and will directly drive shared business outcomes across your organization.”
Examples of strong customer education success metrics
- Self-service score. The ratio of people helped by static knowledge base (KB) content to help tickets submitted. Higher is better. For example, a score of 1.9 means that for every ticket submitted, almost two people were able to self-serve.
- Search-to-ticket ratio. Similar to the above, but expressed as a percentage. For example, 5% means that for every 100 searches on a KB only 5 tickets were created. Lower is better for this metric.
- Content views. Track how many views and how much time was spent on customer education content (KB articles, courses, etc.). More views and more time typically indicate that more education happened.
- Active accounts. What ratio of accounts within a learning community or LMS show activity? This can be time-gated (e.g. active within the last week). Higher numbers over shorter periods are better.
- Attendance rates. For live events, attendance rates are a common way to gauge interest and engagement.
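To make the two ratio metrics above concrete, here is a minimal sketch of how they might be computed from raw support and KB counts. The counts themselves are hypothetical, chosen only to reproduce the example scores from the definitions; plug in your own analytics data.

```python
# Hypothetical monthly counts -- substitute figures from your own
# support desk and knowledge base analytics.
tickets_submitted = 120   # help tickets opened this month
self_served_users = 228   # users helped by static KB content
kb_searches = 2400        # total searches run against the KB

# Self-service score: people helped per ticket submitted (higher is better).
self_service_score = self_served_users / tickets_submitted
print(f"Self-service score: {self_service_score:.1f}")        # -> 1.9

# Search-to-ticket ratio: tickets created per 100 KB searches (lower is better).
search_to_ticket_pct = tickets_submitted / kb_searches * 100
print(f"Search-to-ticket ratio: {search_to_ticket_pct:.1f}%")  # -> 5.0%
```

With these inputs the script reproduces the worked examples above: a self-service score of 1.9 and a search-to-ticket ratio of 5%.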
Metrics like these are only the beginning. Popping the raw data into a report is not enough to truly prove business impact. Alone, these data points are merely observational; they must be tied to the business outcomes themselves and, wherever possible, assigned a value (even if estimated).
Analyzing the data and telling the story
Deriving meaning from the raw metrics is a core challenge, and there’s no one-size-fits-all way to approach it. The angles you take and the focus of your story will depend on your organization’s priorities.
A common example is ticket deflection. Your raw metrics (e.g. self-service score, search-to-ticket ratio) can help your analysts estimate the number of tickets your programs helped avoid. It should also be possible to estimate the average “cost per ticket” in terms of a support person’s time and resources.
In this case, the data-driven story goes something like: “Our customer education programs deflected an estimated 150 tickets this month, which amounts to about $50,000 in cost savings.” Once you have a value nailed down, compare that to the costs of the customer education program to get a sense of ROI.
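As a sketch, that deflection-to-ROI arithmetic might look like the following. The cost-per-ticket and program-cost figures here are assumptions for illustration, not Docebo’s actual numbers:

```python
# Hypothetical inputs -- replace with your own estimates.
deflected_tickets = 150   # tickets the program helped avoid this month
cost_per_ticket = 333     # assumed avg support cost per ticket (time + resources)
program_cost = 20_000     # assumed monthly cost of the education program

# Estimated savings: tickets avoided times what each one would have cost.
savings = deflected_tickets * cost_per_ticket

# ROI: net benefit relative to what the program cost to run.
roi = (savings - program_cost) / program_cost

print(f"Estimated savings: ${savings:,}")
print(f"Program ROI: {roi:.0%}")
```

With these assumed inputs, roughly $50,000 in deflection savings against a $20,000 program cost yields an ROI of about 150%.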
“You have to manage up & out and tell the right story to ensure executives are aware of the success of your programs.”
Here are a few examples from our own internal analyses of our customer education programs to illustrate how their results feed into the broader L&D ROI story (check out the full webinar for even more behind-the-scenes info):
- 8 hours saved per CSM per week
- 24% increased product adoption quarter over quarter
- 6.4% reduction in implementation time
That last figure, the reduction in implementation time, translated to $600,000 in value annually! Lead generation and direct revenue (e.g. from upsells or renewals) can also make for fairly straightforward data-driven impact stories.
Want to dive deeper?
Join Adam and Samantha for a deeper dive into these topics, as well as a demo of some of our own customer education program tactics. We show you how we tie business outcomes to things like onboarding, release readiness programs, live training, how-to videos, and more.
The recording of our webinar, How to prove the impact of your customer education program, can be found here.