Use Case 1: Maximizing Knowledge Base ROI

Is your article work improving your knowledge base ROI?

Article optimization improves customer satisfaction and maximizes knowledge base ROI. But how does a company determine whether its knowledge base investments are justified? In this use case, we demonstrate how SmartTEST can substantiate knowledge base investments by showing that the work invested in rewriting an article translates into monthly ROI savings and improved customer satisfaction. The use case is drawn from one of our customers.

Use Case Overview

Improving the knowledge base

In this use case, Safeharbor’s knowledge base experts who specialize in article optimization recommended a remake of the article titled ‘Online Account Updating’. The new test article was given a more appropriate title: ‘How do I update payment information?’ and was updated with more succinct and informative content as well as improved organization.

The Test Article

More Informative Content: Safeharbor’s knowledge base experts improved readability and added content to make the article more informative.

Improved Organization: Pictures and hyperlinks were added to facilitate navigation and help users understand how to update their account.

More Appropriate Title: The title was changed to better reflect the main purpose of the article.

Safeharbor’s knowledge base experts predicted that the adopted changes would improve the user experience and increase call deflection. To substantiate these claims and measure the exact effectiveness of the new article, in particular its ROI, the knowledge base administrator ran a SmartTEST Experiment.

In this use case we look at how…

A SmartTEST Experiment is set up.

Quantitative results are collected and interpreted.

SmartTEST demonstrates ROI savings from self-service and visitor satisfaction improvements.

SmartTEST substantiates the work that went into optimizing the article.

Setting up the SmartTEST Experiment

Setting up the experiment takes just a few minutes! A new trial is created in SmartTEST, and the original and reworked articles are selected for the experiment.

Trial Customization

The administrator can easily customize how the trial is conducted. The experiment can run for a predetermined amount of time or until significant improvements are seen, whichever comes first. After the experiment ends, the administrator can choose to keep the winner, stick with the control article, or continue sampling. The sampling rate can also be set if the experiment needs further customization. SmartTEST is fully automated; just set the parameters and let SmartTEST do its work!
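The stopping rule just described – run for a fixed duration, or end early once a significant difference appears, whichever comes first – can be sketched in a few lines. This is a conceptual illustration only, not SmartTEST’s actual implementation; the function names, sample counts, and the 95% z-threshold are assumptions made for the example.

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: z statistic for the difference in
    outcome rates between control (a) and test (b) articles."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def should_stop(elapsed_days, max_days, z, z_threshold=1.96):
    """Stop when the time budget is spent OR the difference is
    significant at ~95% confidence, whichever comes first."""
    return elapsed_days >= max_days or abs(z) >= z_threshold

# Hypothetical data: the test article produces fewer support
# escalations (80/1000) than the control article (120/1000).
z = z_test_two_proportions(conv_a=120, n_a=1000, conv_b=80, n_b=1000)
print(round(z, 2), should_stop(elapsed_days=10, max_days=30, z=z))
```

With these made-up counts, the difference is already significant at day 10, so the trial can end early rather than running its full 30 days.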

SmartTEST Outcomes

To determine whether a knowledge base article is successfully deflecting support queries, SmartTEST tracks and analyzes on-page actions. To help SmartTEST determine which actions affect ROI, the administrator selects outcomes that have a positive, negative, or neutral effect. These outcomes form the basis for hard ROI feedback as well as qualitative feedback about user behavior and article effectiveness.

For example: Knowledge base administrators consider escalation to an alternate support channel (phone / email / case submission) a negative outcome because the company spends money on every direct customer support request.

Meanwhile, clicking an internal link or exiting the knowledge base page constitutes a positive outcome because it indicates that visitors are reading the article and following breadcrumb links or exiting the page as anticipated.

Examples of Negative Outcomes: Negative outcomes represent undesirable user actions and form the basis for quantitative ROI evaluation. Since the goal of SmartTEST is to demonstrate whether an experimental article is able to deflect calls, a decrease in negative outcomes indicates that the company is spending less money on direct support inquiries. Examples include: using the ‘Contact Us’ link, ‘Email Us’ link, or ‘Call Us’ link, submitting a Support Case, leaving a negative article rating, etc.

Examples of Positive Outcomes: Positive outcomes represent desirable user actions. Examples include: exiting the knowledge base page, leaving a positive article rating, clicking a specific link that an administrator is trying to direct users to, etc.

Examples of Neutral Outcomes: On-page links that don’t impact article ROI and are tracked for observational purposes. For example, the administrator might want to know which links on the page are most popular or how many users go to a specific page after reading the article.
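To make the three outcome categories concrete, here is a minimal sketch of how tracked outcome events might be tallied against per-outcome ROI values. The event names and dollar amounts are hypothetical, not SmartTEST’s actual data model; in practice, the negative-outcome values would come from the company’s own average support costs.

```python
# Hypothetical per-outcome ROI values in USD. Negative outcomes
# carry a cost; positive and neutral outcomes are tracked but,
# in this sketch, contribute no dollar value.
OUTCOME_VALUES = {
    "submit_support_case": -12.00,  # negative: avg cost per case
    "contact_us_click":     -8.00,  # negative: avg cost per call
    "negative_rating":       0.00,  # negative, qualitative only
    "positive_rating":       0.00,  # positive, qualitative only
    "my_account_click":      0.00,  # positive: self-service success
    "browse_by_subject":     0.00,  # neutral: observational
}

def tally_roi(events):
    """Sum the ROI impact of a stream of tracked outcome events;
    unknown events are treated as neutral."""
    return sum(OUTCOME_VALUES.get(event, 0.0) for event in events)

events = ["my_account_click", "submit_support_case",
          "contact_us_click", "browse_by_subject"]
print(tally_roi(events))  # -20.0
```

The same event stream also feeds the qualitative side: counting how often each neutral outcome fires shows which links are most popular, without affecting the ROI total.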

Selecting Outcomes in SmartTEST

To analyze how users respond to the improvements adopted by the knowledge base experts, the SmartTEST administrator selects positive, negative, and neutral outcomes. These are simply dragged and dropped from the article interface. To save you time, the most common outcomes can be preselected and will automatically populate the outcome fields.

In this use case, some positive, negative and neutral outcomes have been set as defaults. For example, negative outcomes, which form the basis for hard ROI valuation, have already been assigned ROI values for call or interaction deflections.

Negative Outcomes

Submitting Support Case: Every time a visitor clicks the link and submits a support case, the company has to spend money on customer support. The SmartTEST administrator assigns this action a negative ROI value equal to the amount that is spent on an average support case.

Contact Us link: Clicking the ‘Contact Us’ link indicates that the knowledge base user was not able to find the answer. Contacting the company uses customer support resources; therefore, the SmartTEST administrator gives this action a negative ROI value equal to the average amount the company spends on phone support.

Negative article rating: A decreased customer satisfaction rating indicates that the user was not satisfied with the knowledge base article. Thus, this action is selected as a negative outcome.

Positive Outcomes

Close Browser: SmartTEST generally treats closing the page as an indicator that the customer found the answer or chose not to contact the company directly. With sufficient page views, this general signal reliably favors the better article.

My Account link: The SmartTEST administrator lists the ‘My Account’ link as a positive outcome because it indicates that a user read the article, learned how to update their account, and is now doing so. A successful outcome occurs here when more users of the test article proceed to update their account directly instead of calling the company as they have in the past.

Positive article rating: An improved customer satisfaction rating that is submitted voluntarily is a strong predictor of whether the knowledge base was effective at providing the right answer.

Neutral Outcomes

For neutral outcomes, the knowledge base administrator simply wants to observe user behavior. Knowing where users go after reading the article provides administrators with insight into how the knowledge base can be further optimized.

Browse by Subject links: The knowledge base administrator selects various ‘Browse by Subject’ links to see whether the new page influences where users go after reading the article. You can select as many outcomes as you like, ensuring that every user action gets tracked.

Running the experiment

With the control and experimental pages selected and the positive, negative, and neutral outcomes set, SmartTEST testing can begin. A running experiment displays live reports, allowing the knowledge base administrator to see results as they come in.

SmartTEST Report

The dashboard – illustrated on the right – provides a general overview of the trial and displays the experiment’s Completion Status, Results, and Estimated ROI.

The full SmartTEST Report – which is accessed from the dashboard – presents an in-depth analysis of the experiment, providing information about the Average Time Spent on Page, Total Page & Unique Page Views, Percentage of the Page Scrolled, Estimated ROI, and user link interaction, i.e., Positive, Negative, and Neutral Outcomes.

You can view the full Use Case 1 Report here.

Results of the experiment

Looking at the SmartTEST Report, the knowledge base administrator can see that the experimental article produced the positive improvements the company expected to see:

Dramatic Improvements in ROI: The SmartTEST dashboard shows that the experimental article produced a significant ROI improvement, saving the company $6,876 since the start of the trial. The quantitative data on this article alone demonstrates that the work invested in the knowledge base is paying off and gives the knowledge base administrator a foundation from which systematic article improvement can continue.

Significant Decrease in the number of Negative Outcomes: ‘Submitting Support Case’ and ‘Contact Us’ outcomes decreased on the experimental article page compared to the control. Because SmartTEST shows that fewer support cases were submitted, the administrator has quantitative evidence that the new article is saving money by deflecting calls!

Significant Increase in the number of Positive Outcomes: More users navigated to the ‘My Account’ page, which indicates that the article was successful at explaining how to update payment information. The increased number of clicks also shows that more users exited the knowledge base and proceeded to the account update page after reading the article – yet another indicator that users found the answer they were looking for!

Improvement in Article Ratings: SmartTEST shows that article ratings have improved and users are more satisfied with the new version of the article – more credit to the work the company invested in improving the new page!
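As a rough illustration of how a savings figure like this can be derived, the sketch below compares the cost of negative outcomes on the control page against the test page. The outcome counts and unit costs are entirely hypothetical and do not reproduce SmartTEST’s internal calculation or the $6,876 figure above.

```python
def estimated_savings(control_counts, test_counts, unit_costs):
    """Savings = cost of negative outcomes on the control page
    minus cost of the same outcomes on the test page."""
    def cost(counts):
        return sum(counts[k] * unit_costs[k] for k in unit_costs)
    return cost(control_counts) - cost(test_counts)

# Hypothetical unit costs and per-article outcome counts over
# the trial period.
unit_costs = {"support_case": 12.00, "phone_call": 8.00}
control = {"support_case": 300, "phone_call": 250}
test    = {"support_case": 180, "phone_call": 140}
print(estimated_savings(control, test, unit_costs))  # 2320.0
```

The key point is that every deflected case or call converts directly into dollars saved, which is what lets an administrator express article improvements as hard ROI rather than a vague satisfaction gain.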

Conclusion

SmartTEST is a powerful tool for optimizing knowledge base performance and maximizing monthly ROI. In this use case, we demonstrated how a company was able to substantiate a remake of an under-performing knowledge base article and produce quantitative improvements.

SmartTEST is a must-have tool for a knowledge base administrator. Without SmartTEST, evaluating the work that knowledge base experts put into improving this knowledge base article would pose a daunting challenge.

A knowledge base administrator can look at a few available metrics, such as survey results or the number of submitted support requests, to gauge whether knowledge base work produces a positive or negative net effect. However, these numbers offer little help in determining precise ROI cost savings and, more importantly, leave administrators without a quantitative foundation on which to substantiate further knowledge base investments or discover customer usage behavior.

SmartTEST gives your company real tools for measuring knowledge base ROI and provides an empirical foundation for any work that goes into improving a company’s website. Watch the money you invest in your knowledge base generate real ROI savings and systematically improve customer service!

About Safeharbor