Use Case 2: Improving Navigation and Directing Users

Are your customers finding their answers quickly and intuitively? Direct them to the best answer with the help of SmartTEST.

Directing users to the most relevant answer in the shortest time possible should be the goal of any company that relies on customer self-service. In this use case, we demonstrate how SmartTEST can substantiate improvements in article organization and navigation that quickly direct customers to the best possible answer. The use case is drawn from the experience of one of our customers.

Use Case Overview

Improving the knowledge base

An article titled ‘Bill Summary’ was a high-traffic page for this customer. The company’s knowledge base administrator didn’t want to do a complete rework of the article, as demonstrated in Use Case 1: Maximizing Knowledge Base ROI. Instead, the administrator wanted to improve article navigation and organization to ensure that users were heading in the right direction to find the most relevant information.

The administrator optimized the article by making two changes to the original page: improving the article’s navigation and organization, giving users a clearer path to the information they are looking for, and changing the title to ‘Understanding Your Bill’ – a title that better represents the article’s main purpose.

The Test Article

Improved Navigation and Organization: Solid blocks of text were broken down into ‘Commercial’ and ‘Residential’ sections to improve organization. Hyperlinked titles were added at the top to give users a quick shortcut to these sections and to facilitate article navigation. The ‘Commercial’ and ‘Residential’ sections were further refined with additional hyperlinks and illustrations, making it easier to move through the article.

More Appropriate Title: A new title ‘Understanding Your Bill’ was chosen because the article thoroughly explains every part of the bill rather than just giving a summary.

The knowledge base administrator predicted that the adopted changes would improve user navigation and direct users to the appropriate answer faster and more efficiently. To substantiate these claims and the work invested in the remake of the page, and to determine the exact effectiveness of the new article and the monthly ROI savings, the knowledge base administrator set up a SmartTEST Experiment.

In this use case we look at how…

The SmartTEST Experiment is set up.

Quantitative results are collected and interpreted.

SmartTEST demonstrates improvements in the article’s organization and navigation, as well as in visitor satisfaction.

SmartTEST substantiates the work that went into optimizing the article.

Setting up the SmartTEST Experiment

Setting up the experiment takes just a few minutes! A new trial is created in SmartTEST, and the original and reworked articles are selected for the experiment.

Trial Customization

The administrator can easily customize how the trial will be conducted. The experiment can run for a predetermined amount of time or until significant improvements are seen, whichever comes first. After the experiment ends, the administrator can choose to keep the winner, stick with the control article, or continue sampling. The sampling rate can also be set if the experiment needs further customization. SmartTEST is fully automated; just set up the parameters and let SmartTEST do the work!
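As a rough illustration of how such a stopping rule behaves, here is a minimal sketch in Python. The parameter names, duration, significance threshold, and sampling rate below are hypothetical placeholders, not SmartTEST’s actual settings or API:

```python
from datetime import datetime, timedelta

# Hypothetical trial parameters -- illustrative values only.
MAX_DURATION = timedelta(days=30)   # predetermined amount of time to run
SIGNIFICANCE_LEVEL = 0.05           # p-value below which the improvement counts as significant
SAMPLING_RATE = 0.5                 # share of visitors shown the experimental article

def should_stop(started_at: datetime, p_value: float, now: datetime) -> bool:
    """Stop when the time limit is reached or significance is seen, whichever comes first."""
    out_of_time = now - started_at >= MAX_DURATION
    significant = p_value < SIGNIFICANCE_LEVEL
    return out_of_time or significant
```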

SmartTEST Outcomes

To determine whether a knowledge base article is successfully deflecting support queries, SmartTEST tracks and analyzes on-page actions. To help SmartTEST determine which actions improve ROI, the administrator selects outcomes that have a positive, negative, or neutral effect. These outcomes form the basis for hard ROI feedback as well as qualitative feedback about user behavior and article effectiveness.

For example: Knowledge base administrators consider escalation to an alternate support channel (phone / email / case submission) a negative outcome because the company spends money on every direct customer support request.

Meanwhile, clicking an internal link or exiting the knowledge base page constitutes a positive outcome, because it indicates that visitors are reading the article and either following its links or leaving the page as anticipated.

Examples of Negative Outcomes: Negative outcomes represent undesirable user actions and form the basis for quantitative ROI evaluation. Since the goal of SmartTEST is to demonstrate whether an experimental article is able to deflect calls, a decrease in negative outcomes indicates that the company is spending less money on direct support inquiries. Examples include clicking the ‘Contact Us’, ‘Email Us’, or ‘Call Us’ links, submitting a support case, leaving a negative article rating, etc.

Examples of Positive Outcomes: Positive outcomes represent desirable user actions. Examples include exiting the knowledge base page, leaving a positive article rating, clicking a specific link that an administrator is trying to direct users to, etc.

Examples of Neutral Outcomes: Neutral outcomes are on-page links that don’t impact article ROI and are tracked for observational purposes. For example, the administrator might want to know which links on the page are most popular or how many users go to a specific page after reading the article.

Selecting Outcomes in SmartTEST


To analyze how users respond to the improved article, the SmartTEST administrator selects positive, negative, and neutral outcomes. These are simply dragged and dropped from the article interface. To save you time, the most common outcomes can be preselected and will automatically populate the outcome fields.

In this use case, some positive, negative and neutral outcomes have been set as defaults. For example, negative outcomes, which form the basis for hard ROI valuation, have already been assigned ROI values for call or interaction deflections.

Positive Outcomes

Close Browser: SmartTEST generally considers closing the page an indicator that the customer found the answer or chose not to contact the company directly. It is a general signal that, given a sufficient number of page views, helps prove out which article is better.

Positive article rating: A voluntarily submitted positive satisfaction rating is a strong indicator that the knowledge base article provided the right answer.

Internal Article Links: The knowledge base administrator considers interaction with links within the articles a sign of positive behavior. Every click represents another step toward the right answer; therefore, an article with a high rate of link interaction indicates that users are clicking the links and getting to the information they are looking for. This is a great way to test whether article organization and navigation have improved!

Negative Outcomes

Submitting a Support Case: Every time a visitor clicks the link and submits a support case, the company has to spend money on customer support. The SmartTEST administrator assigns this action a negative ROI value equal to the average amount spent on a support case.

Contact Us link: Clicking the Contact Us link indicates that the knowledge base user was not able to find the answer. Contacting the company uses customer support resources; therefore, the SmartTEST administrator gives this action a negative ROI value equal to the average amount the company spends on phone support.

Negative article rating: A negative satisfaction rating indicates that the user was not satisfied with the knowledge base article. Thus, this action is selected as a negative outcome.

Neutral Outcomes

For neutral outcomes, the knowledge base administrator simply wants to observe user behavior. Knowing where users go after reading the article provides the administrator with insight into how the knowledge base can be further optimized.

Browse by Subject links: The knowledge base administrator selects various ‘Browse by Subject’ links to see if the new page has any influence on where users go after reading the article. You can select as many outcomes as you like, ensuring that every user action gets tracked.
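For illustration, the outcome selection described above could be represented roughly as the configuration below. The action names mirror the outcomes in this use case, but the identifiers and the ROI values attached to the negative outcomes are placeholders; the real costs are whatever the administrator assigns for call or case deflection:

```python
# Illustrative outcome configuration for this use case (not SmartTEST's actual format).
# Negative outcomes carry a placeholder per-contact cost; positive and neutral
# outcomes carry no ROI value and are tracked to observe behavior.
OUTCOMES = {
    "close_browser":         {"category": "positive", "roi_value": 0.0},
    "positive_rating":       {"category": "positive", "roi_value": 0.0},
    "internal_article_link": {"category": "positive", "roi_value": 0.0},
    "submit_support_case":   {"category": "negative", "roi_value": -15.00},  # placeholder cost per case
    "contact_us_link":       {"category": "negative", "roi_value": -9.00},   # placeholder cost per call
    "negative_rating":       {"category": "negative", "roi_value": 0.0},
    "browse_by_subject":     {"category": "neutral",  "roi_value": 0.0},
}

def tally(events):
    """Count tracked actions by outcome category for one article variant."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for action in events:
        outcome = OUTCOMES.get(action)
        if outcome:
            counts[outcome["category"]] += 1
    return counts
```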

Running the experiment

With the control and the experimental pages selected and the positive, negative, and neutral outcomes set, SmartTEST testing can begin. A running experiment displays live reports, allowing the knowledge base administrator to see the results as they come in.

SmartTEST Report

The dashboard – illustrated on the right – provides a general overview of the trial and displays the experiment’s Completion Status, Results, and Estimated ROI.

The full SmartTEST Report – which is accessed from the dashboard – presents an in-depth analysis of the experiment, providing information about the Average Time Spent on Page, Total and Unique Page Views, Percentage of the Page Scrolled, Estimated ROI, and user link interactions, i.e. Positive, Negative, and Neutral Outcomes.
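To show what aggregates like these involve, here is a small sketch that computes the page-level metrics from hypothetical per-visit records. The record structure and field names are assumptions for illustration, not SmartTEST’s data model:

```python
from dataclasses import dataclass

# Hypothetical per-visit record used only to illustrate the report's aggregates.
@dataclass
class Visit:
    visitor_id: str
    seconds_on_page: float
    percent_scrolled: float  # 0-100

def report_metrics(visits):
    """Aggregate time on page, total/unique views, and scroll depth for one variant."""
    total_views = len(visits)
    unique_views = len({v.visitor_id for v in visits})
    avg_time = sum(v.seconds_on_page for v in visits) / total_views
    avg_scroll = sum(v.percent_scrolled for v in visits) / total_views
    return {
        "total_page_views": total_views,
        "unique_page_views": unique_views,
        "avg_time_on_page_seconds": round(avg_time, 1),
        "avg_percent_scrolled": round(avg_scroll, 1),
    }
```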

You can view the full Use Case 2 Report here.

Results of the experiment

Looking at the SmartTEST Report, the knowledge base administrator can see that the experimental article produced the positive improvements the company expected to see:

Improved Organization and Navigation: The most important data in the experiment is user interaction with internal article links since it demonstrates that visitors are successfully navigating to their answers. The SmartTEST report shows a significant number of user actions with the links that were added in the experimental article.

The improved layout allowed visitors to easily spot the navigation links and use them to get to the answers in the article. SmartTEST helped the knowledge base administrator quantitatively measure the navigation improvements and substantiate the work.

Dramatic Improvements in ROI: The SmartTEST report shows that the experimental article has generated an estimated monthly ROI of $1,950. The negative outcomes – the Case Submission and Contact Us links, which are used to measure how much the company spends on support ticket resolution and support phone calls – both show a lower number of user actions. SmartTEST helped the administrator calculate how much money was saved and where the savings came from!
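The report itself does not break the $1,950 down, but the arithmetic behind such an estimate is simply the drop in each negative outcome multiplied by its assigned ROI value. The counts and per-contact costs in the sketch below are invented, chosen only so the example lands on the reported figure; they are not the customer’s actual numbers:

```python
# Hypothetical deflection-savings calculation. Counts and costs are made up
# for illustration; only the formula reflects how negative-outcome ROI values
# translate into estimated monthly savings.
cost_per_case = 15.00   # assigned ROI value for a submitted support case (placeholder)
cost_per_call = 9.00    # assigned ROI value for a Contact Us escalation (placeholder)

cases_control, cases_experimental = 180, 110   # support cases per month (hypothetical)
calls_control, calls_experimental = 250, 150   # Contact Us clicks per month (hypothetical)

monthly_savings = (
    (cases_control - cases_experimental) * cost_per_case
    + (calls_control - calls_experimental) * cost_per_call
)
print(f"Estimated monthly ROI: ${monthly_savings:,.2f}")  # 70 * $15 + 100 * $9 = $1,950.00
```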

Unexpected Learning: SmartTEST reveals an important finding: ‘Making a Payment’ links, which appear in both the original and experimental articles, went from having almost no views to over 200. This observation illustrates that article organization was improved and users were able to find and use a link that was poorly visible in the original article.

Most importantly, this finding shows that many of the visitors reading the article are trying to pay their bill – something the knowledge base administrator did not suspect. Knowing this, the company will put an easily accessible link to the ‘Making a Payment’ page at the top of the article, thereby further improving knowledge base navigation and performance. In this case, SmartTEST revealed an important piece of information that will help the company further maximize ROI and optimize its articles!

Additional Improvements: SmartTEST also shows that article ratings improved. This indicates that by improving the article’s organization and navigation, the knowledge base administrator was able to boost article performance and customer satisfaction.

Conclusion

SmartTEST is a powerful tool for improving article organization and user navigation. In this use case, we demonstrated how an energy company was able to substantiate improvements in the organization of one of its articles, improve article navigation, and also learn something new and important about the visitors to the page.

SmartTEST is a must-have tool for knowledge base administrators. Without SmartTEST, evaluating the work that knowledge base experts put into improving this knowledge base article would pose a daunting challenge. Knowledge base administrators can look at a few available metrics, such as survey results or the number of submitted support requests, to gauge whether the changes produce a positive or a negative net effect. However, these numbers offer little help in determining precise ROI cost savings and, more importantly, leave administrators without any quantitative foundation on which to substantiate further investments in the knowledge base or discover customer usage behavior.

In this use case, knowing whether article navigation improved would be impossible without SmartTEST, because the knowledge base administrator needs to know whether visitors are using links within the article to get to their answers. SmartTEST provides a highly useful set of tools for any knowledge base administrator who is trying to improve their knowledge base.

SmartTEST revealed an important piece of information that otherwise would have remained undiscovered! Seeing a large number of visitors go to the ‘Making a Payment’ page gave the knowledge base administrator valuable insight into how the article can be further improved. SmartTEST gives administrators valuable knowledge about user behavior, priceless information that saves the company money and delivers premium customer service.

SmartTEST gives your company real tools for improving navigation and provides an empirical foundation for any work that you invest in improving the support channel. Rest assured that your visitors are finding the answers they are looking for, and learn how to further optimize your knowledge base.

About Safeharbor