Analyze user behavior and learn how visitors are using your support site.
Knowledge base optimization is an ongoing process. The most difficult part of improving the support channel is understanding what your visitors are looking for and how their customer support experience can be improved. With SmartTEST, knowledge base administrators can learn more about user behavior and use this knowledge to optimize their knowledge base.
In this use case we demonstrate how SmartTEST can be used to track and analyze user behavior to optimize a knowledge base.
A company wants to analyze a high-traffic knowledge base article to better understand customer behavior. By learning how visitors actually use the page, the company hopes to find ways to optimize the article and improve its performance.
The article – ‘Ways to pay your bill’ – provides information on the various ways a customer can pay their bill. The knowledge base administrator uses SmartTEST to gain a better understanding of how visitors interact with this page, which links they click, and what can be done to improve the page.
SmartTEST Experiment is set up.
Quantitative data is collected and interpreted.
SmartTEST reveals new findings about user behavior and helps knowledge base administrators optimize the article.
Setting up the experiment takes just a few minutes! A new trial is created in SmartTEST and original and reworked articles are selected for the experiment.
The administrator can easily customize how the trial will be conducted. The experiment can run for a predetermined amount of time or until statistically significant improvements are seen – whichever comes first. After the experiment ends, the administrator can choose to keep the winner, stick with the control article, or continue sampling. The sampling rate can also be set if the experiment needs further customization. SmartTEST is fully automated; just set up the parameters and let SmartTEST do its work!
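SmartTEST's actual configuration screen is not shown here, but the stopping rule described above – a fixed run time or a significance threshold, whichever comes first – can be sketched in plain Python. All names and values below are hypothetical, for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ExperimentConfig:
    max_days: int          # predetermined run time for the trial
    min_confidence: float  # stop early once this significance level is reached
    sampling_rate: float   # fraction of visitors routed to the reworked article

def should_stop(config: ExperimentConfig, days_elapsed: int, confidence: float) -> bool:
    """The experiment ends when either condition is met, whichever comes first."""
    return days_elapsed >= config.max_days or confidence >= config.min_confidence

config = ExperimentConfig(max_days=30, min_confidence=0.95, sampling_rate=0.5)
assert should_stop(config, days_elapsed=12, confidence=0.97)      # significance reached early
assert not should_stop(config, days_elapsed=12, confidence=0.80)  # keep sampling
```

Once a rule like this fires, the administrator's chosen end-of-experiment action (keep the winner, keep the control, or continue sampling) takes over.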
To determine whether a knowledge base article is successfully deferring support queries, SmartTEST tracks and analyzes on-page actions. To help SmartTEST determine which actions improve ROI, the administrator selects outcomes that have a positive, negative, or neutral effect. These outcomes form the basis for hard ROI feedback as well as qualitative feedback about user behavior and article effectiveness.
For example: Knowledge base administrators consider escalation to an alternate support channel (phone / email / case submission) a negative outcome because the company spends money on every direct customer support request.
Meanwhile, clicking an internal link or exiting the knowledge base page constitutes a positive outcome because it indicates that visitors are reading the article and following breadcrumb links or exiting the page as anticipated.
Examples of Negative Outcomes: Negative outcomes represent undesirable user actions and form the basis for quantitative ROI evaluation. Since the goal of SmartTEST is to demonstrate whether an experimental article is able to deflect calls, a decrease in negative outcomes indicates that the company is spending less money on direct support inquiries. Examples include: clicking the ‘Contact Us’, ‘Email Us’, or ‘Call Us’ links, submitting a Support Case, leaving a negative article rating, etc.
Examples of Positive Outcomes: Positive outcomes represent desirable user actions. Examples include: exiting the knowledge base page, leaving a positive article rating, clicking a specific link that an administrator is trying to direct users to, etc.
Examples of Neutral Outcomes: Neutral outcomes are on-page links that don’t impact article ROI and are tracked for observational purposes. For example, the administrator might want to know which links on the page are most popular or how many users go to a specific page after reading the article.
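The three outcome categories above amount to a simple mapping from on-page actions to positive, negative, or neutral labels. As a rough sketch – the action names below are invented to mirror the examples in the text, not SmartTEST's actual identifiers:

```python
# Hypothetical mapping of tracked on-page actions to outcome categories.
OUTCOMES = {
    "contact_us_click":  "negative",
    "email_us_click":    "negative",
    "call_us_click":     "negative",
    "case_submission":   "negative",
    "negative_rating":   "negative",
    "page_exit":         "positive",
    "positive_rating":   "positive",
    "target_link_click": "positive",
    "payment_link_click": "neutral",  # observed but doesn't affect ROI
}

def tally(events):
    """Count a visitor's events by outcome category for later analysis."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for event in events:
        counts[OUTCOMES[event]] += 1
    return counts

session = ["payment_link_click", "page_exit"]
assert tally(session) == {"positive": 1, "negative": 0, "neutral": 1}
```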
To analyze how users respond to the improvements adopted by the knowledge base experts, the SmartTEST administrator selects positive, negative and neutral outcomes. These are simply dragged and dropped from the article interface. To save you time, the most common outcomes can be preselected and will automatically populate the outcome fields.
In this use case, some positive, negative and neutral outcomes have been set as defaults. For example, negative outcomes, which form the basis for hard ROI valuation, have already been assigned ROI values for call or interaction deflections.
Since in this experiment the administrator only wants to track user actions, only the Neutral Outcomes need to be completed. Once the desired links have been selected, SmartTEST can begin collecting and providing qualitative feedback about user behavior.
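The hard-ROI side of this setup is straightforward: each negative outcome carries a cost, and fewer negative outcomes on the experimental article mean money saved. The per-outcome costs and counts below are invented for illustration; real values would come from the company's own support economics:

```python
# Hypothetical cost of each negative outcome; an avoided one is a deflection saving.
COST_PER_OUTCOME = {
    "phone_call": 8.00,      # assumed cost of handling a support call
    "case_submission": 5.00, # assumed cost of handling a submitted case
}

def deflection_savings(control_counts, variant_counts):
    """Estimate savings from negative outcomes the variant article deflected."""
    savings = 0.0
    for outcome, cost in COST_PER_OUTCOME.items():
        deflected = control_counts.get(outcome, 0) - variant_counts.get(outcome, 0)
        savings += deflected * cost
    return savings

control = {"phone_call": 120, "case_submission": 80}
variant = {"phone_call": 90, "case_submission": 70}
assert deflection_savings(control, variant) == 290.0  # 30 calls + 10 cases deflected
```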
SmartTEST tracks a wide range of variables and can help knowledge base administrators learn a great deal about their website visitors:
Neutral Outcomes: Knowledge base administrators choose which links they want SmartTEST to observe. Once the test is completed, the data can be analyzed to find which payment options users are clicking on the most. From this, administrators can learn a number of things: which payment option is most confusing to customers and leads to the most case submissions and ‘Contact Us’ outcomes, which one is most popular, etc.
Article Ratings: Knowledge base administrators use article rating feedback to determine customer satisfaction rates. If visitors are happy with the answers on this page, the administrator may decide to leave the article as is.
Page Views / Average time on page / Average % of page scrolled: The number of page views, the time spent on the page, and the percentage of the page scrolled are all useful metrics that help determine whether investments in the knowledge base are worthwhile.
Continuous Experimentation: The administrator can also use the article to make incremental changes. For example, in trying to improve the article title or the introductory paragraph, the administrator might find a change that leads to a significantly higher customer satisfaction rating or a lower case submission rate.
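Exploring the kinds of report data listed above comes down to simple aggregation over the event log. The click log and visit records below are invented purely to illustrate the analysis, not actual SmartTEST output:

```python
from collections import Counter

# Hypothetical click log for the neutral-outcome links tracked on the page.
clicks = ["pay_by_mail"] * 140 + ["paperless_ebill"] * 60 + ["pay_by_phone"] * 35

link_counts = Counter(clicks)
most_popular, n_clicks = link_counts.most_common(1)[0]
assert most_popular == "pay_by_mail"  # the option visitors click most often

# Hypothetical per-visit records: (seconds on page, fraction of page scrolled).
visits = [(45, 0.30), (120, 0.85), (75, 0.60), (30, 0.25)]

page_views = len(visits)
avg_time_on_page = sum(t for t, _ in visits) / page_views
avg_pct_scrolled = sum(s for _, s in visits) / page_views

assert page_views == 4
assert avg_time_on_page == 67.5
assert round(avg_pct_scrolled, 2) == 0.5
```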
Looking at the SmartTEST Report, the company discovered some important findings. This information helped the knowledge base administrator determine how to improve the article and provided a quantitative basis for these improvements.
Using SmartTEST findings to substantiate knowledge base work: The company was surprised to find that most users clicked on the ‘Pay by Mail’ payment method. The company would prefer that visitors used the ‘Paperless e-Bill’ payment option instead because it’s quicker and cheaper.
Discovering this user behavior is very important. The finding gives the knowledge base administrator a basis for optimizing the article. To direct users to the preferred payment option, the administrator might try reorganizing the page:
For example: The knowledge base administrator suspects that the reason so many users select the ‘Pay by Mail’ method is how the knowledge base page is organized. By moving the information about the ‘Paperless e-Bill’ payment method to the top of the page – where it is easy to find – and moving the ‘Pay by Mail’ option to the bottom, the knowledge base administrator can direct users to the desired payment option.
Please see Use Case 2: Directing Users to see how such an experiment can be set up.
SmartTEST reveals important data about user behavior: The experiment collected a large amount of information that demonstrates the strengths and weaknesses of the knowledge base article. The knowledge base administrator can analyze the Positive, Negative and Neutral Outcomes; look at Article Ratings and User Feedback; view Page Views, Average Time Spent on Page, Average % of the Page Scrolled, and more to determine what is missing and how the article can be improved.
SmartTEST is a powerful tool for tracking and analyzing user behavior. In this use case, we illustrated how SmartTEST was able to help a knowledge base administrator discover a way to optimize a knowledge base article. Using observational data, the administrator can systematically monitor and optimize the company’s knowledge base.
SmartTEST gives knowledge base administrators valuable information for justifying investment in the knowledge base. Without the help of SmartTEST, knowledge base administrators are left to make educated guesses about how to improve their articles. Why make knowledge base optimization more expensive and time-consuming than it should be?