
Maximizing and Measuring User Adoption

Aug 30, 2017   |  By Mary Little | Enterprise Knowledge LLC

Similar to the old adage, “you can lead a horse to water, but you can’t make him drink,” you can deliver a solution that uses the most cutting-edge technology and beautiful design, but you can’t guarantee that your stakeholders will embrace it. This blog offers practical tips on how to maximize and measure user adoption to ensure that your new tool or process is fully embraced by those for whom you’ve designed it.

To deliver a project success story backed by quantitative and qualitative data, you should take an objective-first approach to change management. This requires a shift in focus from what the change is (e.g. the implementation of a new tool or process) to what you aim to achieve as a result of the change (e.g. increased productivity or improved work satisfaction). Rather than only highlighting the features of the new technology, you'll want to focus on the benefits users will gain from using it. Taking this approach is particularly critical for Knowledge Management initiatives, which are often initially met with skepticism and a broad concern that there's not enough time in an already busy day to acclimate to another new tool or process. By following these guidelines, you'll be able to say "our users love the new tool and they are so much more effective and efficient as a result of it…" and "here's the data to prove it."

The way to accomplish this is by setting "SMART" objectives (specific, measurable, achievable, relevant, and time-bound) at the start of your project and developing an analytics strategy that will help you measure your progress towards achieving those objectives. These objectives should clearly express desired changes in user behavior and the impact these new behaviors are expected to have on overall productivity and effectiveness. In the words of Stephen Covey, "start with the end in mind" so that all your efforts are aligned towards achieving your expected results.
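To make this concrete, here's a minimal sketch in Python of what a SMART objective might look like once it's pinned down to a metric, a target, and a deadline. The field names and values are purely illustrative assumptions, not a standard schema or anything from this project:

```python
from dataclasses import dataclass
from datetime import date

# A minimal sketch of a SMART objective pinned down to a metric, a target,
# and a deadline. Field names and values are illustrative assumptions,
# not a standard schema.
@dataclass
class SmartObjective:
    behavior: str   # the user behavior the change is meant to encourage
    metric: str     # what you will measure
    target: float   # the measurable target (0.90 = 90%)
    deadline: date  # the time-bound component

    def is_met(self, observed: float) -> bool:
        """True if the observed metric value meets or beats the target."""
        return observed >= self.target

objective = SmartObjective(
    behavior="Manage all campaign content, including digital assets, in the new tool",
    metric="share of campaign content stored in the new tool",
    target=0.90,
    deadline=date(2017, 12, 1),  # placeholder: ~90 days after a hypothetical launch
)
print(objective.is_met(0.93))  # True once 93% of content lives in the tool
```

Writing the objective down this way forces you to commit to a metric and a target before the work begins.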

Let me put this into context using one of my current projects. I'm working with a global manufacturing organization to design and implement a tool that will help the communications department work in a more centralized and collaborative way. The team is responsible for delivering content about events, programs, and news items to internal employees as well as external stakeholders. The team is used to working in silos, and each team member uses different tools for storing, sharing, and finding information, such as a basic team site, email, and desktop file folders.

From the very beginning of the project, change management has been a priority. We knew that if we wanted the communications department to adopt the new tool, we had to start encouraging adoption well before they ever had contact with it. Here's how to apply what my team has done to your own change effort to maximize and measure user adoption:

Step 1: Align your metrics with desired outcomes

To encourage a more centralized and collaborative way of working for the communications department, we're using Microsoft O365 tools such as MS Teams, MS Planner, and modern SharePoint team sites as a platform for the new system. We chose this suite of tools because it offers various features that, if used, could save the department a lot of time, reduce wasted effort, and ultimately elevate their role to a more strategic partner within the organization.

Here’s how we’ve expressed our primary objective:

“Increase the team’s efficiency by managing all campaign content, including digital assets, in the new tool within 90 days of launch.”

When content is stored in various places, not everyone has access to the latest versions, which causes a lot of confusion and re-work. The challenge is that people default to the processes they're most used to, which is often saving information on their local drives and sharing it via email. The new behavior we wanted to encourage was saving information in a centralized location (in this case a SharePoint team site), so that everyone has access to the latest version, edits are made to the same copy, and there's a tracking history of the edits, as well as who made them.

The objectives you identify will vary depending on the challenges you’re trying to solve, so your success metrics should be aligned accordingly. In this case, defining our objective leads us to what we should measure: the percentage of campaign content that is stored and shared in the tool vs. outside of it.
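That metric boils down to a simple ratio. Here's a hedged Python sketch, assuming you can count campaign documents stored in the tool (for example, from a SharePoint site inventory) and documents still living in email or on local drives (for example, self-reported); the numbers are made up:

```python
# A minimal sketch of the metric above, assuming you can count campaign
# documents stored in the tool (e.g., from a SharePoint site inventory) and
# documents still living in email or on local drives (e.g., self-reported).
def adoption_share(docs_in_tool: int, docs_outside: int) -> float:
    """Percentage of campaign content stored and shared in the tool."""
    total = docs_in_tool + docs_outside
    return 0.0 if total == 0 else 100.0 * docs_in_tool / total

# Illustrative numbers, not project data:
print(f"{adoption_share(docs_in_tool=182, docs_outside=38):.1f}% in the tool")  # 82.7%
```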

Step 2: Capture baseline metrics and keep it simple

To tell a credible story about the impact of a new tool, you need baseline metrics to compare your results against. For this project, we had three categories of metrics and a different approach for capturing each:

  • Satisfaction Level: We deployed a survey that measured how useful users found their current system.
  • Proficiency Level: We deployed another survey that measured their self-rated proficiency levels with basic SharePoint functionality such as uploading and sharing documents.
  • Usage Level: We tracked activity on the system after launch. This includes number of active users, number of documents and multimedia files saved and shared via the tool, and number of interactions in the conversations space.

The key here is to keep it simple. We designed the surveys to be short and to the point, asking only specific questions that would inform the decisions we made on the project. We also didn't measure everything: we kept it basic to start, and the longer the users had to engage with the system, the more sophisticated our metrics became.
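For illustration, a baseline snapshot covering all three categories can be as simple as a handful of averages. This sketch assumes 1-to-5 survey ratings, and every response below is invented:

```python
from statistics import mean

# An illustrative baseline snapshot covering the three metric categories,
# assuming 1-5 survey ratings. All responses here are invented.
satisfaction_responses = [2, 3, 2, 4, 3, 2]  # "How useful is your current system?"
proficiency_responses = [1, 2, 2, 3, 2, 1]   # self-rated basic SharePoint skills

baseline = {
    "satisfaction_avg": round(mean(satisfaction_responses), 1),
    "proficiency_avg": round(mean(proficiency_responses), 1),
    "active_users": 0,  # usage starts at zero: the tool is brand new
}
print(baseline)  # {'satisfaction_avg': 2.7, 'proficiency_avg': 1.8, 'active_users': 0}
```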

Step 3: Take actions that lead to measurable improvements

Our satisfaction survey, along with in-depth user analysis and testing, informed the features we included in our new tool. As we were prioritizing the features, we kept our objectives in mind. It was critical for us to ensure our tool had a separate space for managing content for each campaign. This space had to make it easy for the team to upload, edit, share, and find content, including text-based and multimedia assets.

Our proficiency survey helped us design the training for the new tool. Had we assumed that our users were already familiar with SharePoint's basic functionality, we would have gone into our training sessions ready to introduce all of its advanced features. Knowing that the team members were not as confident in their SharePoint abilities led us to design a basic SharePoint prerequisite training session for those who needed it. Meeting users at their proficiency level and guiding them towards the level they need to be at to make the most of the new tool's features prevents them from becoming so discouraged that they abandon the new tool prematurely. (Get more helpful tips on user training by watching Rebecca's video, Top 5 Tips for Using Training to Promote Adoption.)

This is important because we planned to deploy the satisfaction and proficiency surveys again once we launched the new tool. Taking actions based on the results of the baseline surveys created measurable improvements in how much users liked the new tool and how confident they felt using it.
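One way to show those measurable improvements is a simple before-and-after comparison of the survey averages. This sketch assumes the same 1-to-5 rating scale in both survey rounds; the values are illustrative placeholders, not project data:

```python
# A simple before-and-after comparison, assuming the same 1-5 rating scale
# in both survey rounds. The values are illustrative placeholders.
baseline = {"satisfaction_avg": 2.7, "proficiency_avg": 1.8}
followup = {"satisfaction_avg": 4.1, "proficiency_avg": 3.6}

for metric, before in baseline.items():
    after = followup[metric]
    print(f"{metric}: {before:.1f} -> {after:.1f} ({after - before:+.1f})")
```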

Step 4: Communicate early and often, then train your users

This may seem like common sense, but let your users know that the tool is now available and train them how to use it! Often, the team members heavily involved in the project assume that users know the tool exists and will intuitively learn how to use it on their own. The team building the tool has spent the past few months immersed in it, so they are likely to overestimate other people's awareness of the tool and underestimate the learning curve associated with it.

In our case, our baseline usage level was 0 team members because the tool was brand new, and our goal was to bring usage up to all 30 team members. Our strategy for getting all 30 to use the tool, rather than sliding back into their old habits and systems, was to deploy "early and often" messages about the tool, along with thorough training for each team member we expected to use it. Long before the tool was launched, we built excitement and awareness around it via a teaser video, Yammer posts, emails, and messages from leadership during team meetings. Once the tool was launched, we conducted live training sessions and delivered helpful resources and guides.

Along the way, we were asking:

  • What percentage of the team watched the teaser video?
  • How many team members saw the Yammer posts? How many "liked," replied to, or shared them?
  • How many of the team members heard and saw the presentation?
  • Did the team members react positively or negatively to the messages in the video, posts, and presentations?
  • How many of the team members completed the optional pre-work and basic training?
  • How many of the team members attended the live training sessions?

All of these metrics were indicators of the degree to which the users would adopt the new tool. You can then validate these indicators by measuring actual adoption, e.g. user activity within the tool and their satisfaction in using it.
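Taken together, the questions above form a simple adoption funnel. Here's an illustrative Python sketch, assuming the 30-person team from this project and invented counts for each touchpoint:

```python
# A sketch of the questions above as a simple adoption funnel, assuming the
# 30-person team from this project and invented counts for each touchpoint.
TEAM_SIZE = 30
touchpoints = {
    "watched the teaser video": 24,
    "saw the Yammer posts": 27,
    "completed optional pre-work": 18,
    "attended live training": 26,
    "active in the tool (week 1)": 21,
}
for step, count in touchpoints.items():
    print(f"{step}: {count}/{TEAM_SIZE} ({100 * count / TEAM_SIZE:.0f}%)")
```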

Step 5: Give it some time, then measure again

As we were building the tool, the project team discussed how we were going to tell our success story. But that really depended on how we defined success. Did success mean that we launched the new tool on schedule and under budget? Or did it mean that the communications team members were embracing the new tool and way of working? The latter was much more important to us, so we developed a timeline for capturing feedback: one week, one month, three months, and six months after launch. At each of these checkpoints, we would capture metrics on how satisfied users were with the new tool and its impact on their work, and how proficient they felt with their new skill sets. In addition to this self-reported data, we would also track usage metrics such as the percentage of the team that actively manages their campaigns within the tool vs. outside of it.
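The cadence itself is easy to pin down in advance. This sketch turns it into concrete dates; the launch date is a placeholder, and treating a "month" as 30 days is a simplifying assumption:

```python
from datetime import date, timedelta

# The Step 5 cadence as dates. The launch date is a placeholder, and
# treating a "month" as 30 days is a simplifying assumption.
launch = date(2017, 9, 1)
checkpoints = {
    "1 week": launch + timedelta(weeks=1),
    "1 month": launch + timedelta(days=30),
    "3 months": launch + timedelta(days=90),
    "6 months": launch + timedelta(days=180),
}
for label, when in checkpoints.items():
    print(f"{label} after launch ({when}): re-run surveys, pull usage metrics")
```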

Summary

Organizations invest large amounts of money in new technology with the intention of improving employee productivity. The key to getting a significant return on these investments is to make sure your project team has what it takes to define, drive, and measure success. If you want to make sure the next solution you roll out maximizes user adoption and produces measurable results, contact Enterprise Knowledge at info@enterprise-knowledge.com.


About the Author:  Mary Little is an accomplished project manager, business analyst, and management consultant. She ardently believes in the value of strong leadership, effective communication, and maximized talent potential as means to organizational change and success.
