
Five Things that Content Management and an Orchestra Performance Have in Common

December 1, 2020

Imagine that you are in a theater listening to an orchestra. Do you notice that all the musicians refer to the same sheet music to ensure that they play in sync? Just as in an orchestra performance, organizations must align various components to achieve a harmonious content management performance. This blog describes the elements the two have in common.


The Magical Art of Tidying Up Content

February 12, 2019

If you have Netflix, you've probably seen Tidying Up pop up in your feed. It's a new series based on Marie Kondo's book The Life-Changing Magic of Tidying Up, which presents her methodology for clearing your clutter, getting organized, and creating space for only the things that will continue to serve you and "spark joy" in your home. The KonMari method for tidying up has transformed the lives of thousands of people around the world, and when applied to your content strategy, it can dramatically improve the quality and findability of your organization's knowledge and information.

Here's what you need to know about the KonMari method and how to apply it to your content strategy efforts:

1. It’s not magic.

In one of the episodes, Marie's client expressed how excited she was to witness the magic of tidying up. Marie quickly clarified that there is no magic: although she would provide guidance in the right direction, there was a lot of work ahead. In fact, the more stuff her clients had in their home, and the deeper their emotional ties to that stuff, the more work it would be. To complicate matters further, the relationships between the people living in the home added another layer of difficulty, because they would have to work together to make decisions about the future state of their space.

Similarly, I've worked with clients to truly understand the challenges related to their content, and together we develop strategies for improving their ability to create, manage, and share the resources necessary for their jobs. When it comes to content, it's important to be intentional and make sure that what you have achieves the objectives and meets the quality standards you've set. Each content repository should have a purpose, e.g. the intranet stores information and resources for internal employees, or the e-commerce platform contains copy that helps to sell products. Then, each content item that lives in the content repository should align with that objective.

Marie asks her clients to hold every single item they own and decide whether to keep it, discard it, or donate it. When conducting a content clean-up effort, content owners likewise have to evaluate each piece of content and determine whether to keep it, update it, or archive/delete it. The more content you have, the longer this process takes, but the more you practice, the easier and faster it becomes.

2. The amount of stuff you have will overwhelm you.

One of the very first exercises that Marie asks her clients to do is to pile up all their clothes in one area. All of it. After an initial look of disbelief, they begin to go through all of their closets, drawers, and other hidden places where they've stuffed clothing. The mountain of clothes that results from this exercise almost always shocks the owners: some of the items haven't been worn in decades, and others still have their tags. This is an important step in the KonMari method because coming face-to-face with everything you own but don't use or want validates the need to undergo this effort.

Organizations are able to create content at a rapid pace and before they know it, they have terabytes of information that make it impossible to find what they need. In the same way that parents will begin to store their clothes in their children’s closets because they’ve run out of space in their own, people begin storing knowledge and information anywhere and everywhere they can without really thinking through the consequences. By conducting a content inventory, you are then able to see just how much content you have and where it all lives. It will overwhelm you, but when you prioritize your efforts and address each set of content one folder, space, or department at a time, it becomes more manageable and less daunting.

3. You should use little boxes to organize things and put like things together.

After a few days of clean-up, Marie will come back to her clients' homes with a gift: a bunch of little boxes of different shapes. These boxes do wonders for the drawers and cabinets in their homes. Imagine the utensil and tool drawer in your kitchen. Is it a jumble of random items that you have to sift through to get to the wine opener? Marie suggests compartmentalizing your drawers with little boxes and grouping similar things by size or function. Through this exercise, you realize just how many batteries and light bulbs you have, because your decentralized storage habits made it difficult to find them when you needed them, causing you to just buy more.

This process is similar to the Content Types I've designed for my clients. It's often a hard concept to wrap your head around, but simply put, content types are "little boxes" for content items: because items now belong to standard templates, it's easy to find the information you're looking for. Imagine opening a procedure and having to read the entire document to figure out whether it applies to your circumstance. If all procedures across your organization followed a similar format (e.g. Purpose of the Procedure, Applicability, Steps, and Related Processes), you would save time because you would know immediately where to look and what to expect when you need information. Content Types also help you group like content together so that in the future you can find it rather than having to recreate it.
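To make the "little boxes" idea concrete, here is a minimal sketch of a Procedure content type as a data structure. The field names mirror the sections named above and are illustrative, not a real CMS schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Procedure:
    """A hypothetical 'Procedure' content type: every procedure carries
    the same standard sections, so readers know where to look."""
    title: str
    purpose: str
    applicability: str
    steps: List[str] = field(default_factory=list)
    related_processes: List[str] = field(default_factory=list)

    def summary(self) -> str:
        # One-line overview a listing page or search result might display.
        return f"{self.title} ({self.applicability}): {self.purpose}"
```

Because every procedure carries the same fields, a reader (or a search index) can go straight to Applicability or Steps without scanning the whole document.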

4. The order in which you tackle your project matters.

Marie advises her clients to tidy up five categories of stuff, in order: Clothes, Books, Documents (Paper), Komono (miscellaneous items), and Mementos (sentimental items). You wear clothes each day, so they tend to be the easiest to sort through first. Starting there gives you an opportunity to practice and strengthen your decision-making skills, so that as you progress to the other categories, it becomes easier to decide what to keep vs. get rid of. Sentimental items are much tougher to let go of, so those are saved for last; by the time you get to them, you have a stronger sense of what's truly important to you.

As you implement your content strategy, start with quick wins so that you gain enough momentum to tackle some of the more complex aspects of your project. You can start with content repositories that have the least amount of content or day-to-day content that makes it easy to identify what’s outdated and no longer applicable, like a wiki. Then, you can take on the bulk of your content such as project files, process documentation, and reference materials. Lastly, you can review legal or auditable documentation which would have severe implications if you get rid of them prematurely. The order in which you tackle your project will vary depending on your organization.

After months of applying the KonMari method, with guidance and support along the way, Marie's clients find themselves surrounded by only the things that are essential to their well-being. It brings them peace of mind and creates a sense of optimism for the future because they now live in a home that doesn't frustrate them on a daily basis. At EK, our content strategists bring the same sense of order to our clients' content repositories so that they can focus on the more important things in their businesses. Need help tidying up? Contact us at info@enterprise-knowledge.com.

Making KM Clickable With Search

January 8, 2019

I've been in the business of Knowledge Management Consulting for the vast majority of my career and, in my experience, one of the most challenging aspects of KM is its intangibility. I've helped an array of organizations define their KM Success Metrics and KM Key Performance Indicators (KPIs) in order to make KM measurable and tie it to business value and hard return on investment. Even so, many of these KM KPIs are only measurable over years and often demonstrate value to the organization more than to the individual.

Since good KM is integrated into the business, enterprise KM programs are often largely invisible when they work, and only visible when they're causing the end user "pain." For example, a seamless tacit knowledge capture program feels like natural conversation, whereas a badly designed program will feel forced and overly time-consuming. A natural content governance plan will be integrated into the enterprise and simply feel like how business is done, whereas a poorly designed governance plan will slow down work and create barriers to sharing and connecting.

As a result, KM runs the risk of not being “felt” by the average end user in a way that inspires engagement and support. Though a KM effort may be meeting long-term organizational goals, it nonetheless runs the risk of a decreased focus or dwindling support over time if the individual business stakeholder doesn’t feel the benefit of it.

One key area where the individual, as well as the business, can experience meaningful value from KM on a daily basis is through enterprise search. Though I’m not suggesting a technology is necessary for all aspects of KM, the reality is that for large organizations, a great deal of KM will be enabled through supporting technologies. A well-designed, implemented, and governed enterprise search is one of the key systems where KM becomes real for the average end user.

Several exciting things are happening within the enterprise search world at this point:

  • Enterprise search tools are increasingly able to index both structured and unstructured information, creating greater linkages between different types of knowledge and information.
  • It is becoming easier to design more creative user interfaces within search that better reflect the needs of the end user and the actions they want to take.
  • Once-advanced features, such as type-ahead and faceting, are now readily available.

In order to really make enterprise search work, foundational KM activities are still critical. For instance:

  • Content Audits and Cleanup – Content has to be cleaned up and enhanced with tags to ensure the right content appears in search and is weighted appropriately. Content cleanup alone is time-consuming and dry, but linking it to a search effort shines a critical light on why it is important. Without a content cleanup, search will end up being “garbage in, garbage out” no matter how slick it is.
  • Taxonomy Design and Tagging – Taxonomies have to be designed and applied to key content repositories as well as integrated into the search design to ensure faceting works and different types of content from different sources can be seamlessly integrated. Taxonomy by itself can be esoteric and easily set aside, but when its value surfaces as faceted navigation, it becomes a critical tool for findability and discoverability.
  • Content Types – Content Types continue to be one of the more misunderstood elements of a KM architecture, despite our efforts to make them more approachable. Content Types can serve as templates, guide workflows and security, and inform tagging. When designed correctly, they can also translate into search hit types. That said, they tend to be relatively confusing until seen in action.
  • Tacit Knowledge Capture – Almost every organization we've worked with agrees Tacit Knowledge Capture is critical to ensuring expertise isn't lost as employees leave and that new employees are upskilled faster and more effectively. Good Tacit Knowledge Capture can take a broad array of forms, from traditional mentor/mentee pairings, to email capture tools, to communities of practice (both live and virtual). Though many of these mechanisms can be highly visible, their full value isn't felt in their existence alone. Tacit Knowledge Capture really only pays off when individuals can find and engage with the captured knowledge. Search can play a key role here, and it can also integrate a range of result types so that the end user finds the "official" published answer as well as related "social" answers from experts (and, potentially, the experts themselves).
  • Knowledge Sharing Culture – Developing a strong culture of knowledge sharing is one of the foundational activities we seek to implement in the early stages of any KM engagement. Specific activities vary greatly among organizations and depend on where they're starting from. Approaches may range from a simple commitment from leadership, to the establishment of a KM Leadership group, to more advanced gamification and analytics efforts. At the end of the day, however, nothing shines a light on good knowledge sharing behavior like a tool that surfaces newly shared knowledge in a form that is easy to find and discover.
  • Governance – Governance, specifically content governance, is another building block and truly foundational activity for enterprise knowledge management efforts. Like a culture of knowledge sharing, nothing shows the importance of governance as much as a search initiative that demonstrates, in very real terms, what happens when people DON'T follow it. Content governance gets a huge boost in perceived importance as soon as content becomes easier to find and expose.
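To illustrate how taxonomy tags translate into faceted search, here is a minimal sketch. The document shape (a dict with a "tags" field keyed by facet) is an assumption for illustration, not any particular search engine's API:

```python
from collections import Counter

def facet_counts(documents, facet_field):
    """Tally how many documents carry each taxonomy term for one facet,
    i.e. the counts a faceted-search sidebar shows next to each filter."""
    counts = Counter()
    for doc in documents:
        for term in doc.get("tags", {}).get(facet_field, []):
            counts[term] += 1
    return counts

def filter_by_facet(documents, facet_field, term):
    """Narrow the result set to documents tagged with the selected term."""
    return [d for d in documents
            if term in d.get("tags", {}).get(facet_field, [])]
```

The point is that an otherwise abstract taxonomy becomes a tangible navigation tool the moment its terms appear as clickable filters with counts beside them.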

Each of these pieces alone is an important part of a comprehensive KM strategy. Together, they make up many of the core KM foundations I seek to put on KM Roadmaps for my clients. Integrating a search pilot into that roadmap ensures the hard work that will go into the aforementioned efforts, as well as the overall KM transformation, will be seen, and made clickable, for your end users.

Information Architecture and Big Data Analytics

December 13, 2017

Information Architecture is an enabler for Big Data Analytics. You may be asking why I would say this, or how IA enables Big Data Analytics. Remember that Big Data includes all data: unstructured, semi-structured, and structured. The primary characteristics of Big Data (Volume, Velocity, and Variety) challenge your existing architecture and how you will effectively, efficiently, and economically process data to achieve operational efficiencies.

In order to derive the maximum benefit from Big Data, organizations must be able to handle the rapid delivery and extraction of huge volumes of data of varying types, which can then be integrated with the organization's enterprise data and analyzed. Information Architecture provides the methods and tools for organizing, labeling, building relationships (through associations), and describing (through metadata) your unstructured content, adding this source to your overall pool of Big Data. In addition, Information Architecture enables you to rapidly explore and analyze any combination of structured, semi-structured, and unstructured sources. Big Data requires Information Architecture to exploit the relationships and synergies between your data. This infrastructure enables organizations to make decisions utilizing the full spectrum of their Big Data sources.
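As a rough sketch of what "describing unstructured content through metadata" can look like in practice, the example below attaches a small metadata record to a text item so it can be filtered and joined alongside structured data. The extractor functions are deliberately simplistic stand-ins for real entity extraction or taxonomy classification:

```python
def describe(doc_id, text, extractors):
    """Build a metadata record for a piece of unstructured content by
    running each extractor over the text (illustrative schema only)."""
    record = {"id": doc_id}
    for field_name, extract in extractors.items():
        record[field_name] = extract(text)
    return record

# Hypothetical extractors; a real pipeline would classify content
# against a taxonomy, detect entities, assign owners, etc.
extractors = {
    "word_count": lambda t: len(t.split()),
    "mentions_safety": lambda t: "safety" in t.lower(),
}
```

Once every item carries such a record, unstructured content can participate in the same queries and analytics as structured enterprise data.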

Big Data Components

For each Information Architecture element, here is its contribution to each of the three Big Data characteristics (Volume, Velocity, and Variety):

Content Consumption
  • Volume: Provides an understanding of the universe of relevant content through a content audit, contributing directly to the volume of available content.
  • Velocity: Directly contributes to the speed at which content is accessed by establishing the initial volume of available content.
  • Variety: Identifies the initial variety of content that will be part of the organization's Big Data resources.

Content Generation
  • Volume: Fills gaps identified in the content audit by gathering the requirements for content creation, which directly increases the amount of content available in the organization's Big Data resources.
  • Velocity: Directly contributes to the speed at which content is accessed as volumes increase.
  • Variety: Contributes to the creation of a variety of content (documents, spreadsheets, images, video, voice) to fill the identified gaps.

Content Organization
  • Volume: Provides business rules to identify relationships between content and a metadata schema to assign content characteristics to all content. This increases the volume of usable data and, in some cases, leverages existing data to assign metadata values.
  • Velocity: Directly improves the speed at which content is accessed by applying metadata, which gives context to the content.
  • Variety: The variety of Big Data will often drive the relationships and organization between the various types of content.

Content Access
  • Volume: Centers on search and establishing the standard types of search (i.e., keyword, guided, and faceted). This contributes to the volume of data by establishing search parameters, often including additional metadata fields and values that enhance search.
  • Velocity: Contributes to the ability to access content and to the speed and efficiency with which content is accessed.
  • Variety: Contributes to how the variety of content is accessed; the variety of Big Data will often drive the search parameters used to access the various types of content.

Content Governance
  • Volume: Establishes accountability for the accuracy, consistency, and timeliness of content, content relationships, metadata, and taxonomy across the enterprise and its applications. Content Governance will often "prune" the volume of content in the organization's Big Data resources by allowing access only to pertinent, relevant content while deleting or archiving the rest.
  • Velocity: Trimming the volume of content through Content Governance improves velocity by making available a smaller, more pertinent universe of content.
  • Variety: Trimming the volume of content through Content Governance may affect the variety of content available as well.

Content Quality of Service
  • Volume: Focuses on the security, availability, scalability, and usefulness of content, improving the overall quality of the content in the organization's Big Data resources by:
    - defending content from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording, or destruction;
    - eliminating or minimizing disruptions from planned system downtime;
    - ensuring that accessed content comes from an authoritative or trusted source, is reviewed regularly (per the applicable governance policies), modified when needed, and archived when it becomes obsolete;
    - enabling content to behave the same no matter which application or tool implements it, and to be flexible enough to be used at the enterprise level as well as the local level without changing its meaning, intent, or function;
    - tailoring content to its specific audience so that it serves a distinct purpose, helps its audience, and is practical.
  • Velocity: Eliminates or minimizes delays and latency in content and business processes, speeding the ability to analyze and make decisions and directly affecting the content's velocity.
  • Variety: Improves the overall quality of the variety of content in the organization's Big Data resources through security, availability, scalability, and usefulness.

The framework above aligns key Information Architecture elements to the primary characteristics of Big Data. This alignment facilitates a consistent structure for effectively applying analytics to your pool of Big Data. The Information Architecture elements include: Content Consumption, Content Generation, Content Organization, Content Access, Content Governance, and Content Quality of Service. It is this framework that aligns all of your data so that business value can be gained from your Big Data resources.

Note: This table originally appeared in the book Knowledge Management in Practice (ISBN: 978-1-4665-6252-3) by Anthony J. Rhem.

Maximizing and Measuring User Adoption

August 30, 2017

Similar to the old adage, “you can lead a horse to water, but you can’t make him drink,” you can deliver a solution that uses the most cutting-edge technology and beautiful design, but you can’t guarantee that your stakeholders will embrace it. This blog offers practical tips on how to maximize and measure user adoption to ensure that your new tool or process is fully embraced by those for whom you’ve designed it.

To deliver a project success story backed with quantitative and qualitative data to support it, you should take an objective-first approach to change management. This requires a shift in focus from what the change is (e.g. the implementation of a new tool or process) to what you aim to achieve as a result of the change (e.g. increased productivity or improved work satisfaction). Rather than only highlighting the features of the new technology, you’ll want to focus on the benefits the users will gain from using it. Taking this approach is particularly critical for Knowledge Management initiatives, which are initially often met with skepticism and a broad sense of concern that there’s not enough time in the already busy day to acclimate to another new tool or process. By following these guidelines, you’ll be able to say “our users love the new tool and they are so much more effective and efficient as a result of it…” and “here’s the data to prove it.”

The way to accomplish this is by setting "SMART" objectives at the start of your project and developing an analytics strategy that will help you measure your progress toward achieving those objectives. These objectives should clearly express desired changes in user behavior and the impact these new behaviors are expected to have on overall productivity and effectiveness. In the words of Stephen Covey, "start with the end in mind" so that all your efforts are aligned toward achieving your expected results.

Let me put this into context using one of my current projects. I’m working with a global manufacturing organization to design and implement a tool that will help the communications department work in a more centralized and collaborative way. The team is responsible for delivering content about events, programs, and news items to internal employees as well as external stakeholders. The team is used to working in silos and each team member uses different tools for storing, sharing, and finding information such as a basic team site, email, and desktop file folders.

From the very beginning of the project, change management has been a priority. We knew that if we wanted the communications department to adopt the new tool, we had to think of ways to encourage them to do so well in advance of them even having contact with it. Here are ways to apply what my team has done to your change effort to help you maximize and measure user adoption:

Step 1: Align your metrics with desired outcomes

To encourage a more centralized and collaborative way of working for the communications department, we're using Microsoft O365 tools such as MS Teams, MS Planner, and modern SharePoint team sites as the platform for the new system. We chose this suite of tools because it offers various features that, if used, could save the department a lot of time, reduce wasted effort, and ultimately elevate its role to a more strategic partner within the organization.

Here’s how we’ve expressed our primary objective:

“Increase the team’s efficiency by managing all campaign content, including digital assets, in the new tool within 90 days of launch.”

When content is stored in various places, not everyone has access to the latest versions. This causes a lot of confusion and re-work. The challenge is that people defer to the processes they’re most used to, which is often saving information in their local drives and sharing it via email. The new behavior we wanted to encourage was saving information in a centralized location (in this case a SharePoint team site), so that everyone has access to the latest version, edits are being made to the same copy, and there’s a tracking history of the edits, as well as who made them.

The objectives you identify will vary depending on the challenges you’re trying to solve, so your success metrics should be aligned accordingly. In this case, defining our objective leads us to what we should measure: the percentage of campaign content that is stored and shared in the tool vs. outside of it.
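That metric is straightforward to compute once each content item records where it lives. The sketch below assumes a simple, illustrative item schema (a "location" field), not the project's actual tracking system:

```python
def in_tool_percentage(content_items, tool="team_site"):
    """Success metric for the objective above: the percentage of campaign
    content items stored in the new tool vs. outside of it."""
    if not content_items:
        return 0.0
    in_tool = sum(1 for item in content_items if item["location"] == tool)
    return 100.0 * in_tool / len(content_items)
```

Tracking this one number over the 90 days after launch tells you directly whether the desired behavior change is happening.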

Step 2: Capture baseline metrics and keep it simple

In order to be able to tell a story about the impact of a new tool, you need baseline metrics for comparing your results. For this project, we had three categories of metrics and different approaches for capturing each:

  • Satisfaction Level: We deployed a survey that measured how useful users found their current system.
  • Proficiency Level: We deployed another survey that measured their self-rated proficiency levels with basic SharePoint functionality such as uploading and sharing documents.
  • Usage Level: We tracked activity on the system after launch. This includes number of active users, number of documents and multimedia files saved and shared via the tool, and number of interactions in the conversations space.

The key here is to keep it simple. We designed the surveys to be short and to the point, and only asked specific questions that would help inform the decisions we made on the project. We also didn’t measure everything. We kept it basic to start and the longer the users had to engage with the system, the more sophisticated our metrics became.
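Comparing the baseline with a later re-measurement can stay just as simple. This sketch assumes each survey's results have been reduced to an average score per question (an illustrative shape, not the actual survey tooling):

```python
def metric_change(baseline, followup):
    """Per-question change between baseline and follow-up survey scores,
    keeping the comparison as simple as the surveys themselves."""
    return {question: round(followup[question] - baseline[question], 2)
            for question in baseline if question in followup}
```

A small table of before/after deltas like this is often all the evidence a stakeholder needs to see that the change effort is working.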

Step 3: Take actions that lead to measurable improvements

Our satisfaction survey, along with in-depth user analysis and testing, informed the features we included in our new tool. As we were prioritizing the features, we kept our objectives in mind. It was critical for us to ensure our tool had a separate space for managing content for each campaign. This space had to make it easy for the team to upload, edit, share, and find content, including text-based and multimedia assets.

Our proficiency survey helped us design the training for the new tool. Had we assumed that our users were already familiar with SharePoint's basic functionality, we would have gone into our training sessions ready to introduce all of its advanced features. Knowing that the team members were not as confident in their SharePoint abilities led us to design a basic SharePoint prerequisite training session for those who needed it. Meeting users at their proficiency level and guiding them toward the level they need to make the most of the new tool's features prevents them from becoming so discouraged that they abandon the tool prematurely. (Get more helpful tips on user training by watching Rebecca's video, Top 5 Tips for Using Training to Promote Adoption).

This is important because we planned to deploy the satisfaction and proficiency survey again once we launched the new tool. Taking actions based on the results of the baseline survey created measurable improvements in how much the users liked the new tool(s) they were using and how confident they were in using it.

Step 4: Measure again once you’ve implemented your solution

This may seem like common sense, but let your users know that the tool is now available for them to use and train them how to use it! Often, the team members heavily involved in the project assume that users know it exists and will intuitively learn how to use it on their own. The team building the tool has spent the past few months or so immersed in the tool, so they are likely to overestimate other people’s awareness of the tool and underestimate the learning curve associated with it.

In our case, our baseline usage level was 0 team members because the tool was brand new. Our goal was to increase usage level to all 30 team members. Our strategy for getting all 30 team members to use the tool, rather than relapsing back to their old habits and systems, was the deployment of “early and often” messages about the tool, along with thorough training for each team member we expected to use it. Long before the tool was launched, we built excitement and awareness around the new tools via a teaser video, Yammer posts, emails, and messages from leadership during team meetings. Once the tool was launched, we conducted live training sessions and delivered helpful resources and guides.

Along the way, we were asking:

  • What percentage of the team watched the teaser video?
  • How many team members saw the Yammer posts? How many "liked," replied to, or shared them?
  • How many of the team members heard and saw the presentation?
  • Did the team members react positively or negatively to the messages in the video, posts, and presentations?
  • How many of the team members completed the optional pre-work and basic training?
  • How many of the team members attended the live training sessions?

All of these metrics were indicators of the degree to which the users would adopt the new tool. You can then validate these indicators by measuring actual adoption, e.g. user activity within the tool and their satisfaction in using it.

Step 5: Give it some time, then measure again

As we were building the tool, the project team discussed how we were going to tell our success story. But that really depended on how we defined success. Did success mean that we launched the new tool on schedule and under budget? Or did it mean that the communications team members were embracing the new tool and way of working? The latter was much more important to us, so we developed a timeline for capturing feedback: one week, one month, three months, and six months after launch. At these set intervals, we would capture metrics on how satisfied the team members were with the new tool, its impact on their work, and how proficient they felt with their new skill sets. In addition to self-reported data, we would also track usage metrics, such as the percentage of the team that actively manages their campaigns within the tool vs. outside of it.

Summary

Organizations invest large amounts of money in new technology with the intention of improving employee productivity. The key to getting a significant return on these investments is to make sure your project team has what it takes to define, drive, and measure success. If you want to make sure the next solution you roll out maximizes user adoption and produces measurable results, contact Enterprise Knowledge at info@enterprise-knowledge.com.