
Measuring the Effectiveness of Your Knowledge Management Program

February 14, 2018

Measuring the effectiveness of a Knowledge Management (KM) program, and of the initiatives that are essential to its success, has been a challenge for every organization executing one. Capturing the appropriate metrics is essential to measuring the right aspects of your KM program, and the right metrics will let you communicate the health of the program clearly and accurately to your organization’s leadership. In this post, I will identify metrics (or measurements) for four key initiatives common to most KM programs: Communities of Practice, Search, Lessons Learned, and Knowledge Continuity.

Community of Practice (CoP) Metrics

Typical CoP metrics include:

  • Average posts per day
  • Unique contributors (people posting at least once)
  • Repeat contributors (people posting more than once)
  • Majority contributors (the minimum number of people accounting for more than 50% of posts)
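
As a minimal sketch, assuming the KM platform can export a simple list of posts with an author and a posting date (the field names and data below are hypothetical), these four measures could be computed along these lines:

```python
from collections import Counter
from datetime import date

# Hypothetical post records: (author, date_posted)
posts = [
    ("alice", date(2018, 2, 1)),
    ("bob",   date(2018, 2, 1)),
    ("alice", date(2018, 2, 2)),
    ("carol", date(2018, 2, 3)),
    ("alice", date(2018, 2, 3)),
]

days = {d for _, d in posts}
avg_posts_per_day = len(posts) / len(days)

counts = Counter(author for author, _ in posts)
unique_contributors = len(counts)                                # posted at least once
repeat_contributors = sum(1 for c in counts.values() if c > 1)   # posted more than once

# Majority contributors: smallest group of people accounting for > 50% of posts
running, majority_contributors = 0, 0
for c in sorted(counts.values(), reverse=True):
    running += c
    majority_contributors += 1
    if running > len(posts) / 2:
        break

print(avg_posts_per_day, unique_contributors, repeat_contributors, majority_contributors)
```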

Some points to consider:

  • Recognize the diversity of interests in those participating in the group, and that this is a voluntary undertaking for all involved.
  • Develop a stakeholder classification and perform a RACI assessment for each stakeholder group.
  • Through a collaborative process, arrive at coherent goals, objectives, principles and strategies for the group.
  • Develop a CoP plan, with agreed-upon moderator criteria and stakeholders who influence group behavior in ways that are congruent with the group’s goals and objectives.

Search Metrics

Search Metrics are determined through Tuning and Optimization

Site Owners/Administrators should constantly observe and evaluate the effectiveness of search results. They should be able to obtain Search Results reports from the KMS administrator periodically (e.g., every two weeks). From these reports, they can analyze which keywords users are searching for and which sites most of the search queries come from. Based on this, Site Owners/Administrators can add ‘synonyms’ for their sites. If a newly added metadata column needs to be available in the Advanced Search filters, the request must be sent to the KMS administrator.
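Purely as an illustration, if the biweekly Search Results report can be exported as rows of (query, originating site), which is an assumption rather than a feature of any particular KMS, a Site Owner could summarize it along these lines to spot candidate synonyms:

```python
from collections import Counter

# Hypothetical rows from a biweekly search report: (query, originating_site)
search_log = [
    ("travel policy", "HR"),
    ("travel reimbursement", "Finance"),
    ("travel policy", "HR"),
    ("expense form", "Finance"),
]

top_queries_per_site = {}
for query, site in search_log:
    top_queries_per_site.setdefault(site, Counter())[query] += 1

for site, queries in top_queries_per_site.items():
    print(site, queries.most_common(3))  # candidates for site-level synonyms
```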

Typical search metrics include:

  • Search engine usage – Search engine logs can be analyzed to produce a range of simple reports, showing usage, and a breakdown of search terms.
  • Number of Searches performed (within own area and across areas)
  • Number of highly rated searches performed
  • User rankings – This involves asking the readers themselves to rate the relevance and quality of the information being presented. Subject matter experts or other reviewers can directly assess the quality of material on the KM platform.
  • Information currency – This is a measure of how up to date the information stored within the system is. The importance of this measure will depend on the nature of the information being published and how it is used. A great way to track this is with metadata such as publishing and review dates; using these, automated reports showing a number of specific measures can be generated:
  1. Average age of pages
  2. Number of pages older than a specific age
  3. Number of pages past their review date
  4. Lists of pages due to be reviewed
  5. Pages to be reviewed, broken down by content owner or business group
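
A minimal sketch of these currency reports, assuming each page’s metadata carries a publishing date, a review-due date, and a content owner (all field names and thresholds here are illustrative):

```python
from datetime import date, timedelta

today = date(2018, 2, 14)

# Hypothetical page metadata: (title, published, review_due, owner)
pages = [
    ("Travel Policy",  date(2016, 1, 10), date(2017, 1, 10), "HR"),
    ("Expense How-To", date(2017, 9, 1),  date(2018, 9, 1),  "Finance"),
    ("Branding Guide", date(2015, 5, 20), date(2016, 5, 20), "Marketing"),
]

ages = [(today - published).days for _, published, _, _ in pages]
average_age_days = sum(ages) / len(ages)

# "Older than a specific age" -- two years is just an example threshold
older_than_two_years = [t for t, p, _, _ in pages if (today - p) > timedelta(days=730)]
past_review_date     = [t for t, _, due, _ in pages if due < today]

# Pages due for review, broken down by content owner
due_by_owner = {}
for title, _, due, owner in pages:
    if due < today:
        due_by_owner.setdefault(owner, []).append(title)

print(average_age_days, older_than_two_years, past_review_date, due_by_owner)
```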

User feedback – A feedback mechanism is a clear way to indicate whether staff are using the knowledge. Paradoxically, while a large number of feedback messages may indicate poor-quality information, it also indicates strong staff use, and it shows that staff have sufficient trust in the system to commit the time needed to send in feedback.

Lessons Learned Metrics

Lessons Learned Basic Process: Identify – Document – Analyze – Store – Retrieve

Metrics are determined and organized by the key fields from the lessons learned template and include responses gathered during the session. Each lesson learned should be identified by the type of lesson captured (i.e., resource, time, budget, system, content, etc.). Summarize the lesson learned by creating a brief summary of the findings and providing recommendations for correcting them (i.e., Findings – a summary of the issues found during the review process; Recommendations – actions recommended to correct the findings). In order to provide accurate metrics, the approved actions should be documented and tracked to completion; in some cases an approved action may become a project in its own right, due to the high level of resources required to address the finding. Some metrics include: impact analysis (time increased/decreased, improper resourcing, budget constraints, software/system limitations, lack of available content, etc.) and applying the lesson learned (% of problems/issues solved with a lesson learned, per category and overall).
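
As a sketch of that last measure, the percentage of problems/issues solved with a lesson learned per category and overall, assuming each record in the lessons learned log carries a category and a flag indicating whether the approved action resolved the issue (both hypothetical field choices):

```python
# Hypothetical lessons-learned records: (category, solved_with_lesson)
lessons = [
    ("resource", True),
    ("budget",   False),
    ("resource", True),
    ("system",   True),
    ("budget",   True),
]

by_category = {}
for category, solved in lessons:
    total, hits = by_category.get(category, (0, 0))
    by_category[category] = (total + 1, hits + (1 if solved else 0))

for category, (total, hits) in by_category.items():
    print(f"{category}: {100 * hits / total:.0f}% solved with a lesson learned")

overall = 100 * sum(1 for _, solved in lessons if solved) / len(lessons)
print(f"overall: {overall:.0f}%")
```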

Knowledge Continuity

The keys at the heart of knowledge continuity include:

  • What constitutes mission-critical knowledge that should be preserved?
  • Where is the targeted mission-critical knowledge, and is it accessible and transferable?
  • What decisions and actions are required to stem the loss of valuable, and in many cases irreplaceable, knowledge?
  • How can the lessons learned and best practices of the most experienced and valuable workers be successfully obtained, transferred, and stored in a knowledge base (or KM application) before those employees depart or retire?

Some Metrics Include:

  • Percentage of knowledge harvested and stored from key employees.
  • Percentage of knowledge transferred to successor employees.
  • Cost associated with preventing the loss of corporate mission-critical knowledge
  • Availability of a structured framework and system to store, update, access, enrich, and transfer knowledge to employees in support of their work activities
  • Ramp-up time for new hires – how rapidly they move up their learning curves and become productive
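
These measures reduce to simple ratios once the underlying counts are tracked; the sketch below uses invented numbers purely to show the arithmetic:

```python
# Hypothetical knowledge-continuity tracking for a departing key employee
critical_items_identified = 40   # mission-critical knowledge items targeted for harvest
items_harvested           = 30   # captured in the knowledge base / KM application
items_transferred         = 24   # walked through with the successor

pct_harvested   = 100 * items_harvested / critical_items_identified
pct_transferred = 100 * items_transferred / items_harvested

# Ramp-up time: compare new hires before and after the KM program (made-up figures)
ramp_up_days_before = [120, 110, 130]
ramp_up_days_after  = [80, 75, 90]
avg_before = sum(ramp_up_days_before) / len(ramp_up_days_before)
avg_after  = sum(ramp_up_days_after) / len(ramp_up_days_after)

print(f"{pct_harvested:.0f}% harvested, {pct_transferred:.0f}% transferred, "
      f"ramp-up {avg_before:.0f} -> {avg_after:.0f} days")
```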

Let me know if you agree with the metrics identified here and/or if you know of additional metrics within these key initiatives that must be captured. I look forward to your responses.

Dancing With the Robots - the Rise of the Knowledge Curator

January 31, 2018

“Nearly half of HR and business leaders who were surveyed believe many of their core HR functions will be automated by 2022.” (Source: Harris Poll).

So let’s pause and reflect on this seemingly widespread sentiment. There are indeed a number of largely administrative HR functions that easily lend themselves to automation, such as providing the latest regulatory position on Maternity Leave and Maternity Pay, booking a Holiday, or providing basic payroll information. It makes perfect sense to deliver this information to employees via automation (e.g. a chatbot) on a self-service basis. And well-designed, well-deployed technology can analyse results and improve the quality of the answers it provides.

Of course, this rapid, exponential automation is not going to be restricted to employees; in fact, it is going to occur even faster when we look at consumers. Last year Amazon in the UK extended its warehouse automation to its delivery service with its first successful drone delivery. The delivery took 13 minutes and did not involve any humans on the supplier side.

From 3D printing to self-driving cars, automation is happening, it is happening fast, and it is unstoppable. It will therefore affect all aspects of our lives in our different incarnations as worker, citizen and consumer.

The problem is that we as humans want (even need) to have it both ways. When it suits us, we want and prefer to interact with a machine over a human; but, again when it suits us, we want to be able to switch the service mode immediately and interact with a human.

David's Story

To illustrate this problem, let’s go back to the HR automation situation. In our story a successful sales person (David) is going through a difficult divorce that is affecting his health. David is a highly valued employee and the company wants to be supportive of him during this time. However, quite understandably he does not want his marital difficulties and medical treatment discussed with anyone other than his current direct manager (Kathy). Between them they agree that in the short term David should be allowed to take time off whenever he feels unable to perform his company duties to an acceptable standard. Kathy says she will speak to someone senior in HR and let them know that this arrangement has been approved by the business but that the matter is to be treated with the utmost discretion. 

In the weeks that follow, David books days off via the HR chatbot whenever he needs to. In fact, the somewhat impersonal, discreet nature of the chatbot works really well for him given his situation. His colleagues are none the wiser, which is important to his self-esteem and position at work.

One day David puts in for another day off, and this time the request is rejected by the machine. It has been programmed to spot certain patterns and reject requests accordingly. The machine tells him that the matter has been escalated to his manager (Scott?!) along with David’s absence pattern report. Unfortunately Scott was David’s former manager, someone with whom David has a strained relationship and who he would definitely NOT want to know about his personal circumstances. David tries desperately to stop the machine and reverse the request, but to no avail. Matters are made worse when his attempts are read by the machine as the sign of an unsatisfied employee, which triggers an automated ‘rate this service’ survey email. David’s emotions switch from anxiety to anger, and he completes the survey in a very negative fashion. The resulting low NPS (Net Promoter Score) in turn generates a courtesy call from someone in HR in charge of low-NPS follow-up, who knows neither David nor his current situation. David has to deflect the caller and is forced to make up a cover story in order to avoid revealing to a relative stranger why he is unhappy with the system and why he is putting in for so many days off so close together. He then has to write an email to Scott, cc’ing Kathy, explaining the error and asking him to kindly ignore the absence report (but alas, the damage is done).

Dancing with the Robots 

David’s story is hardly unusual, and one can cite many more scenarios where the machines will fail to read an individual and their specific circumstances correctly. We are humans, after all, and our ever-changing needs are precisely what define us as humans, not machines. Accordingly, it is a no-brainer that we will need to have people in place who can ‘dance with the robots’ and thereby offer a hybrid service encompassing the best of humans (compassion) and the best of machines (efficiency).

OK, so let’s play David’s story out again, but this time with what I call a Knowledge Curator in place, in other words a human who is trained to dance with the robots (we’ll call her Cyd, after the great dancer Cyd Charisse).

When David first raises his personal situation with his manager (Kathy) and they come to an agreed arrangement, Cyd is either directly or indirectly notified of this employee change. Now, Cyd does not need to know why this pattern change has occurred and been approved; she simply needs to know that it has happened and to think about its implications. Her job as a K-Curator is to identify ALL points (human and machine) that will be affected by or will impact this change. So, knowing that David will be putting in for unplanned days off at short notice, she is aware that the chatbot and its connection to the holiday booking system will not be set up to cater for this. Cyd will make the system adjustments, and at the same time she will also double-check all the supporting information for David, which will help her identify that the system has the wrong direct-manager details for him. She can also think ahead and remove the trigger for the NPS survey. It is interesting to note that this K-Curator role is cross-departmental and requires the ability to understand the business, HR policy and process automation.

Conclusion

With the exponential increase in company automation and the reduction of human capital expenditure, it is completely logical, and indeed essential, for this process to be managed by an increase in the headcount of these new Knowledge Curators. They will cover all manner of cross-departmental scenarios in which human needs, internal and external, are serviced in part by robots. Like Cyd in our story, they will all become superb dancers, able to lead and follow their robotic partners as the company pursues its irreversible path to higher levels of automation.

KMI Interviews with Recent CKM Students

January 2, 2018

As we enter the new year, KM Institute begins our 17th year of offering the flagship Certified Knowledge Manager program.  Well over 8,000 students have now taken our courses globally.  We have expanded our global reach with public and private courses conducted in West Africa, Europe, India and the Middle East, in addition to the United States.  New partnerships are forming in these regions as well as South America and South Africa.

To keep pace with this growth, we have expanded our instructor base beyond our Chief Instructor and KMI Founder, Douglas Weidner, to include KM experts both here in the U.S. and abroad.  At the conclusion of a recent CKM class, we interviewed a few of the students to get their feedback.

Please take a few moments to view the video below.  Enjoy!  And thank you to our students for participating in the interviews.

For more info on the CKM program, click here.

Information Architecture and Big Data Analytics

December 13, 2017

Information Architecture (IA) is an enabler for Big Data Analytics. You may be asking why I would say this, or how IA enables Big Data Analytics. We need to remember that Big Data includes all data (i.e., unstructured, semi-structured, and structured). The primary characteristics of Big Data (Volume, Velocity, and Variety) challenge your existing architecture and how you will effectively, efficiently and economically process data to achieve operational efficiencies.

In order to derive the maximum benefit from Big Data, organizations must be able to handle the rapid rate of delivery and extraction of huge volumes of data of varying types, which can then be integrated with the organization’s enterprise data and analyzed. Information Architecture provides the methods and tools for organizing, labeling, building relationships (through associations), and describing (through metadata) your unstructured content, adding this source to your overall pool of Big Data. In addition, information architecture enables you to rapidly explore and analyze any combination of structured, semi-structured and unstructured sources. Big Data requires information architecture to exploit the relationships and synergies among your data. This infrastructure enables organizations to make decisions using the full spectrum of their Big Data sources.
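
To make this concrete, here is a minimal, hypothetical sketch of the idea: an IA layer assigns taxonomy-based metadata to unstructured documents so they can sit in the same pool as structured records and be queried on a shared field. The document contents, taxonomy terms, and record fields below are all invented for illustration.

```python
# Hypothetical IA layer: tag unstructured documents with metadata so they can be
# queried alongside structured records
documents = [
    {"id": "D1", "text": "Q3 supplier audit notes ...", "type": "unstructured"},
    {"id": "D2", "text": "Customer complaint email ...", "type": "unstructured"},
]

taxonomy = {"audit": "Quality", "complaint": "Customer Service"}

for doc in documents:
    # naive labeling: assign a business category from the first taxonomy term found
    doc["category"] = next(
        (label for term, label in taxonomy.items() if term in doc["text"].lower()),
        "Uncategorized",
    )

structured_records = [
    {"id": "R1", "category": "Quality", "defect_rate": 0.02},
    {"id": "R2", "category": "Customer Service", "nps": 31},
]

# One pool of Big Data sources, now joinable on the shared metadata field
pool = documents + structured_records
quality_items = [item["id"] for item in pool if item["category"] == "Quality"]
print(quality_items)  # ['D1', 'R1']
```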

Big Data Components

Each information architecture (IA) element below is mapped against the three primary characteristics of Big Data: Volume, Velocity, and Variety.

Content Consumption

  • Volume – Provides an understanding of the universe of relevant content through a content audit. This contributes directly to the volume of available content.
  • Velocity – Directly contributes to the speed at which content is accessed by providing the initial volume of available content.
  • Variety – Identifies the initial variety of content that will be a part of the organization's Big Data resources.

Content Generation

  • Volume – Fills the gaps identified in the content audit by gathering the requirements for content creation/generation, which contributes directly to increasing the amount of content available in the organization's Big Data resources.
  • Velocity – Directly contributes to the speed at which content is accessed, because volumes are increasing.
  • Variety – Contributes to the creation of a variety of content (documents, spreadsheets, images, video, voice) to fill the identified gaps.

Content Organization

  • Volume – Provides business rules to identify relationships between content and a metadata schema to assign content characteristics to all content. This increases the volume of data available and, in some cases, leverages existing data to assign metadata values.
  • Velocity – Directly improves the speed at which content is accessed by applying metadata, which in turn gives context to the content.
  • Variety – The variety of Big Data will often drive the relationships and organization among the various types of content.

Content Access

  • Volume – Covers search and the establishment of the standard types of search (i.e., keyword, guided, and faceted). This contributes to the volume of data by establishing search parameters, often adding metadata fields and values to enhance search.
  • Velocity – Contributes to the ability to access content and to the speed and efficiency with which content is accessed.
  • Variety – Contributes to how the variety of content is accessed. The variety of Big Data will often drive the search parameters used to access the various types of content.

Content Governance

  • Volume – Establishes accountability for the accuracy, consistency and timeliness of content, content relationships, metadata and taxonomy within areas of the enterprise and the applications being used. Content Governance will often "prune" the volume of content available in the organization's Big Data resources by only allowing access to pertinent/relevant content, while deleting or archiving other content.
  • Velocity – Trimming the volume of content through Content Governance improves velocity by making available a smaller, more pertinent universe of content.
  • Variety – That same trimming may affect the variety of content available as well.

Content Quality of Service

  • Volume – Focuses on the security, availability, scalability and usefulness of the content, and improves the overall quality of the volume of content in the organization's Big Data resources by:
    - defending content from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction;
    - eliminating or minimizing disruptions from planned system downtime;
    - making sure that the content accessed is from, and/or based on, the authoritative or trusted source, is reviewed on a regular basis (per the specific governance policies), is modified when needed and is archived when it becomes obsolete;
    - enabling the content to behave the same no matter what application/tool implements it, and to be flexible enough to be used at an enterprise level as well as a local level without changing its meaning, intent of use and/or function;
    - tailoring the content to its specific audience and ensuring that it serves a distinct purpose, is helpful to its audience and is practical.
  • Velocity – Eliminates or minimizes delays and latency in your content and business processes, speeding the ability to analyze and make decisions and thereby directly affecting the content's velocity.
  • Variety – Improves the overall quality of the variety of content in the organization's Big Data resources through the security, availability, scalability and usefulness of that content.

The mapping above aligns key information architecture elements to the primary components of Big Data. This alignment will facilitate a consistent structure so that analytics can be applied effectively to your pool of Big Data. The information architecture elements include: Content Consumption, Content Generation, Content Organization, Content Access, Content Governance and Content Quality of Service. It is this framework that aligns all of your data and enables business value to be gained from your Big Data resources.

Note: This material originally appeared as a table in the book Knowledge Management in Practice (ISBN: 978-1-4665-6252-3) by Anthony J. Rhem.

Transforming an Employee Portal to a Digital Workspace

November 29, 2017

When is a company both a brand-new startup and an established and mature company at the same time?

In November 2016, Johnson Controls spun out its auto seating business, creating a new, wholly independent company called Adient.  Adient was born as a fully mature company, with established clients, operations, and market share.  But Adient was also born with a strong desire to create a new corporate culture of information sharing and collaboration.

How could the new company do both?  How could Adient continue to maintain the content and information that was needed to sustain operations, while simultaneously rethinking how employees communicated, collaborated, and exchanged information?

Immediately after the launch of the new company, Adient Communications and Enterprise Knowledge LLC (EK) teamed up to rethink and redesign how the old-style employee portal could be transformed into a new digital workspace.

From the outset, the team identified a set of competing priorities that it needed to rationalize during the transformation process.  They needed to:

  • Maintain continuous access to legacy information used in day to day operations;
  • Clean up the mountain of obsolete information that Adient had inherited from its parent company;
  • Make information access and maintenance easier and more intuitive;
  • Change the paradigm for how the company communicates to employees;
  • Change the dynamics of how employees exchanged information and collaborated, breaking down traditional barriers to information exchange by using a more social model.

The portal transformation team designed a way to gradually transform the portal in real time, rebuilding it while maintaining access to current information.  The transformed digital workspace would:

  • Disseminate “official” company announcements using social tools rather than email or static content;
  • Delineate spaces for all-company content vs. team-based content;
  • Align content access interfaces with how users look for information;
  • Align content maintenance interfaces with content owners;
  • Enable contextual, search driven navigation to simplify the overall portal architecture and move users to target content quickly;
  • Include refreshed – and dramatically simplified – content about company functions and tools.

A New Paradigm for Internal Communications

Instead of using traditional static tools for internal communications such as long form news articles and email announcements, the portal transformation team shifted toward new uses for traditional social tools.  The company’s social network (in this case Yammer) was redeployed in the service of traditional internal communications by putting in place two innovations:

The employee communications “avatar.” The portal transformation team created an employee communications avatar within Yammer and used the avatar for official announcements from the company.

Social network style employee announcements were now available both on the portal home page and via the Yammer app.  The announcements were shorter, more immediate, and easier to scan; they allowed for employee comments and “likes,” and were available through a variety of user interfaces, both desktop and mobile.

Internal “Twitter-style” leads for traditional articles. The team replaced the traditional static “news quilt” on the portal home page with a news stream that essentially served as a rolling in-house Twitter feed. Leads for traditional featured news articles or videos were linked to the full versions of each item, again with access from more interfaces and with more mobility and scannability.

Realigned Search Scopes

To improve search results in the short term, search scopes were adjusted away from the default “index everything” approach to a more finely tuned set of search results that returned more reliable content.  The revised search scopes (see the sketch after this list):

  • Separated the all-company-facing portal content from team content;
  • Focused on destination content rather than landing or index pages;
  • Pointed to content that was actively curated by content owners;
  • Used “best bets” to match selected content to specific tasks.
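
The platform-specific configuration is not reproduced here; as a rough, hypothetical sketch of the behavior these revised scopes produce, a scoped query filters to curated destination content and promotes an editorially chosen "best bet" to the top:

```python
# Hypothetical content index entries; fields and values are invented for illustration
index = [
    {"title": "Expense Policy",    "scope": "all-company", "curated": True,  "is_landing": False},
    {"title": "HR Team Workspace", "scope": "team",        "curated": True,  "is_landing": True},
    {"title": "Old Expense FAQ",   "scope": "all-company", "curated": False, "is_landing": False},
]

# Best bets: selected content editorially matched to a specific task or term
best_bets = {"expense": "Expense Policy"}

def scoped_search(term, scope="all-company"):
    # Revised scope: curated destination content only, within the chosen scope
    hits = [item for item in index
            if item["scope"] == scope
            and item["curated"]
            and not item["is_landing"]
            and term.lower() in item["title"].lower()]
    promoted = best_bets.get(term.lower())
    if promoted:
        hits.sort(key=lambda item: item["title"] != promoted)  # best bet first
    return hits

print([hit["title"] for hit in scoped_search("expense")])  # ['Expense Policy']
```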

Transforming Traditional Portal Content

To transform old style portal content for the full range of company functions, the portal transformation team created three tracks of work, namely:

  • Design and implement a common information infrastructure as a foundation;
  • Segment and prioritize the work to gain control over the tasks ahead;
  • Extend the infrastructure to each functional area and content owner via a standardized process.

Information Infrastructure

A common information infrastructure formed the foundation of portal transformation.  The infrastructure included enterprise metadata and content types, common search facets, and a patterned approach for portal design that could be repeated for each functional area of the portal.
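
The actual definitions lived in the portal platform itself; purely as an illustrative sketch, the common infrastructure can be pictured as a shared schema that every functional area reuses (the field and value names below are hypothetical):

```python
# Hypothetical shared information infrastructure, reused by each functional area
enterprise_metadata = {
    "Function":    ["HR", "Finance", "Communications", "IT"],
    "ContentType": ["Policy", "Procedure", "Form", "News"],
    "Audience":    ["All Company", "Team"],
    "ReviewCycle": ["Annual", "Quarterly"],
}

search_facets = ["Function", "ContentType", "Audience"]  # browsable refiners

def validate(item):
    """Check that a content item uses only the agreed enterprise values."""
    return all(item.get(field) in allowed
               for field, allowed in enterprise_metadata.items())

item = {"Function": "HR", "ContentType": "Policy",
        "Audience": "All Company", "ReviewCycle": "Annual"}
print(validate(item))  # True
```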

Search driven navigation at each functional level dramatically simplified the page hierarchy and reduced the level of effort needed to transform each functional area.  Each area was now a “content engine,” designed to consume content from content owners and serve content to users based on browsable search facets.  Content owners were now focused on maintaining content rather than on managing an individually designed portal site.

The enterprise metadata, content types, and search refiners were extended and deployed to each functional area as they were addressed.  The common page hierarchy followed the pattern in the diagram below:

[Diagram: common page hierarchy pattern]

Prioritized Approach

Not all functional areas of the company could be addressed at the same time.  

Prioritizing and then ordering the individual functional areas allowed the portal transformation team to transform the portal one functional area at a time. Before engaging with the individual content owners, the portal was segmented by functional area and then prioritized across three criteria:

  • End user value and importance;
  • Relative size and complexity;
  • Readiness of individual content owners.

High value, low complexity areas of the portal were identified using a magic quadrant and then further prioritized by considering readiness of individual content owners to engage in the process.

The result was a prioritized listing of legacy portal sites that began with high value / low effort sites where the content owner was ready and willing to undertake the transformation process.

Rolling and Repeatable Process

For the actual conversion of legacy portal content to the new digital workspace model, the portal transformation team established a process that could be repeated again and again with each new functional area.  The end-to-end process was highly regimented and could be staggered, with different functional areas at different stages of the process at the same time.  For a functional area with an average amount of content (e.g., 100 to 200 information items), the entire process from engagement to launch took about six weeks.

Here are the individual process steps:

  • Engage with content owners to describe the process;
  • Guide content owners through a cleanup of their legacy content, identifying content that should be archived, the content that should be moved to the transformed portal space, and the content that should be moved after modification;
  • Extend the enterprise metadata and content types to accommodate the new function;
  • Build out standardized information and document repositories for that function;
  • Migrate refreshed content into the new repositories;
  • Build contextual landing and search results pages;
  • Launch and announce the transformed functional area.

Conclusion and Results

The transformation work at Adient is ongoing.  Even with this highly structured and repeatable approach, changing an old-style employee portal to a more agile and flexible digital workspace takes time.  Structuring the work in this way, using standards and a repeatable process, allows for a sense of momentum and continuous improvement that is obvious and observable to employees throughout the company.  Portal transformation is no longer a chore that runs in the background; it is a company-wide initiative that is changing the way that Adient works.