How would you like to be a Guest Blogger for KMI? Email us at: info@kminstitute.org and let us know your topic(s)!

Dancing With the Robots - the Rise of the Knowledge Curator

January 31, 2018

“Nearly half of HR and business leaders who were surveyed believe many of their core HR functions will be automated by 2022.” (Source: Harris Poll).

So let’s pause and reflect on this seemingly widespread sentiment. There are indeed a number of largely administrative HR functions that lend themselves easily to automation, such as providing the latest regulatory position on maternity leave and maternity pay, booking a holiday, or providing basic payroll information. It makes perfect sense to deliver this information to employees via automation (e.g. a chatbot) on a self-service basis. And well-designed, well-deployed technology can analyse results and improve the quality of the answers it provides.

Of course, this rapid, exponential automation is not going to be restricted to employees; in fact, it is going to occur even faster when we look at consumers. Last year Amazon in the UK extended its warehouse automation to its delivery service with its first successful drone delivery. The delivery took 13 minutes and did not involve any humans on the supplier side.

From 3D printing to self-driving cars, automation is happening, it is happening fast, and it is unstoppable. It will therefore affect all aspects of our lives in our different incarnations as worker, citizen and consumer.

The problem is that we as humans want (even need) to have it both ways. When it suits us, we prefer to interact with a machine over a human; but when it suits us, we also want to be able to switch the service mode immediately and interact with a human.

David's Story

To illustrate this problem, let’s go back to the HR automation situation. In our story a successful salesperson (David) is going through a difficult divorce that is affecting his health. David is a highly valued employee, and the company wants to be supportive of him during this time. However, quite understandably, he does not want his marital difficulties and medical treatment discussed with anyone other than his current direct manager (Kathy). Between them they agree that in the short term David should be allowed to take time off whenever he feels unable to perform his company duties to an acceptable standard. Kathy says she will speak to someone senior in HR and let them know that this arrangement has been approved by the business, but that the matter is to be treated with the utmost discretion.

In the weeks that follow, David books days off via the HR chatbot whenever he needs to. In fact, the somewhat impersonal, discreet aspect of the chatbot works really well for him given his situation. His colleagues are none the wiser, which is important to his self-esteem and position at work.

One day David puts in for another day off, and this time the request is rejected by the machine. It has been programmed to spot certain patterns and reject requests accordingly. The machine tells him that the matter has been escalated to his manager (Scott?!) along with David’s absence pattern report. Unfortunately, Scott was David’s former manager, someone with whom David has a strained relationship and who he would definitely NOT want to know about his personal circumstances. David tries desperately to stop the machine and reverse the request, but to no avail. Matters are made worse when his attempts are read by the machine as a sign of an unsatisfied employee, which triggers an automated ‘rate this service’ survey email. David’s emotions switch from anxiety to anger, and he completes the survey in a very negative fashion. The resulting low NPS (Net Promoter Score) then generates a courtesy call from someone in HR in charge of low-NPS follow-up, who knows neither David nor his current situation. David has to deflect the caller and is forced to make up a cover story in order to avoid revealing to a relative stranger why he is unhappy with the system and why he is putting in for so many days off so close together. He then has to write an email to Scott, cc’ing Kathy, explaining the error and asking him to kindly ignore the absence report (but alas, the damage is done).

Dancing with the Robots 

David’s story is hardly unusual, and one can cite many more scenarios where machines will fail to read the individual and their specific circumstances correctly. We are humans, after all, and our ever-changing needs are precisely what define us as humans, not machines. Accordingly, it is a no-brainer that we will need people in place who can ‘dance with the robots’ and thereby offer a hybrid service that encompasses the best of humans (compassion) and the best of machines (efficiency).

OK, so let’s play David’s story out again, but this time with what I call a Knowledge Curator in place; in other words, a human who is trained to dance with the robots (we’ll call her Cyd, as in the great dancer Cyd Charisse).

When David first raises his personal situation with his manager (Kathy) and they come to an agreed arrangement, Cyd is either directly or indirectly notified of this employee change. Now, Cyd does not need to know why this pattern change has occurred and been approved; she simply needs to know that it has happened and to think about its implications. Her job as a K-Curator is to identify ALL points (human and machine) that will be affected by, or will impact, this change. Knowing that David will be putting in for unplanned days off at short notice, she is aware that the chatbot and its connection to the holiday booking system will not be set up to cater for this. Cyd will make the system adjustments, and at the same time she will double-check all the supporting information for David, which will help her identify that the system has the wrong direct-manager details for him. She can also think ahead and remove the trigger for the NPS survey. It is interesting to note that this K-Curator role is cross-departmental and requires the ability to understand the business, HR policy and process automation.

Conclusion

With the exponential increase in company automation and the reduction of human capital expenditure, it is completely logical, and essential, for this process to be managed by an increase in the headcount of these new Knowledge Curators. They will cover all manner of cross-departmental scenarios in which human needs, internal and external, are serviced in part by robots. Like Cyd in our story, they will all become superb dancers, able to lead and follow their robotic partners as the company pursues its irreversible path to higher levels of automation.

KMI Interviews with Recent CKM Students

January 2, 2018

As we enter the new year, KM Institute begins our 17th year of offering the flagship Certified Knowledge Manager program.  Well over 8,000 students have now taken our courses globally.  We have expanded our global reach with public and private courses conducted in West Africa, Europe, India and the Middle East, in addition to the United States.  New partnerships are forming in these regions as well as South America and South Africa.

To keep pace with this growth, we have expanded our instructor base beyond our Chief Instructor and KMI Founder, Douglas Weidner, to include KM experts both here in the U.S. and abroad.  At the conclusion of a recent CKM class, we interviewed a few of the students to get their feedback.

Please take a few moments to view the video below.  Enjoy!  And thank you to our students for participating in the interviews.

For more info on the CKM program, click here.

Information Architecture and Big Data Analytics

December 13, 2017

Information Architecture (IA) is an enabler for Big Data Analytics. You may be asking why I would say this, or how IA enables Big Data Analytics. We need to remember that Big Data includes all data (i.e., unstructured, semi-structured, and structured). The primary characteristics of Big Data (Volume, Velocity, and Variety) are a challenge to your existing architecture and to how you will effectively, efficiently and economically process data to achieve operational efficiencies.

In order to derive the maximum benefit from Big Data, organizations must be able to handle the rapid rate of delivery and extraction of huge volumes of data with varying data types. This data can then be integrated with the organization's enterprise data and analyzed. Information Architecture provides the methods and tools for organizing, labeling, building relationships (through associations), and describing (through metadata) your unstructured content, adding this source to your overall pool of Big Data. In addition, Information Architecture enables organizations to rapidly explore and analyze any combination of structured, semi-structured and unstructured sources. Big Data requires Information Architecture to exploit the relationships and synergies among your data. This infrastructure enables organizations to make decisions utilizing the full spectrum of their Big Data sources.
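To make this concrete, here is a minimal, hypothetical sketch (the schema fields, item names, and helper functions are invented for illustration, not drawn from any particular IA toolset) of how labeling unstructured content with a metadata schema can bring it into a pool that is queryable alongside structured data:

```python
from dataclasses import dataclass, field

# Hypothetical item of unstructured content plus its IA metadata.
@dataclass
class ContentItem:
    title: str
    source: str            # e.g. file share, email archive, CMS
    content_type: str      # "document", "image", "video", ...
    metadata: dict = field(default_factory=dict)

def label(item: ContentItem, schema: dict) -> ContentItem:
    """Apply default values from the metadata schema, keeping any existing tags."""
    for key, default in schema.items():
        item.metadata.setdefault(key, default)
    return item

# A minimal schema: every item must carry a department and a review status.
schema = {"department": "unknown", "review_status": "unreviewed"}

pool = [
    ContentItem("Q3 forecast", "file share", "document", {"department": "finance"}),
    ContentItem("Plant tour", "CMS", "video"),
]
pool = [label(item, schema) for item in pool]

# Once labeled, unstructured items can be filtered and combined like structured data.
finance_items = [i for i in pool if i.metadata["department"] == "finance"]
```

The point of the sketch is only that a consistent schema, applied across sources, is what lets unstructured content participate in the same queries as the rest of the Big Data pool.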

Big Data Components

Each Information Architecture element below is described in terms of its contribution to the three Vs of Big Data: Volume, Velocity, and Variety.

Content Consumption

Volume: Provides an understanding of the universe of relevant content by performing a content audit. This contributes directly to the volume of available content.

Velocity: Directly contributes to the speed at which content is accessed by providing the initial volume of available content.

Variety: Identifies the initial variety of content that will be a part of the organization's Big Data resources.

Content Generation

Volume: Fills gaps identified in the content audit by gathering the requirements for content creation/generation, which contributes directly to increasing the amount of content available in the organization's Big Data resources.

Velocity: Directly contributes to the speed at which content is accessed as volumes increase.

Variety: Contributes to the creation of a variety of content (documents, spreadsheets, images, video, voice) to fill identified gaps.

Content Organization

Volume: Provides business rules to identify relationships between content and a metadata schema to assign content characteristics to all content. This contributes to increasing the volume of data available and, in some cases, leverages existing data to assign metadata values.

Velocity: Directly contributes to improving the speed at which content is accessed by applying metadata, which in turn gives context to the content.

Variety: The variety of Big Data will often drive the relationships and organization among the various types of content.

Content Access

Volume: Content Access is about search and establishing the standard types of search (i.e., keyword, guided, and faceted). This contributes to the volume of data by establishing search parameters, often including additional metadata fields and values to enhance search.

Velocity: Contributes to the ability to access content and to the speed and efficiency with which content is accessed.

Variety: Contributes to how the variety of content is accessed. The variety of Big Data will often drive the search parameters used to access the various types of content.

Content Governance

Volume: The focus here is on establishing accountability for the accuracy, consistency and timeliness of content, content relationships, metadata and taxonomy within areas of the enterprise and the applications that are being used. Content Governance will often "prune" the volume of content available in the organization's Big Data resources by allowing access only to pertinent/relevant content, while deleting or archiving the rest.

Velocity: When the volume of content available in the organization's Big Data resources is trimmed through Content Governance, velocity improves because a smaller, more pertinent universe of content is made available.

Variety: When the volume of content is trimmed through Content Governance, the variety of content available may be affected as well.

Content Quality of Service

Volume: Content Quality of Service focuses on the security, availability, scalability and usefulness of content, and improves the overall quality of the volume of content in the organization's Big Data resources by:
- defending content from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction;
- eliminating or minimizing disruptions from planned system downtime;
- making sure that the content accessed is from, and/or based on, the authoritative or trusted source; is reviewed on a regular basis (per the specific governance policies); is modified when needed; and is archived when it becomes obsolete;
- enabling the content to behave the same no matter what application/tool implements it, and to be flexible enough to be used at the enterprise level as well as the local level without changing its meaning, intent of use and/or function;
- tailoring the content to its specific audience so that it serves a distinct purpose, is helpful to its audience and is practical.

Velocity: Eliminates or minimizes delays and latency in your content and business processes, speeding the ability to analyze and make decisions and thereby directly affecting the content's velocity.

Variety: Improves the overall quality of the variety of content in the organization's Big Data resources through security, availability, scalability and usefulness of content.

The table above aligns key Information Architecture elements to the primary components of Big Data. This alignment will facilitate a consistent structure in order to effectively apply analytics to your pool of Big Data. The Information Architecture elements include Content Consumption, Content Generation, Content Organization, Content Access, Content Governance and Content Quality of Service. It is this framework that will align all of your data so that business value can be gained from your Big Data resources.

Note: This table originally appeared in the book Knowledge Management in Practice (ISBN: 978-1-4665-6252-3) by Anthony J. Rhem.
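As a small illustration of two of the elements above working together, the sketch below (with invented field names and items, not any real system's data model) shows Content Governance "pruning" a labeled pool to curated content and Content Access producing a faceted count over what remains:

```python
from collections import Counter

# Hypothetical labeled content pool; the fields are illustrative IA metadata.
pool = [
    {"type": "document", "dept": "HR", "curated": True},
    {"type": "video", "dept": "HR", "curated": False},
    {"type": "document", "dept": "finance", "curated": True},
]

def facet_counts(items, facet):
    """Count items per facet value -- the basis of guided/faceted search."""
    return Counter(item[facet] for item in items)

# Governance prunes the pool to curated (pertinent/relevant) content first.
governed = [i for i in pool if i["curated"]]

# Access then exposes the variety of what remains via facet counts.
counts = facet_counts(governed, "dept")
```

The design point is the ordering: governance narrows the universe before access-time faceting, which is why trimming volume also changes the variety the user sees.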

Transforming an Employee Portal to a Digital Workspace

November 29, 2017

When is a company both a brand-new startup and an established and mature company at the same time?

In November 2016, Johnson Controls spun out its auto seating business, creating a new, wholly independent company called Adient.  Adient was born as a fully mature company, with established clients, operations, and market share.  But Adient was also born with a strong desire to create a new corporate culture of information sharing and collaboration.

How could the new company do both?  How could Adient continue to maintain the content and information that was needed to sustain operations, while simultaneously rethinking how employees communicated, collaborated, and exchanged information?

Immediately after the launch of the new company, Adient Communications and Enterprise Knowledge LLC (EK) teamed up to rethink and redesign how the old-style employee portal could be transformed into a new digital workspace.

From the outset, the team identified a set of competing priorities that it needed to rationalize during the transformation process.  They needed to:

  • Maintain continuous access to legacy information used in day-to-day operations;
  • Clean up the mountain of obsolete information that Adient had inherited from its parent company;
  • Make information access and maintenance easier and more intuitive;
  • Change the paradigm for how the company communicates to employees;
  • Change the dynamics of how employees exchanged information and collaborated, breaking down traditional barriers to information exchange by using a more social model.

The portal transformation team designed a way to gradually transform the portal in real time, rebuilding it while maintaining access to current information.  The transformed digital workspace would:

  • Disseminate “official” company announcements using social tools rather than email or static content;
  • Delineate spaces for all-company content vs. team-based content;
  • Align content access interfaces with how users look for information;
  • Align content maintenance interfaces with content owners;
  • Enable contextual, search driven navigation to simplify the overall portal architecture and move users to target content quickly;
  • Include refreshed – and dramatically simplified – content about company functions and tools.

A New Paradigm for Internal Communications

Instead of using traditional static tools for internal communications such as long form news articles and email announcements, the portal transformation team shifted toward new uses for traditional social tools.  The company’s social network (in this case Yammer) was redeployed in the service of traditional internal communications by putting in place two innovations:

The employee communications “avatar.” The portal transformation team created an employee communications avatar within Yammer and used the avatar for official announcements from the company.

Social network style employee announcements were now available both on the portal home page and via the Yammer app.  The announcements were shorter, more immediate, and easier to scan; they allowed for employee comments and “likes,” and were available through a variety of user interfaces, both desktop and mobile.

Internal “twitter-style” leads for traditional articles. The team replaced the traditional static “news quilt” on the portal home page with a news stream that essentially served as a rolling in-house Twitter feed. Leads for traditional featured news articles or videos were linked to the full versions of the item, again with access from more interfaces and with more mobility and scannability.

Realigned Search Scopes

To improve search results in the short term, search scopes were adjusted away from the default “index everything” approach, to a more finely tuned set of search results that returned more reliable content.  The revised search scopes:

  • Separated all-company portal content from team content;
  • Focused on destination content rather than landing or index pages;
  • Pointed to content that was actively curated by content owners;
  • Used “best bets” to match selected content to specific tasks.
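The actual scoping configuration is platform-specific, but its effect can be sketched generically. Everything in the sketch below, the index entries, the `BEST_BETS` mapping, and the field names, is hypothetical and stands in for whatever the search platform actually stores:

```python
# Illustrative sketch (not a real search-platform API): a revised scope
# filters the raw "index everything" result set down to curated,
# destination, all-company content, with best bets served first.
index = [
    {"url": "/hr/policies/leave", "curated": True, "kind": "destination", "team_site": False},
    {"url": "/hr", "curated": True, "kind": "landing", "team_site": False},
    {"url": "/teams/eng/notes", "curated": False, "kind": "destination", "team_site": True},
]

# Hypothetical task-to-content mapping for "best bets".
BEST_BETS = {"book leave": "/hr/policies/leave"}

def scoped_results(query: str, items):
    # A matching best bet is pinned to the top of the results.
    hits = [{"url": BEST_BETS[query]}] if query in BEST_BETS else []
    # Then only curated destination pages outside team sites are returned.
    hits += [i for i in items
             if i["curated"] and i["kind"] == "destination" and not i["team_site"]
             and (not hits or i["url"] != hits[0]["url"])]
    return hits

results = scoped_results("book leave", index)
```

Note how the landing page and the uncurated team content drop out entirely: narrowing the scope is what made the smaller result set more reliable.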

Transforming Traditional Portal Content

To transform old style portal content for the full range of company functions, the portal transformation team created three tracks of work, namely:

  • Design and implement a common information infrastructure as a foundation;
  • Segment and prioritize the work to gain control over the tasks ahead;
  • Extend the infrastructure to each functional area and content owner via a standardized process.

Information Infrastructure

A common information infrastructure formed the foundation of portal transformation.  The infrastructure included enterprise metadata and content types, common search facets, and a patterned approach for portal design that could be repeated for each functional area of the portal.

Search driven navigation at each functional level dramatically simplified the page hierarchy and reduced the level of effort needed to transform each functional area.  Each area was now a “content engine,” designed to consume content from content owners and serve content to users based on browsable search facets.  Content owners were now focused on maintaining content rather than on managing an individually designed portal site.

The enterprise metadata, content types, and search refiners were extended and deployed to each functional area as it was addressed.  The common page hierarchy followed the pattern in the diagram below:

(Diagram: common page hierarchy pattern, not reproduced here.)
Prioritized Approach

Not all functional areas of the company could be addressed at the same time.  

Prioritizing and then ordering the individual functional areas allowed the portal transformation team to transform the portal one functional area at a time. Before engaging with the individual content owners, the portal was segmented by functional area and then prioritized across three criteria:

  • End user value and importance;
  • Relative size and complexity;
  • Readiness of individual content owners.

High value, low complexity areas of the portal were identified using a magic quadrant and then further prioritized by considering readiness of individual content owners to engage in the process.

The result was a prioritized listing of legacy portal sites that began with high value / low effort sites where the content owner was ready and willing to undertake the transformation process.
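A rough sketch of that prioritization step, with invented scores and site names rather than Adient's actual data, might look like this:

```python
# Hypothetical magic-quadrant-style prioritization: sites are scored for
# end-user value and complexity, filtered by content-owner readiness,
# and ordered high value / low complexity first.
sites = [
    {"name": "HR", "value": 9, "complexity": 3, "owner_ready": True},
    {"name": "Legal", "value": 8, "complexity": 8, "owner_ready": True},
    {"name": "IT", "value": 9, "complexity": 2, "owner_ready": False},
]

def prioritize(sites):
    # Only sites whose content owner is ready to engage enter the queue.
    ready = [s for s in sites if s["owner_ready"]]
    # Sort by value descending, then by complexity ascending.
    return sorted(ready, key=lambda s: (-s["value"], s["complexity"]))

queue = prioritize(sites)
```

In this toy data, IT would score highest on the quadrant but is deferred because its owner is not ready, which mirrors the readiness criterion described above.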

Rolling and Repeatable Process

For the actual conversion of legacy portal content to the new digital workspace model, the portal transformation team established a process that could be repeated again and again with each new functional area that was addressed.  The end-to-end process was highly regimented and could be staggered with different functional areas at different stages in the process at the same time.  For a functional area with an average amount of content (e.g. between 100-200 information items) the entire process from engagement to launch took about 6 weeks.

Here are the individual process steps:

  • Engage with content owners to describe the process;
  • Guide content owners through a cleanup of their legacy content, identifying content that should be archived, the content that should be moved to the transformed portal space, and the content that should be moved after modification;
  • Extend the enterprise metadata and content types to accommodate the new function;
  • Build out standardized information and document repositories for that function;
  • Migrate refreshed content into the new repositories;
  • Build contextual landing and search results pages;
  • Launch and announce the transformed functional area.

Conclusion and Results

The transformation work at Adient is ongoing.  Even with this highly structured and repeatable approach, changing an old-style employee portal into a more agile and flexible digital workspace takes time.  Structuring the work in this way, using standards and a repeatable process, creates a sense of momentum and continuous improvement that is obvious and observable by employees throughout the company.  Portal transformation is no longer a chore that runs in the background; it is a company-wide initiative that is changing the way that Adient works.

Taking an Agile Approach to Adoption

October 25, 2017

Ensuring the adoption of new knowledge management programs, systems, and tools requires thorough planning well in advance of actually launching a new initiative. It also takes an agile approach to designing your solution so that you can adapt what you deliver based on what your employees truly need to help them get their job done.

In this presentation, you’ll learn how to develop, refine, and execute the following critical plans, which will ultimately maximize employee engagement with the “new way of doing things.”

  • Charter and Project Plan
  • User and Stakeholder Analysis Plan
  • Communications and Change Management Plan
  • Training Plan

Based on her experience managing a successful initiative to design and implement a content management tool for the communications department of a large manufacturing organization, Mary Little will share step-by-step guidelines for improving the adoption of your knowledge management solutions.

Mary gave this presentation to the October 16-20 CKM Class in Tysons, VA.

Please click here to view her full video presentation...