Posts in Category: Learning and Performance

General posts about Learning and Performance topics

Cereal Killer App

Remember Sugar Bear? He was the coolest of the early-70s cartoon cereal ad mascots, way cooler than the Trix rabbit or the Froot Loops toucan. At Cisco TAC Bootcamps we adopted him as our guru, and I code-named this one project Super Sugar Crisp, mainly so I could hear serious network engineers refer to the thing by that name.

The Problem

In 2012, Cisco’s Technical Assistance Center (TAC) was a huge operation within the Service and Support organization, which accounted for roughly a third of Cisco’s overall revenue. As with most call centers, support calls were initially handled by Level 1 engineers and escalated to Level 2 if those engineers couldn’t resolve the issue. L2 escalations were carefully tracked and cost-accounted, and at Cisco’s scale, any reduction in the number of escalations would have a huge bottom-line impact.

TAC engineers had access to a massive amount of support and training content, but it was distributed across a number of legacy sites managed by a variety of teams. This included support content like user and administrator guides, training content, data sheets, New Product Introduction (NPI) videos and presentations, release notes, software version lookups and more.

Finding the appropriate content in a timely manner while a customer was on the line was nearly impossible, as all of these sites were organized for browsing rather than searching.

The Solution

Instead of trying to re-aggregate all this content under a single system, I designed and developed what amounted to a federated search engine tied to a metadata database with URL references to each content resource, document or file. Working primarily with Josh McCarty under Steve Roche and his boss John Rouleau, we scraped 15 or so source repositories and built the metadata database using MySQL. Metadata included summary description, URL, publish date, product and product family, version numbers, content type and more.

Then we built the back-end application using an object-oriented, MVC-style architecture on PHP’s Zend Framework and PEAR, with Zend Lucene as the search engine. Lucene is incredibly powerful for an easy-to-implement, open-source search engine. We re-crawled the database and source repositories weekly to keep the index current.
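As a rough illustration of the metadata-database approach, here is a minimal Python sketch using SQLite in place of MySQL and a plain SQL filter in place of the Lucene full-text query; the table name, columns, and sample row are my assumptions, not the original schema.

```python
import sqlite3

# Hypothetical metadata table modeled on the fields described above
# (summary, URL, publish date, product/family, version, content type).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE content_metadata (
        id             INTEGER PRIMARY KEY,
        title          TEXT,
        summary        TEXT,
        url            TEXT,
        publish_date   TEXT,
        product        TEXT,
        product_family TEXT,
        version        TEXT,
        content_type   TEXT
    )
""")

# One invented row standing in for a scraped repository entry.
conn.execute(
    "INSERT INTO content_metadata "
    "(title, summary, url, publish_date, product, product_family, version, content_type) "
    "VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    ("Nexus 7000 Release Notes", "Known issues and fixes",
     "https://example.com/n7k-notes", "2012-06-01",
     "Nexus 7000", "Data Center Switching", "6.1", "release_notes"),
)

# A keyword search scoped by product family, standing in for the
# Lucene query the real system ran against the index.
rows = conn.execute(
    "SELECT title, url FROM content_metadata "
    "WHERE product_family = ? AND summary LIKE ?",
    ("Data Center Switching", "%issues%"),
).fetchall()
print(rows)  # [('Nexus 7000 Release Notes', 'https://example.com/n7k-notes')]
```

The point of the design is visible even in this toy: every result is just a pointer (URL) plus searchable metadata, so nothing ever has to be migrated out of its source repository.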

The front end used the Dojo JavaScript framework to display results (jQuery was just coming into fashion), as this allowed us to integrate easily into the Cisco intranet. The back end exposed a simple RESTful API that returned results as JSON, which was easily parsed into Dojo widgets.
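To make the API exchange concrete, here is a hypothetical sketch of the kind of JSON payload involved, parsed in Python for brevity; the field names and structure are my assumptions based on the metadata fields above, not the actual API contract.

```python
import json

# Invented search-result payload; field names mirror the metadata
# columns (title, url, content_type, publish_date) described earlier.
payload = json.loads("""
{
  "query": "nexus 7000 release notes",
  "results": [
    {"title": "Nexus 7000 Release Notes, 6.1",
     "url": "https://example.com/n7k-notes",
     "content_type": "release_notes",
     "publish_date": "2012-06-01"}
  ]
}
""")

# Each result maps onto one front-end widget; here we just flatten
# the fields a widget would render into a single display line.
for hit in payload["results"]:
    print(f'{hit["content_type"]}: {hit["title"]} -> {hit["url"]}')
```

Because the payload is plain JSON over REST, the front end stays decoupled: any widget framework (Dojo then, anything now) can consume the same endpoint.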

The Results

Within the first six months of deployment the system was in widespread use by the TAC, with L2 escalations reduced in some areas by as much as 40%. Steve ran the numbers and found we were saving the overall TAC around $20M on an annualized basis. He submitted the project for a Cisco Pinnacle Award from Ana Pinczuk’s TAC organization under Joe Pinto, and we won in early 2013.

How Learning drives value: Kirkpatrick meets 4DX

So how do you know that your Learning and Development efforts are paying off?

I had a conversation with a Sales Operations team a while back, asking whether they saw any correlation between the sales training that was being required (and tied to comp!) and actual sales performance. They said they’d never been able to show even a correlation between learning and actual results.

So is sales training a complete waste of effort and resources?

This is the double-edged sword we swing as L &amp; D professionals when we try making the business case for learning. Part of the problem is trying to correlate learning “completions” directly with sales bookings, deal size or other metrics that drive the business. It’s too large a leap.

A possible solution is to break the problem down, and this is where The Four Disciplines of Execution (4DX) comes in. If you are not familiar with this book or methodology, you can check out a quick introduction by author Chris McChesney on YouTube.

The Second Discipline, the discipline of Leverage, tells us that for every Wildly Important Goal there are “lead” activities we control that can drive, influence or predict the “lag” results (those results beyond our direct control that we only know about after they occur, like quarterly sales numbers). By focusing on the leads (and measuring both the leads and the lags) we can clearly see our impact on the outcomes. This assumes, of course, that the lead activities actually determine, or at least predict, the lag results.

For the Learning professional, what are the leads and lags? And how do these translate to the leads and lags for our target audiences? By splitting Learning Leads and Lags from Performance Leads and Lags we get the following matrix, which maps nicely to Kirkpatrick’s Evaluation model:

                 Lead                          Lag
    Learning     L1: Reaction (evaluations)    L2: Learning
    Performance  L3: Behavior on the job       L4: Business results

Working backwards from L4 in the lower right quadrant, we need to show that performance of the selected Lead measures (L3) actually drive, influence or predict the L4 business results. This is often the job of Sales Operations, using statistical correlation or regression analysis to show the connection.
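As an illustration of that correlation check, here is a toy Pearson-correlation sketch in Python; the lead/lag data are invented for the example, not drawn from any actual Sales Operations analysis.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-rep numbers: a lead activity (e.g. discovery calls
# completed per quarter) against the lag result (bookings, in $K).
lead = [12, 15, 9, 20, 18, 14, 22, 17]
lag = [310, 390, 250, 520, 470, 360, 560, 440]

r = pearson_r(lead, lag)
print(round(r, 2))
```

An r near 1.0 would support treating the lead activity as a predictor of the lag result; in practice Sales Operations would use regression on far larger samples, but the logic of the check is the same.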

We can use the same tools in the Learning row to show that L1 evaluations of courseware quality, instructor prep and delivery, and applicability of the content to actual job duties drive L2 learning outcomes and improvement.

We only need to show that L2 drives, influences or predicts L3 performance lead metrics to indicate the value to the business. Our mistake has been trying to show a relationship between L1 and L4 directly without considering the intervening levels. 

Performance is the key pivot point here: I’ll have more to say on the actual methods for determining these connections between Leads and Lags, and from Learning Lags to Performance Leads in future posts.

16 questions to help guide Learning & Performance strategy

Over the last several years, as I’ve moved from individual contributor to manager and director-level roles, I’ve had to think more strategically about what learning and performance mean in the context of overall organizational growth. These ideas have been forged in the last five years in late-stage, high growth startups. In these companies, L & P strategy is often an afterthought, or “something that HR does.”

The following is a condensed list of my strategic planning questions:

  1. What would a 1% increase in Sales productivity (or customer retention) mean to your bottom line? 2%? 5%?
  2. Do you know whether your investment in learning is paying off? How do you (or would you) know?
  3. What’s the best way to invest your limited L & P budget?
  4. How do you measure the 90% of learning that happens outside formal e-Learning, virtual and classroom events?
  5. How are you embedding learning into individual work streams?
  6. Are you asking the right questions in your course evaluation surveys?
  7. What is the shelf-life of your e-Learning content? How are you updating and maintaining it?
  8. Do you have a learning content governance model? What is the value-add of quick access to accurate content?
  9. Do you have a job-role competency model? Is it used across recruitment, talent management, performance and learning systems?
  10. Which stages of ADDIE do you skip or gloss over?
  11. What are your top performers doing better, differently, more of or less of than average performers?
  12. Do you use proficiency levels and rubrics to guide both assessment and development of individuals?
  13. How is performance improvement built into your organization’s day to day work?
  14. How are you leveraging your learning resources for content marketing?
  15. How is your L & P strategy aligned with overall company goals and objectives?
  16. Is training part of your product development, go to market and commercialization planning?
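For question 1, a back-of-the-envelope calculation shows why even small productivity gains matter; the revenue and margin figures below are invented for illustration, not figures from any real company.

```python
# Hypothetical inputs: $50M in annual bookings at a 20% contribution margin.
annual_sales = 50_000_000
margin = 0.20

# Bottom-line impact of each productivity uplift from the question.
impacts = {uplift: annual_sales * uplift * margin for uplift in (0.01, 0.02, 0.05)}

for uplift, extra in impacts.items():
    print(f"{uplift:.0%} productivity gain -> ${extra:,.0f} to the bottom line")
```

Even the 1% case yields a six-figure annual impact under these assumptions, which is the kind of framing that turns an L & P budget conversation from cost to investment.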

In the posts that follow I’ll tackle some of these questions in more depth, to get at the critical considerations for each and their relationships and cross-dependencies.