Evaluating informal learning – some more thoughts

Yesterday I started talking about how we might evaluate informal learning, and I suggested that there are a number of steps or processes we would need to implement if we seriously wanted to get meaningful metrics out of the informal learning occurring within our workforce.  I suggested the following list of steps, which might make up the process of evaluation:

  1. Skills Outline for each role type within the organisation,
  2. Assessment of Staff member against skills outline – there are a range of options here, but I think there has to be self-assessment plus manager assessment at the very least,
  3. System for capturing staff informal learning activities,
  4. Regular (6-12 Month) updates of Staff skill assessments,
  5. Data capture of changes in skills levels across the organisation,
  6. Method of mapping skills changes to competencies, and
  7. Methodology of converting skills changes to organisational metrics and ROI.
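The steps above could be sketched as a minimal data model. This is purely illustrative: the class names, the 1–5 rating scale and the treatment of the combined self/manager rating are my assumptions, not a description of any real system.

```python
from dataclasses import dataclass

@dataclass
class SkillsOutline:
    role_type: str
    skills: list[str]          # e.g. ["communication", "closing sales"]

@dataclass
class Assessment:
    staff_id: str
    date: str                  # ISO date of the assessment round
    ratings: dict[str, int]    # skill -> 1..5 (self + manager combined)

def skill_changes(baseline: Assessment, update: Assessment) -> dict[str, int]:
    """Step 5: change in each skill level between two assessment rounds."""
    return {skill: update.ratings[skill] - baseline.ratings[skill]
            for skill in baseline.ratings}

baseline = Assessment("s001", "2013-01-01", {"communication": 2, "closing sales": 3})
update   = Assessment("s001", "2013-07-01", {"communication": 3, "closing sales": 3})
print(skill_changes(baseline, update))  # {'communication': 1, 'closing sales': 0}
```

Steps 6 and 7 would then hang off these change records, which is where the hard part begins.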

Today I want to look at some options and some other ideas around the items on this list.  I think there are a number of key issues here: firstly, establishing a baseline skill level; secondly, measuring changes from that baseline; and finally, taking the data relating to those changes, at either an individual or an organisational level, and translating it into something meaningful.  As I said yesterday, there are a number of ways in which that initial baseline data can be captured, however the more I think about it the clearer it becomes that there is a definite need to have robust skill groups determined for each role type within an organisation.  These skill groups are entirely separate from any particular position description and are tied to role types and levels rather than to specific staff or positions.  The level of granularity required will have to be determined by each organisation, but my thinking is that for most organisations there would be general role types to which skill groups could be attached.  The skills contained in these groups would also have some similarity through the hierarchy of the organisation; everyone needs to be able to communicate, for example, but the level of skill expected may well differ.

Once we have the skill groups and have assessed staff members against their relevant group (through whatever method we choose), we then need a process for regularly assessing fluctuations in each individual's skills within their skill group.  Again, it is going to be up to each organisation how robust they make these assessments.  At the lowest end would sit, I think, a 6-12 month self-assessment by staff of where their skills now sit.  At the highest end would be some form of regular, controlled, formalised testing process which provided solid evidence of changes in key skills.  If we were thinking about, say, desktop applications such as Word, staff members could at set intervals be required to undergo a standardised external test of their skills and knowledge, which should give solid, meaningful data on changes to skill levels.

At some point, however, we are going to have to decide what this data means at an individual level, an organisational level, or both.  It is here, I think, that the real challenge may lie.  Consider the following example: an organisation has been capturing data on its staff skill levels across the organisation, and has also been providing staff with access to, and encouragement to learn through, informal methods.  After 12 months it sees an average rise in skill levels across the organisation of 5%, and over the same period a rise in sales of 3%.  Can we make a connection between the two?  Is there any correlation between the increases?

At this point, I think you would be hard pressed to make much out of it.  It may even be difficult to make the connection between the informal learning and the rise in skill levels.  Why?  Simply because unless all staff have been doing is engaging in informal learning, with no formal activities, coaching, mentoring, supervision and so on, we are going to need a process for separating out the gains made from formal learning and the gains made from informal learning.  At this point we might be tempted to fall back on something like 70:20:10 and say, well, 70% of learning is informal, so 70% of the skills increase is due to informal learning, namely a 3.5% increase.  I think that I, and a lot of other people, would find an argument like that somewhat less than satisfying.  It could equally be the case that it was the 10% formal learning component that produced the entire 5% rise in skill, and that informal learning did not contribute to the skills increase at all.
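That naive attribution can be written out in a couple of lines, which I think makes its arbitrariness plain. The 5% figure comes from the example; the 70:20:10 split is exactly the unevidenced assumption in question.

```python
# Naive attribution: take the overall skill rise and split it by the
# 70:20:10 ratio. The ratio is the assumption being criticised above.
total_skill_rise = 5.0   # % average rise across the organisation
ratio = {"informal": 0.70, "social": 0.20, "formal": 0.10}

attribution = {source: round(total_skill_rise * share, 2)
               for source, share in ratio.items()}
print(attribution)   # {'informal': 3.5, 'social': 1.0, 'formal': 0.5}
```

The calculation is trivially easy to do and tells us nothing about which learning actually produced the gain.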

Even if we can make a strong case that informal learning contributed to the skills increase, we still have to make the case that the skills increase contributed to the sales increase, and unless we can show a link between the learning, the skills and the increase in sales, that is going to be difficult.  Here I think the answer lies in the level of granularity we apply to the data we are collecting.  If we can show that members of the sales department accessed a range of informal learning resources all related to closing sales, that their skill level at closing sales went up (by whatever means of assessment we are using), and that the overall sales figures for that department then increased, we might have a strong enough case to suggest that the informal learning the staff did had a correlative effect on the sales figures.
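As a rough illustration of what that granular check might look like, here is a simple Pearson correlation between per-salesperson counts of informal resources accessed and changes in their closing-sales ratings. All the figures are invented, and even a high r here would only suggest, never prove, causation.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-salesperson data: informal resources accessed over the
# period vs change in their "closing sales" rating.
resources_accessed = [2, 5, 8, 3, 9, 6]
skill_change       = [0, 1, 2, 0, 2, 1]
print(f"r = {pearson(resources_accessed, skill_change):.2f}")  # r = 0.98
```

The point is that this kind of check only becomes possible if the data is captured at the level of specific skills and specific teams, not as organisation-wide averages.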


Evaluating Informal Learning

As some of you know, the problem of capturing the organisational impact of, and the increases in competency (if any) gained by, staff through their informal learning has occupied my thinking for some time now, and I have posted and spoken about it on a number of different occasions.  I really got to thinking about it again after the recent LearnX conference, particularly after some stimulating conversations with Con and Saul, amongst others.  The problem for me is that even if you don't believe the numbers in 70:20:10 (which I don't), there is still a lot of informal learning that happens in a person's life, and at least some of that learning has to relate to their job role.  Before I go on, however, I want to set some boundary conditions on what I am talking about here, and that is informal learning that has some impact on the day-to-day operation of the organisation.  If you choose to go off in your own time and, as one of my colleagues loves to call it, study underwater basket weaving, I am really not interested in the fact that you have done that, unless you can show me some tangible link to your day-to-day work.

So for me there are two types of organisational informal learning: learning that simply increases, builds or improves a skill, and learning that does that and in addition provides some ‘formal’ recognition pathway for the learner.  When I think about this, however, it is more the job role or the organisational imperative that moves us towards the path of recognition, rather than, in general, the needs of the staff member.  To be clear, when I am talking about recognition I am talking about formal recognition, where there is some ‘qualification’ style outcome as a result of the learning – in an Australian sense, usually a unit of competency, for example.  I am not talking about badges and other such devices for capturing learning, be they peer-reviewed or otherwise, because from an organisational point of view, in my opinion, they are (at least currently) meaningless.

Why do I say this?  I have often recounted the story of being asked, as part of a formal investigation into an incident involving a member of staff, “how did you know this person was competent?”  Now if my answer had been “well, he has a badge for it”, I think I may have gotten a much different reaction to saying, as I did, “Well, they have completed all of the assessment tasks, including a third-party observation, necessary for us to be satisfied that they were competent under the rules of evidence set out in the legislation pertaining to the operations of Registered Training Organisations.”

Not all informal learning, though, is going to lead a staff member to a qualification.  Some of it is not related to, or captured by, the range of qualifications available; some simply adds to the skill set they already have, making them better at their role but not providing them with a new skill.  So for me there are a number of things we need to know in order to begin to evaluate informal learning, and I am indebted to Saul Carliner for some of his thinking around this:

  1. A baseline – what is the staff member’s current skill level,
  2. What they have learned,
  3. How they have learned it,
  4. New skill level,
  5. Is there a competency attachable to the learning,
  6. Effect of learning on organisational metrics – reduction in customer complaints, fewer injuries etc., and maybe
  7. Return on Investment?

Now you might look at that list and say, well, isn’t that what we would want from any kind of organisational learning process, and in fact isn’t that exactly the kind of data we want from our formal processes?  I don’t see why we should be treating the outcomes of informal learning differently to how we would treat formal learning.  Now, it may not be relevant, we may not be able to, or may not want to, capture the standard ‘smile sheet’ satisfaction-style data that we collect from formal training, and yes, the nature of the learning – pull not push, driven by the individual, just in time etc. – makes the process of learning different.  However, when we look at it from an organisational point of view, aren’t we looking for the same thing as with formal learning?  We are looking for an increase in the skill level of the staff member, such that the increase in skill will have an effect on the relevant workforce metrics that relate to their role in the organisation.  If we aren’t looking to improve the skills of our staff and the organisation as a whole, why are we investing in informal learning systems in the workplace, and why has it become so important?

So what are some ways in which we can achieve this?  If we start with the idea of a baseline, we might be able to sort out some structure and processes around it.  So, where might a baseline come from:

  • Position Description,
  • Performance and Professional development plans,
  • Self Assessment,
  • Formalised Assessment, or
  • Job skills analysis plus a rating system.

There are a range of ways in which we can establish this baseline, but how can we do it without it being onerous on everyone involved?  If we use position descriptions as our starting point, we have the problem that they often don’t contain enough detail, are not skills-based, or that we don’t assess the person against them in a formal enough way to give us any real data to work with in the first place.  They could be coupled with self-assessment and direct manager assessment to give a fuller picture of the skill set and levels of an individual staff member.  PPD plans can be seen in the same light: to make them more useful in terms of presenting us with a baseline, we need to capture more granular detail about the role and the staff member’s skills relating to that role.
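As one hypothetical way of combining two of the sources above, a per-skill baseline could be a weighted blend of the self-assessment and the direct manager's assessment. The 50/50 weighting and the 1–5 scale are assumptions for illustration only, not a recommendation.

```python
def combined_baseline(self_ratings, manager_ratings, self_weight=0.5):
    """Blend a self-assessment with a manager assessment, per skill."""
    return {
        skill: round(self_weight * self_ratings[skill]
                     + (1 - self_weight) * manager_ratings[skill], 1)
        for skill in self_ratings
    }

self_view    = {"communication": 4, "report writing": 3}
manager_view = {"communication": 3, "report writing": 3}
print(combined_baseline(self_view, manager_view))
# {'communication': 3.5, 'report writing': 3.0}
```

A gap between the two views (as with "communication" here) is itself useful data: it flags a skill worth a conversation before any baseline is locked in.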

So I have a bit of a rough process around my thinking in relation to this, and it goes something like this:

  1. Skills Outline for each role type within the organisation,
  2. Assessment of Staff member against skills outline – there are a range of options here, but I think there has to be self-assessment plus manager assessment at the very least,
  3. System for capturing staff informal learning activities,
  4. Regular (6-12 Month) updates of Staff skill assessments,
  5. Data capture of changes in skills levels across the organisation,
  6. Method of mapping skills changes to competencies, and
  7. Methodology of converting skills changes to organisational metrics and ROI.

So, as I said, that is my thinking currently.  I would be really interested in getting everyone’s feedback, so feel free to chime in and let me know what you think.

Running Training Like a Business

I am a big believer that organisational training should be run like a business. Thinking of L&D units as simply cost centres devalues the work being done.

NIIT Managed Training Services

Watch Edward Trolley, Vice President of NIIT’s Managed Training Services and co-author of the bestseller “Running Training Like a Business”, talk about helping customers run learning like a business.


Has L&D become its own worst enemy?

I was reading Ryan’s blog post on Face time in online learning this morning, and it got me thinking about how sometimes I feel that, as L&D professionals, we have shot ourselves in the foot, so to speak, in terms of our relevance to organisations.

I was responding to a comment from Con about the fact that organisations and management often balk at the costs associated with face-to-face delivery, or with having an informal meetup of participants, and I started to think that perhaps in our rush to embrace and ‘sell’ online learning to the business, a lot of the time on the grounds that it is more cost-effective than face-to-face delivery, we may have made a rod for our own backs when we wanted or needed to include face-to-face components.

It goes a little deeper than that, however, and I think back to LearnX, some of the conversations I had there, and in particular a presentation by Saul Carliner in which he said that 70:20:10 is not an investment strategy.  (As most of you know, I am not a 70:20:10 believer; there is no rigorous evidence to back up the numbers.  Yes, a lot of learning is informal, but putting numbers around it when there is nothing but anecdotal evidence to substantiate them casts us all in a bad light.)  Even though it isn’t an investment strategy, it is often ‘sold’ that way, even if the selling is unintentional.  If you talk to the business about 70% of all workplace learning being informal, and say that we need to invest in technology to ensure our staff have access to this avenue of learning, promote it, and link it to our talent management, retention and development strategies, then it starts to sound like an investment strategy.  The problem is that I am not sure I know too many organisations that are going to increase their learning budgets by 70% to incorporate informal learning; in fact, I think the money is probably going to come from the L&D budgets that are already there, which will of course have an impact on our ability to deliver the formal training that the organisation needs.

Hmm, perhaps if we move all of that formal training online and don’t worry about face-to-face, then we can free up budgets to increase our staff’s access to informal learning – and there we have it, we have shot ourselves in the foot again.

I see it everywhere the more I think about it.  Formal qualifications through the VET sector are devalued because we chase funding for their delivery to make them free or heavily subsidised for staff and the organisation, and then when the organisation wants staff to do a qualification that is not funded, they choke when they see the price tag.  Other training is reduced to online-only because of the cost savings associated with delivery (and don’t get me wrong, online delivery can certainly be a huge cost saving), even where having just a half day of face-to-face time would greatly improve the outcomes for staff and the organisation.

Now I am not suggesting that everything go back to face-to-face and that L&D should get massive increases in budgets (although that would be nice, wouldn’t it).  What I am saying is that if we spend all of our time talking about the next big thing and the new way of learning, if all we talk about is how cost-effective online delivery is and how informal learning is the way of the future, can we really then be surprised when the business turns back to us and replaces the L&D unit with free MOOCs from the cheapest provider?

Face time

Interesting reflection piece from Ryan, about the value of face time in online learning.

E-Learning Provocateur

My wife and I are studying Foundations of Business Strategy together. And by “studying”, I mean we are watching the videos – which are excellent, by the way.

My wife is a marketing professional, while I’ve been in the corporate sector for most of my working life, so we find ourselves regularly pausing the videos and launching into conversation about what was said. And it’s great!

I’m learning from her, she’s learning from me, and we’re both learning from the professor. Much more so than if either of us were studying alone.

A Moai in a field on Rapa Nui

Of course, peer-to-peer interaction isn’t a novel concept in e-learning. We have asynchronous tools such as online discussion forums, synchronous tools such as instant messaging, and semi-synchronous tools such as Twitter.

To add voice to the conversation we can use teleconferencing or VoIP. To add faces we can use webcams and maybe, one day, holograms.

But what strikes…


Stop doing Training for the sake of Training – and stop funding it as well

So, as some of you are already aware, I attended the first of the QLD Government’s Industry Skills Forums today.  Firstly, I am going to say that if you get the opportunity to attend one of these forums (and apparently there will be more to come), you should – if for no other reason than to ensure that you know what is going on.

The morning was hosted by Brett Schimming from Construction Skills Queensland, and I will come back to what Brett said a little later.  Assistant Minister Saxon Rice spoke, outlining the government’s position on training and TAFE.  The keywords were:

  • Engagement,
  • Accessibility, and
  • Quality

with the main engagement piece being the creation of the Ministerial Industry Commission, an independent body providing advice directly to the Minister for Education.  Assistant Minister Rice, and everyone else who spoke, pointed out quite strongly that this would not be a representative commission.  It would not be a table around which all of the industry groups, sectors and other interest groups sat.  Its purpose would rather be to look at the evidence around training and employment needs in the state and, on the basis of that evidence, advise the Minister, in particular on skills and workforce development priorities.

So where will that evidence come from?  From the various sectors, industries and other stakeholder groups, through consultation with and submissions to the commission, which will then utilise that along with a range of other data to decide on priority occupations and other workforce development needs.

The biggest takeaway, if you will, from the morning came from Brett, when he said, and I am paraphrasing here a little (sorry Brett, feel free to correct me if I have got it badly wrong), ‘the VET system is not the main game, it is not the centre of the universe for business; it is the benefits derived from training, not the training itself, that is important.  We need to stop doing training for the sake of doing training.’

This position ran strongly through everyone’s talks and hits the nail on the head, at least in my opinion.

There are too many RTOs out there who continue to say that they can’t stay in business because the government has changed the funding model.

It is not about you (or us as an RTO); it is about the industries, the organisations, the businesses and the individuals who derive value from the training.  Training for the sake of training (at least funded training in the VET sector, where there are supposed to be employment outcomes) provides very little benefit to industry, organisations or the individuals who utilise it.  Giving someone a Diploma of Management just because there is government funding available to do it (and trust me, that is the pitch of almost every RTO that has cold-called me in the last 2 years) is pointless unless there is going to be some tangible benefit derived from that training – not one of these oh-so-common increased productivity calculations that are nothing but trumped-up nonsense.  There must be strong, evidence-based reasons for the funding of training: we should be able to show what the benefits to the business or individuals are in terms of employment, productivity or workforce participation, and we should have strong and robust evaluative systems that allow us to actually show this value.

If training is not strongly linked to an actual employment outcome (and let’s be serious, is a personal training certificate really an employment outcome when I can’t walk to the train station without tripping over people currently doing the qualification?), then why should it be funded?

LearnX 2013

Rather than blog about LearnX directly, I am just going to be lazy 🙂 and reblog Helen’s observations. My one observation is that my standout for the conference was Professor Saul Carliner, who was more than happy to challenge some accepted myths, and who also had a range of interesting and insightful things to say.

Activate Learning Solutions

It’s not everyday that you get the opportunity to go to a learning conference with your colleagues. I jumped at the chance when I was asked as they were going to receive an award for the best accessible eLearn course at LearnX.

As it was in Sydney, a city I don’t like to visit often, it was more like an adventure for me. I like to see what other companies are doing in learning and development as well as meet more of my personal learning network – maybe even win an iPad mini too.

This year’s LearnX conference seemed smaller than the one in Melbourne last year. Sometimes I go to these expecting huge turnouts, much like DevLearn, and I’m slightly disappointed. I have to remind myself that our market is a lot smaller.

Our first guest speaker was Dr Gary Renker who spoke about the evolution of coaching…

