Measuring outcomes in Vocational Education – Are we doing enough?

With the deadline for Total VET Activity reporting fast approaching, my mind has turned to how we evaluate outcomes in Vocational Education in Australia, or at least to the data we collect nationally that seeks to give us a picture of the Vocational Education industry in this country.  Now, we all know and love the Learner Questionnaire and the Employer Questionnaire, but what are we really capturing here, and is it giving us any genuinely useful information?  Let's be honest: when we look at the questionnaires, even the employer one, they are really nothing more than simple level-one satisfaction (smile) sheets, and anyone who has been in the L&D industry for even a few weeks knows how resoundingly useless smile sheets are if you want to convince anyone of anything to do with training.  Now, I know these are only one source of data, but if I had used that sort of data to convince an executive team why they should invest in a learning program, I would have been laughed out of the boardroom.

If we look at the well-known Kirkpatrick model of evaluation, with its four levels

  1. Reaction.
  2. Learning.
  3. Behavior.
  4. Results.

realistically, all we are doing here is evaluating reaction.  We are not really looking at anything that comes close to the other three levels, at least not on any consistent, regular basis.  Now of course, someone could suggest that levels 2–4, and even level 5 (ROI) if we want to include it, are things that should be handled by the organisations sending their employees to be trained.  That is a fair point, and I would suggest that most organisations with L&D units are already doing this.  However, what about all the people undertaking training not at the behest of their employer?  They may be unemployed, casual, or wanting to change careers, and we do not seem to be collecting or analysing any higher-level data on them.  Nor do we seem to be asking those employers who are collecting robust data on the learning of their employees to share it.  Admittedly, in Queensland (and this may be the case in other states as well) there is some collection of employment outcomes for students undertaking funded programs, but if we are trying to build a better system, don't we need some real data to allow us to make decisions about what is working and what isn't?  (If we are collecting this data and someone knows about it and how we are doing it, please let me know.)  I come back again to the point that if I were to try to justify an organisational L&D budget on the kinds of data currently being collected on training, I would be told to go away and come up with some real figures.
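
To make that gap concrete, here is a rough sketch (my own illustration, with hypothetical field names, not any existing national data standard) of the kind of per-learner record we would need if we genuinely wanted data at all four levels rather than just reaction:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical per-learner evaluation record covering all four Kirkpatrick levels.
# Field names and structure are my own invention, for illustration only.
@dataclass
class LearnerEvaluation:
    learner_id: str
    qualification: str
    # Level 1 - Reaction: roughly what the current questionnaires capture
    satisfaction_rating: Optional[int] = None         # e.g. 1-5 smile-sheet score
    # Level 2 - Learning: pre- and post-training assessment results
    pre_assessment_score: Optional[float] = None
    post_assessment_score: Optional[float] = None
    # Level 3 - Behaviour: follow-up once the learner is back at work
    behaviour_change_observed: Optional[bool] = None  # e.g. supervisor check at 3-6 months
    # Level 4 - Results: outcomes that matter to the organisation or the system
    employed_six_months_later: Optional[bool] = None
    wage_change_percent: Optional[float] = None
```

Almost everything below level 1 in that record is data we are simply not collecting nationally at the moment.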

So what should we be capturing?  If we look at the Kirkpatrick model (I will talk about some other options later), we should at least be capturing whether or not the student actually learnt something: were the learning objectives met, and how much change has there been in their skills, knowledge or attitude?  Of course, to do this properly we need to collect data before and after training.  It could be suggested that, by virtue of the person going from not having a qualification to having a qualification (becoming competent), there is an increase in knowledge, but is this actually the case?  Has there actually been a change in the person's skills, knowledge or attitude?  With the data we are collecting at the moment we just don't know; we are making assumptions, and without good data those assumptions could very well be wrong.
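
To show what I mean, here is a minimal sketch of a pre/post comparison at level 2 (learning).  The learners, scores and field names are entirely made up; the point is only that the comparison is straightforward once the data exists.

```python
# Minimal sketch: measuring Kirkpatrick level 2 (learning) with pre/post assessments.
# All records and scores below are hypothetical, for illustration only.

learners = [
    {"id": "L001", "pre_score": 45, "post_score": 78},
    {"id": "L002", "pre_score": 60, "post_score": 62},
    {"id": "L003", "pre_score": 30, "post_score": 71},
]

def learning_gain(record):
    """Simple absolute gain: post-training score minus pre-training score."""
    return record["post_score"] - record["pre_score"]

gains = [learning_gain(r) for r in learners]
average_gain = sum(gains) / len(gains)

print(f"Average learning gain across the cohort: {average_gain:.1f} points")
# All three learners here finished 'competent', but the individual gains
# (33, 2 and 41 points) tell a very different story about how much
# learning actually took place.
```

Even this trivially simple comparison gives us more to argue with than a satisfaction rating ever could.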

What about whether or not we are seeing any changes in behaviour?  Are the learners putting what they 'learnt' into action in the workplace?  Are their behaviours changing as a result of the training that was delivered to them?  This is not something that can be answered with a smile sheet at the end of training, or an employer questionnaire with general questions about perceived improvement.  This is data that needs to be collected once the learner has been in the workplace for a while, once they have had time to adjust and integrate their new learning into their work environment.

Then of course there is the question of what the result was: did this training improve anything organisationally, at a state level, at a national level?  Was it good for the organisation, the country, the learners, the bottom line?  We need to figure out what areas we are seeking to see improvements in, and then track whether or not we are seeing those improvements.  Again, this is a step up in complexity in terms of the types of data needed and how we might collect it.

Finally, of course, there is the question of Return on Investment.  It lets us begin to answer whether the billions of dollars the government has handed out through VET FEE-HELP and other programs, and the money spent by business and organisations on top of that, has been worth it.  Have we got the return on that money that we expected, or did it just swirl away into a black hole where it didn't improve anything we would consider worth measuring?
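
The calculation itself is the easy part; it is the standard ROI formula used in L&D evaluation (often described as level 5, or the Phillips ROI methodology).  A minimal sketch, using entirely invented figures:

```python
# Minimal sketch of the standard L&D ROI calculation.
# All dollar figures below are made up, for illustration only.

program_cost = 250_000        # delivery, wages while training, administration, etc.
monetary_benefits = 310_000   # e.g. productivity gains, reduced rework, lower turnover

net_benefits = monetary_benefits - program_cost
roi_percent = (net_benefits / program_cost) * 100

print(f"Net benefits: ${net_benefits:,.0f}")
print(f"ROI: {roi_percent:.0f}%")   # 24% in this made-up example
```

The formula is trivial; the whole point of this post is that we are not collecting the data that would let anyone plug credible numbers into it.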

Is all of this easy to do?  No.  Am I suggesting that the reporting requirements on RTOs change so that they are required to capture a whole range of additional data over a timeframe long after students have finished their courses?  No.  Am I suggesting that perhaps we need a yearly survey, akin to the kind done by ATD (formerly ASTD) on the state of learning, which includes employers, organisations, training providers and learners and asks them some serious questions?  Maybe.  Am I suggesting we need to do more with this so-called big data that has been talked about so much?  Maybe.  I don't have a definitive answer; I just think we need to be doing more if we really want to capture the true outcomes of our Vocational Education system.

 
