Grassroots and Start-up Learning and Development

Today I thought I might give the world of Vocational Education a break and look at some issues more focused on corporate learning and development, in particular early-stage, or greenfields, corporate L&D. As many of you know, apart from trying to run RTOs and navigate the VET sector for more than a decade, I have also been heavily involved in the world of L&D, and in particular the world of, shall we say, grassroots L&D. So I thought I might share a few of the insights I have gained over that time.

Firstly, what is grassroots L&D? It's the L&D role where, in real terms, you are starting from scratch. This might come about for a wide variety of reasons: an organisational restructure has centralised learning functions and created a new L&D department; L&D has sat in just one part of the business and there is a need to make it organisation-wide; the organisation is relatively young or has undergone rapid growth, making L&D a focus; or, as happens in a lot of cases, L&D has been badly neglected and everything has essentially run down and virtually stopped.

This can be a frightening place for an L&D professional to find themselves. Usually we land in roles where structures already exist and the foundations are in place: training is being delivered, there is a team familiar with the business and its needs, there is a structure around budgets and finance, all of those things we expect to have in an L&D department. Often at the grassroots level, even in a larger organisation, you will find that the L&D team is a team of exactly one: you. So on top of managing, you may also be developing and delivering training, doing the administration, implementing technologies, and on top of that trying to recruit new staff to take the load off. The other pressure often present in these scenarios is the pressure to get things up and running as soon as possible.

It is this expectation of creating something relatively quickly that can cause heartburn for some L&D folk, primarily because we are used to having data, strategies, platforms and frameworks already in place to allow us to move forward. So what most people do is dive into developing their strategy and framework, start doing TNAs, audit compliance training and certifications, all those things we know we have to have in order to deliver meaningful learning experiences to our staff. This, however, could be a very costly mistake in terms of your longevity in the role.

Why? Because in most of these situations we are dealing with organisations, management teams and even boards who may not grasp the complexity of the L&D function. This is of real concern when the L&D team has been created because the business has discovered a gap, or in some cases a gaping black hole, which it needs to address, and address quickly. They often don't have time to worry about how things are going to be evaluated, for example; they just want them to work. Getting some kind of training started in a particular area may be far more important than making sure that training is totally aligned with the business plan and strategic goals. In these cases, getting all of your ducks in a row before you start may well leave you in a situation where you find yourself having to justify your achievements. Often there can also be competing agendas across the business, particularly when L&D has become a more centralised function instead of sitting within business units under the control of managers or GMs.

So what do we do? How can we deliver what the business needs, or thinks it needs, and still set ourselves up to move forward strategically at the same time? As many of you may already know, I have been a fan of training impact maps for a very long time. When I first saw one in Brinkerhoff's book High Impact Learning, it struck a chord with me as a useful tool for ensuring that what we deliver meets the needs of the organisation. They are incredibly useful in these greenfields-style situations where the business wants a solution but is not sure what that solution could be.

How does this work? It's really simple: get the business, or business unit, or even the board to fill the map out, with or without help from you, and then use the information it contains to create whatever intervention is necessary to meet the needs outlined. Here is a hint, though: if the business can't fill out these sections, then training may not be the answer, and you may need to have a longer conversation with people. Another quick hint, and this is really important: resist the temptation to provide the business with ideas around how to measure success. If they don't come up with the measure they won't own it, and if they don't own it, and they didn't tell you that was what they wanted to measure, then you are potentially in for a world of hurt when they come back and say, 'We actually wanted to see an improvement in X; why did you measure Y? We don't care about that.'

Part of the trick here is also to get them to ask the right questions, such as:

  • Who is the target group for the training?
  • Why are we doing this training, and what result will it mean for the business?
  • What are the tasks that the target group do that this training seeks to improve?
  • What are the skills and knowledge that staff need to perform these tasks?
  • Which of the strategic goals of the business does this training relate to, and how?
  • How will you know if this training has been successful?
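To make this concrete, the completed map can be treated as a simple structured record. The sketch below is illustrative only: the field names and sample answers are mine, not Brinkerhoff's template. The useful part is the final check, which surfaces the sections the business could not answer, the cue that training may not be the answer at all:

```python
from dataclasses import dataclass, fields

@dataclass
class ImpactMap:
    """One row of a Brinkerhoff-style training impact map.

    Field names are illustrative -- adapt them to your own template."""
    target_group: str        # who is being trained
    business_result: str     # why we are doing it / what result it means
    key_tasks: str           # tasks the training seeks to improve
    skills_knowledge: str    # skills and knowledge needed for those tasks
    strategic_goal: str      # which strategic goal it supports, and how
    success_measure: str     # how the business will know it worked

def unanswered(m: ImpactMap) -> list[str]:
    """Return the names of any sections the business left blank.

    A non-empty result is the cue for that longer conversation."""
    return [f.name for f in fields(m) if not getattr(m, f.name).strip()]

draft = ImpactMap(
    target_group="Frontline support workers",
    business_result="Fewer escalated client complaints",
    key_tasks="Intake interviews; case notes",
    skills_knowledge="Active listening; documentation standards",
    strategic_goal="",       # the business could not articulate this
    success_measure="",      # ...or this -- a warning sign
)
print(unanswered(draft))  # ['strategic_goal', 'success_measure']
```

In practice this would just as easily be a shared form or spreadsheet; the point is that every section is completed by the business, not invented by L&D.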

So if you can get them to fill this out properly you will have achieved a couple of quite important things. Firstly, as I said above, you will have a solid base from which to look at what interventions you need to develop to meet the need. Secondly, you will have started the business thinking strategically about its learning and development needs, the value that training brings to the organisation, and the need to make sure that the training being delivered or requested will actually meet the needs of the organisation and its staff.

Now you should be up and running and can start to build and deliver things and then hopefully start work on some of the other areas which will need your attention.

Customisation of Learning – Connecting L&D and VET

A lot of training providers talk endlessly about their ability to customise a program to meet the needs of an organisation.  However, how many of them actually do it or do it in a way that really meets the needs of the organisation?

I think, unfortunately (or fortunately for those who do), not many. Often in the VET sector customisation means little more than choosing different electives, although not too different, or there might not be someone able to train them. In most cases, just changing electives is not really customisation; it is far more a case of 'here are the options we are offering; which would you like to choose?' This of course is not confined to the VET sector: a great many licensed and proprietary training programs offer very little in the way of real customisation. Yet the ability to customise training to suit specific organisational, and even individual, needs is a strength of the VET system.

Customisation is building the training program in such a way that it achieves the goals the organisation wants. It is about using their documents, their policies, their procedures. It is about building a program that produces a participant who has the skill set the organisation requires, and who is able to utilise that skill set in their work. The common complaint from providers about this kind of customisation is that you still have to do what the training package says: participants have to be assessed against the performance criteria, and you have to make sure that the skills and knowledge taught are not so workplace-specific that they are not easily transferable to other workplaces and roles. Now of course this is true, but I don't think anyone ever said that what is listed in the performance criteria is all a program can contain. It doesn't say anywhere in the packages that you cannot add additional information, assessment or training. What it says is that this set of skills and knowledge, assessed against this set of performance criteria, is the evidence required to deem a person competent in this unit of competency.

The other issue that is often brought up is where there is something in the performance criteria that, for whatever reason, the organisation doesn't do or does completely differently. An example is a unit of competency around strengths-based practice in support work and counselling. There is a process mentioned in the performance criteria which, while correct and used by a lot of practitioners, is probably not used, is described differently, or is used differently, by equally many practitioners. So (leaving aside questions of whether the criteria should even be in the unit) staff undertaking this unit often end up being trained in something that their organisation does not use and, in some cases, actively opposes. This also tends to mean that where the unit is an elective and can be left out, it is, which may dilute the overall strength of the qualification from the organisation's perspective. It may also mean that the organisation has to go out and source additional training, or develop it themselves, around the content contained in the unit.

So what does customisation look like here, for an organisation that doesn't use that particular segment of the unit of competency, given that in order to meet the performance criteria it can't be left out and it needs to be assessed? Having done this on numerous occasions, I can say the answer is in general remarkably simple: do both and assess both. Assess the accredited approach according to the performance criteria, and the other according to what the organisation wants. It is then a case of explaining to the students that while you have provided them with two options, one is the preferred method where they work now, but other organisations may prefer the other method. Is it a little more work? Yes, but it will also make the organisation much happier than saying, 'Well, we have to teach them this method because that is what the training package says,' and then leaving them to come up with a solution for training their staff in their preferred method.

Customisation is also about little things, like making sure that when you are talking about documents and policies the examples you use are, where possible, from the organisation itself. It is about using the language of the organisation as well, particularly when you are talking about reporting lines, hierarchies, business processes and software. It is about sitting down with the manager, the L&D person or whoever you are working with and asking: what are the skills and knowledge you need your staff to have at the end of this, and what tasks do you expect them to be able to undertake? Then structure the course around that. Take the time to cluster and structure delivery and assessment so that it makes sense in the context of the work environment. There is very little point in training someone in a skill they are not going to use for six months; it is better to provide the training close to when they will use the skill, to enhance retention of the skill and knowledge.

Customisation is actually an enormous strength of our VET system. This becomes particularly evident when it is compared to many of the proprietary training programs out there, most of which can't be changed or customised to suit particular circumstances because the material is copyrighted and licensed. Often, because of this, the people delivering the training have no say in the content or its delivery: in order to meet the criteria of the provider that owns the program, they have to deliver it in a very particular manner which, unless you are training large numbers of people or spending large sums of money on the training, is probably not going to be altered by the program owner. This ability to customise should not be taken to mean that we can or should ignore the rules of the VET sector, things like volume of learning and the rules relating to assessment and evidence. However, the space circumscribed by those rules gives us much more latitude to develop and deliver a program that meets the needs of our clients than most licensed training ever could.

The real problem is that most providers seem very reluctant to do it.

Anyway that’s my opinion.

 

Paul can be contacted via:

Rasmussen Learning Solutions

Spectrum Training

Time to competence, vocational assessment and organisational need

So in this post on better connecting the L&D and VET sectors I want to look at time frames, and at how the concept of time to competence may encourage L&D people and organisations to choose professional development training over a nationally accredited (VET) qualification.

Most L&D departments are under pressure to deliver programs in quite short timeframes ('Can I have that as a half day?'), something I have explored in other posts. There is almost always pressure from the business to ensure that staff are not taken off the job for more time than is actually necessary. In this way, a program that runs over even five consecutive days and is then finished may be preferable to a program that runs for six to twelve months, even if it only runs one day a month. The logistics around making staff available are easier for one-off programs; in a lot of cases, particularly where a person works in a direct client-facing role, other staff have to be moved around or rostered to allow a staff member to attend a training course. It is also often the case with VET training that there is work the staff member is required to complete after the delivery of the program itself to meet its assessment requirements. This, in a significant number of cases, leads to the staff member applying to have some of their work time allocated to completing their study, which in turn puts additional time and resource pressure on the business manager.

The other time-related factor that often comes into play is the time commitment required of any managers, supervisors or team leaders involved with the staff undergoing training. With most professional development programs, as opposed to nationally accredited programs, there is little or no involvement needed from the supervisory staff of those undertaking training. This is, in most cases, not the situation with VET training, where there is almost always a requirement for on-the-job observation or training to be undertaken with the staff members in question. This is further exacerbated where the managers or supervisors are not in the same workplace as the staff requiring supervision and observation, and by the fact that these activities often have to happen on more than one occasion for each participant.

In addition there is the issue of the time involved for the individual L&D staff members. With professional-development-style programs there is often not a lot of additional work they are required to undertake. Again, this is often not the case with VET training, particularly where the program being delivered is not simply a generic one. There is time spent consulting with the RTO around the content of the program, looking at what needs to be contextualised to the particular business unit or units being trained, and signing off on paperwork, which is of particular relevance where VET training is being delivered through a funding or subsidy program such as an apprenticeship or traineeship scheme.

The other side of the coin is that one of the things organisations like about VET is the robustness of the assessment and the competence that results from on-the-job training and rigorous training and assessment practices. This is particularly attractive to organisations that work in areas which could be considered high risk, or where parts of the business deal in high-risk areas. Should something tragic occur within an organisation that results in serious injury or death, and the organisation needs to testify about the competence of its staff, being able to say that staff had undertaken nationally accredited training and been deemed competent is far more potent than saying they attended a two-day course with no assessment of competence.

Now of course this should not be taken to suggest that RTOs need to shorten their time frames, forgo on-the-job observation and assessment, or compromise the integrity of training and assessment. Remember, it is the robust assessment of competence that organisations value about VET. What it does mean is that we need to understand and work with the needs of the business. That means asking questions like: does the business need a full qualification or just some units? Is there training already being done in the organisation that we can map to accredited outcomes? Make the observation and on-the-job processes as simple for the managers as possible: create good checklists, not just the performance criteria; give the staff journals to fill in themselves; explain to everyone how the process works and what is expected. Map out everything so the process makes sense for everyone. The more both the managers and the staff understand and are engaged in the end-to-end process, the easier it is for everyone to get the result they want.

Also, the easier we can make the process from the perspective of the L&D staff, the easier it will be overall. If L&D can see that the time an accredited program demands of them can be minimised, freeing them for other value-adding work, the more likely they are to champion the program, and the easier it will be to get those successful outcomes.

Prior skills and knowledge and the L&D, VET intersection

Continuing on from my last post, and in response to a question from one of the LinkedIn groups I am involved with, I want to look at how the knowledge, skills and experience a person brings to a role are incorporated into this model. My initial answer was that this could and should be handled through the RPL process of the training organisation involved in the model. That is not, however, the entire picture of what is going on here, because really there are three things happening, all of which may be heading towards different outcomes.

Firstly, we have the person who comes to a role with a set of skills, knowledge and experience, some of which may be directly applicable to the role in question while some may not. Secondly, we have the organisation, whose goal is, at least at a base level, to ensure that all of their staff have whatever minimum set of skills and knowledge they have decided is applicable. Thirdly, we have the RTO, which is trying to tie all of these threads and others together and translate them into formal outcomes. Now, I have discussed some ideas around how this third piece might be achieved here, but I will discuss additional ideas here as well.

Let's start with the organisation by whom the person is employed. There are two issues here. The first is that all organisations have a level of expectation in relation to the skills and knowledge of their employees, and seek to have all of their employees at that level. Additionally, even with industry-transportable skills, there may be quite large differences in the way those skills are utilised, or play out, between different organisations. For example, it may be the case, and often is, that two different community service providers are utilising totally different delivery and care models. Both models will use and rely on the same set of skills and knowledge, but how those skills relate to service delivery and care, how they are used, and at what level, will depend on the model and the employee's place within it.

These issues in turn lead to the need to train people in 'how we do things here'. They also point to one of the biggest complaints organisations make about staff they hire who have been trained 'generically' by a provider: while such staff may have certain skills and knowledge, they don't possess the organisational mindset around how those skills are used. This in turn leads to over-training of staff, needless refresher courses and a range of other activities done in the name of compliance, but which ultimately just cost the organisation money.

From the point of view of the individual coming into a role with an already established set of skills, they, rightly or wrongly, feel that they have the requisite skills, and can, again rightly or wrongly, be quite averse to receiving training in areas in which they already feel skilled, giving rise to cries of 'I did this in my last organisation' or 'I learnt all of this at uni.'

However, and I spoke about this a couple of years ago at the Edutech conference, a lot of organisations, both big and small, already have much of the information they need to manage this interface between employee, organisation and provider far more easily than they do, but either don't know they have it or don't know what to do with it. A great many organisations capture resume, training and qualification data on their employees when they commence and throughout their time with the organisation, but few use this data to its full potential, particularly with respect to training needs analysis, skills and knowledge assessment, or even RPL, credit transfer and competency assessment.

If this data is properly stored and mined it can provide a wealth of information, particularly when added to more formal assessment, as to what training it is necessary for each individual to undertake. To give you a conceptual idea of what I mean: we could collect a whole range of information from a new employee, including qualifications, training they have undertaken, responses to skill and knowledge questions, and any testing which took place. This information could then be filtered against not only internal training requirements but accredited training requirements, to form an individual map, for each employee and their managers, of that person's journey from induction to qualification. Of course this won't be all that is required, particularly at the accredited-qualification end of the scale, but having a map like that would assist everyone, the employee, the organisation and the RTO, to produce the outcomes that all of the stakeholders require.
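As a conceptual sketch of that filtering step, assuming nothing more than simple set comparisons (the unit codes and evidence items below are placeholders for illustration, not a real employee record or training package mapping):

```python
# Illustrative sketch only: real employee records, unit codes and RPL rules
# would come from your HR system and the relevant training package.

def training_map(held: set[str], required: set[str]) -> dict[str, set[str]]:
    """Compare what an employee can already evidence against what the role
    requires, yielding a simple gap map for employee, manager and RTO."""
    return {
        "already_evidenced": held & required,  # candidates for RPL / credit transfer
        "gap_to_train": required - held,       # what the individual plan must cover
    }

# Hypothetical evidence and requirements for a new community-services hire.
employee_evidence = {"CHCCCS011", "HLTWHS002", "FirstAidCert"}
role_requirements = {"CHCCCS011", "HLTWHS002", "CHCDIV001", "OrgInduction"}

m = training_map(employee_evidence, role_requirements)
print(sorted(m["gap_to_train"]))  # ['CHCDIV001', 'OrgInduction']
```

The 'already evidenced' set becomes the starting point for an RPL or credit transfer conversation with the RTO, while the gap set drives the individual training plan.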

My favourite L&D Books

I thought as a start to the year I would begin building a list of my favourite learning and development books. These are all books I have read and would highly recommend. If you have any other suggestions, feel free to let me know.

The Business of Learning – David Vance

The Success Case Method – Robert O. Brinkerhoff

High Impact Learning – Robert O. Brinkerhoff

Ten Steps to a Learning Organisation – Kline and Saunders

Building the Learning Organisation – Marquardt

The Fifth Discipline – Senge

The Training Measurement Book – Bersin

The Six Disciplines of Breakthrough Learning – Wick, Pollock and Jefferson

How to Measure Training Results – Phillips and Stone

 

So, as I said, if you have any other books you like, let me know. I intend to build this list over the year.

 

Creating a High Impact Learning Culture

2013 ASTD State of the Industry Report

So, as many of you know, I am an avid consumer of the ASTD's yearly State of the Industry Report, and guess what: the 2013 edition is now available.

So what does it have to say about the world of L&D this year? Well, it is interesting: there is not a lot of change from last year's report. We see that spending on L&D globally was about $164.2 billion, with an average direct expenditure per employee of about $1,195. In terms of average direct expenditure, this represents a very small ($13) increase over last year.

Again, however, learning hours used per employee stayed stuck at around the 30-hour mark, 30.3 this year to be exact. One suggestion for this stalling over the last four years is the increase in usage of non-traditional instructor-led training and of more informal, workplace, just-in-time learning, which is much harder to track and quantify. We also see direct expenditure as a percentage of payroll rise only slightly to 3.6%, and direct expenditure as a percentage of revenue rise slightly to 1.32%.

There has also been little or no change in the percentage of expenditure taken up by internal costs, which remains steady at 61.5%, lower than 2009 (62.4%) but higher than last year (60.5%). There has, however, been a not insignificant (5%) drop in the number of employees per L&D staff member, which now sits at 299:1, and an even more startling drop of around 40% in this number in the ASTD BEST organisations, taking it from 288:1 down to 178:1.

The cost of learning has also gone up, both in terms of the cost of providing one hour of training to one employee, rising to $89, and the overall cost of developing one hour of training, rising to $1,772, a rise of 20% over the last four years. Some reasons suggested for this increase are the up-front costs of technology and the reduction in the ratio of employees to L&D staff members.
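For what it is worth, the derived figures hold together. A quick arithmetic check (the dollar amounts and ratios are the report's; the percentage calculations are mine):

```python
# Sanity-check the derived figures quoted from the 2013 ASTD report.

def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new (negative means a drop)."""
    return (new - old) / old * 100

# Employees per L&D staff member in ASTD BEST organisations: 288:1 -> 178:1.
print(round(pct_change(288, 178), 1))  # -38.2, i.e. the "around 40%" drop

# A development cost of $1,772 per hour after a ~20% rise over four years
# implies a starting figure of roughly $1,477.
print(round(1772 / 1.20))  # 1477
```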

Managerial and supervisory training makes up the largest content area for learning programs, closely followed by mandatory and compliance training, business processes and practices, and industry-specific training, with these four areas taking up just over 40% of all the learning programs delivered. How these programs were delivered tells what I think is an interesting story, however. Yet again, instructor-led classroom delivery dropped (5%, down to 54.28%) and technology-based learning rose slightly to 39.20%, which is not unexpected. What I find interesting is that all-online delivery has remained at around the same percentage (27.29% this year) since 2008. When you pair this with the fact that instructor-led training (either classroom or online/remote) accounted for some 70% of all training delivered, it seems to suggest, at least in my opinion, that participants like to have instructors to interact with even when utilising online training. The other thing I find interesting about the content and delivery data is that while there was a big jump in the percentage of hours delivered via mobile technologies between 2009 and 2010, this usage has flattened out over the last three years, remaining at 1.51%.

So what does all this data mean? A couple of comments I would make:

  1. Instructor-led learning is still the preferred method of delivery for a large proportion of participants,
  2. New technologies may have had a quite significant effect on the overall cost of developing training, and
  3. Mobile learning is not the powerhouse, game-changing way of the future that everyone keeps suggesting it is.

I would be interested to know what others think of the data and what it means for the industry.

AHRI – Pulse Learning and Development Report 2013

As most of you know, I devour these reports and state-of-the-industry papers about the world of L&D, so it was with interest that I read the 2013 release of the Australian Human Resources Institute's Pulse L&D report.

So what are the interesting little highlights I found when I read through the data? Before I talk about that, it is important to note that this survey, unlike the 2010 survey, was not done in conjunction with the AITD, but solely with AHRI members, which may or may not have had an effect on the results.

The first thing I found interesting was in the comments from the AHRI chairman, where he says 'it is pleasing to note also that nearly a third of the sample group (31 per cent) report that learning and development budgets account for more than 5 per cent of revenue', but seems disappointed that 68% of the organisations surveyed had L&D budgets of less than 5% of revenue. I find this statement a little strange and at odds with the general level of L&D investment (as a percentage of revenue) globally; it may be a simple misunderstanding of wider global L&D trends. I say this because in the 2012 ASTD State of the Industry Report the average figure for direct expenditure as a percentage of revenue is around 1.2%, with most Global Fortune 500 companies averaging around 0.7% of revenue. While it is true these levels are rising globally, it would be difficult to suggest they will top 5% of revenue any time in the next few years. To me this says that investment in L&D in Australia is in very good shape when compared globally, and to intimate that budgets of less than 5% of revenue are disappointing is a little bit strange.

Still, as always, the vast majority of people in the industry are female (70%), though I would really like to see a survey done in the Australian market that looks at gender across roles within the industry, as I think, particularly if we consider senior management and executive learning roles, these figures may not be giving us the full picture. If anyone knows of a survey like this, particularly one with data collected from organisations with a range of L&D functions, I would be interested to know about it.

Again, as we tend to see in these surveys, most L&D functions sit either solely with HR or partly within HR and partly external to it, with only 11% sitting outside HR as a separate function. Now, as we know, where learning should sit has been a topic of debate for a long time, but in reality it seems nothing much has actually changed. The other thing that interested me on this page of the report was the size of L&D teams, with more than 75% of learning functions having only one to five staff. I am sure this has something to do with the fact that 60% of the respondents worked for companies with fewer than 500 staff, and 80% for companies with fewer than 2,500 staff. I also think, and this is just personal opinion, that it has a lot to do with the learning function sitting inside HR, being treated to some extent as a poor cousin to other HR functions, and a misunderstanding of the value, both in people and monetary terms, of a well-funded, highly functioning learning unit. But then again, I am an L&D person; I would say that.

Some of the really interesting information for me starts on page 11 of the report, where it begins to look at the mix of L&D activities within organisations.

The vast majority of L&D activity within organisations turns out to be... wait for it... internal face-to-face training. Well, who would have thought that? Certainly not anyone who has been to any of the major conferences recently, where it almost seemed that if you talked about face-to-face training rather than informal learning and MOOCs, you were a dinosaur who needed to move out of the way. In fact this idea is only further supported on page 12, where we see that only 8% of the learning activities provided by organisations are e-learning based, with the two largest percentages being in-house training and inductions. (Sorry, I had to say that; it is just nice to see some real figures pointing to the fact that online learning is not taking over the world, at least not inside organisations.) The other two really interesting bits of information here were that the split between formal and informal learning was about even, with informal a little ahead, nowhere near the 90:10 split we would expect to see under some models of informal learning, and that in terms of kinds of training, compliance and other training were split about 50:50 as well.

So what then do people think are the most and least valuable learning and development activities?  Well, the most valuable are clearly induction of new staff and leadership training (though I am unsure of the real value of leadership training myself), closely followed by training relating to in-house operations (surprise, surprise: all the stuff that organisations need their staff to know).  The least valuable, as I have always suspected to be the case, are team building activities, followed by compliance training.

So there you have it, nothing stunning, but some facts which I think tend to shed some light on some of the rhetoric of learning pundits and evangelists out there.

As always, if you have any thoughts or comments I am more than happy to hear them.

Evaluating Informal Learning

As some of you know, the problem of capturing the organisational impact of, and the increases in competency (if any) gained by, staff through their informal learning has occupied my thinking for some time now, and I have posted and spoken about it on a number of occasions.  I really got to thinking about it again after the recent learnX conference, particularly after some stimulating conversations with Con and Saul, amongst others.  The problem for me is that even if you don't believe the numbers in 70:20:10 (which I don't), there is still a lot of informal learning that happens in a person's life, and at least some of that learning has to relate to their job role.  Before I go on, however, I want to set some, shall we say, boundary conditions on what I am talking about here, and that is informal learning that has some impact on the day-to-day operation of the organisation.  If you choose to go off in your own time and, as one of my colleagues loves to call it, study underwater basket weaving, I am really not interested in the fact that you have done that, unless you can show me some tangible link to your day-to-day work.

So for me there are two types of organisational informal learning: learning that simply increases, builds or improves a skill, and learning that does that and in addition provides some ‘formal’ recognition pathway for the learner.  When I think about this, however, it is more the job role or the organisational imperative that moves us towards the path of recognition, rather than the needs of the staff member in general.  Again, when I am talking about recognition I am talking about formal recognition, where there is some ‘qualification’ style outcome as a result of the learning; in an Australian sense, usually a Unit of Competency, for example.  I am not talking about badges and other such devices for capturing learning, be they peer-reviewed or otherwise, because from an organisational point of view they are, in my opinion, (at least currently) meaningless.

Why do I say this?  I have often recounted a story of being asked, as part of a formal investigation into an incident involving a member of staff, “How did you know this person was competent?”  Now, if my answer had been “Well, he has a badge for it”, I think I would have gotten a much different reaction than saying, as I did, “Well, they have completed all of the assessment tasks, including a third-party observation, necessary for us to be satisfied that they were competent under the rules of evidence set out in the legislation pertaining to the operations of Registered Training Organisations.”

Not all informal learning, though, is going to lead a staff member to a qualification.  Some of it is not related to or captured by the range of qualifications available; some simply adds to the skill set they already have, making them better at their role but not providing them with a new skill.  So for me there are a number of things that we need to know in order to be able to begin to evaluate informal learning, and I am indebted to Saul Carliner for some of his thinking around this;

  1. A baseline – what is the staff member’s current skill level,
  2. What they have learned,
  3. How they have learned it,
  4. Their new skill level,
  5. Is there a competency attachable to the learning,
  6. The effect of the learning on organisational metrics – reduction in customer complaints, fewer injuries etc., and maybe
  7. Return on Investment?
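
For what it is worth, the seven points above could be captured in something as simple as a record per learning event.  The following is only a rough sketch of that idea in Python; every field name, the 1–5 rating scale, and the ROI formula are my own illustrative assumptions, not taken from any real HR or LMS system.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InformalLearningRecord:
    """One informal learning event, mirroring points 1-7 above.
    All names and scales here are illustrative only."""
    staff_id: str
    baseline_level: int                  # 1. current skill level, say on a 1-5 scale
    what_learned: str                    # 2. what they have learned
    how_learned: str                     # 3. e.g. "peer coaching", "reading", "trial and error"
    new_level: int                       # 4. reassessed skill level
    competency: Optional[str] = None     # 5. mapped unit of competency, if any
    metric_effects: dict = field(default_factory=dict)  # 6. e.g. {"customer_complaints": -4}

def skill_gain(rec: InformalLearningRecord) -> int:
    """Change in skill level between baseline and reassessment."""
    return rec.new_level - rec.baseline_level

def roi(benefit: float, cost: float) -> float:
    """7. Simple return on investment: net benefit divided by cost."""
    return (benefit - cost) / cost
```

The hard parts, of course, are points 3 and 6: getting staff to report how they learned something, and tying the change to a metric you can defend.  A data structure does not solve either, but it does force you to ask for all seven pieces every time.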

Now you might look at that list and say, well, isn’t that what we would want from any kind of organisational learning process, and isn’t it exactly the kind of data that we want from our formal processes?  I don’t see why we should be treating the outcomes of informal learning differently to how we would treat formal learning.  Now, it may not be relevant, we may not be able to, or may not want to, capture the standard ‘smile sheet’ satisfaction-style data that we collect from formal training, and yes, the nature of the learning, pull not push, driven by the individual, just in time and so on, all make the process of learning different.  However, when we look at it from an organisational point of view, aren’t we looking for the same thing as with formal learning?  We are looking for an increase in the skill level of the staff member, such that the increase in skill will have an effect on the relevant workforce metrics that relate to their role in the organisation.  If we aren’t looking to improve the skills of our staff and the organisation as a whole, why are we investing in informal learning systems in the workplace, and why has it become so important?

So what are some ways in which we can achieve this?  If we start with the idea of a baseline, we might be able to sort out some structure and processes around it.  So, where might a baseline come from;

  • Position Description,
  • Performance and Professional development plans,
  • Self Assessment,
  • Formalised Assessment, or
  • Job skills analysis plus a rating system.

There are a range of ways in which we can establish this baseline, but how can we do it without it being onerous on everyone involved?  If we use position descriptions as our starting point, we have the problem that they often don’t contain enough detail, are not skills based, or are not something we assess the person against in any really formal way, so they don’t give us any real data to work with in the first place.  They could be coupled with self-assessment and direct manager assessment to give a fuller picture of the skill set and levels of an individual staff member.  PPD plans can be seen in the same light: to make them more useful in terms of presenting us with a baseline, we need to capture more granular detail about the role and the staff member’s skills relating to that role.

So I have a bit of a rough process around my thinking in relation to this, and it goes something like this;

  1. Skills Outline for each role type within the organisation,
  2. Assessment of Staff member against skills outline – there are a range of options here, but I think there has to be at least self assessment + manager assessment at the very least,
  3. System for capturing staff informal learning activities,
  4. Regular (6-12 Month) updates of Staff skill assessments,
  5. Data capture of changes in skills levels across the organisation,
  6. Method of mapping skills changes to competencies, and
  7. Methodology of converting skills changes to organisational metrics and ROI.
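
To make steps 2 and 5 a little more concrete, here is one very rough sketch in Python of how a baseline might be combined from self and manager assessments, and how skill changes could be calculated at the regular review.  The role, the skill names, the 1–5 scale and the averaging rule are all my own assumptions, a thought experiment rather than a recommendation.

```python
# Hypothetical skills outline for one role type (step 1), rated 1 (novice) to 5 (expert).
ROLE_SKILLS = ["customer service", "incident reporting", "manual handling"]

def baseline(self_rating: dict, manager_rating: dict) -> dict:
    """Step 2: combine self-assessment + manager assessment. Averaging the two
    is one simple (and debatable) way to soften over- or under-rating by either party."""
    return {s: (self_rating[s] + manager_rating[s]) / 2 for s in ROLE_SKILLS}

def skill_changes(base: dict, review: dict) -> dict:
    """Step 5: change in each skill level at the regular (6-12 month) review."""
    return {s: review[s] - base[s] for s in ROLE_SKILLS}
```

Mapping those deltas onto competencies, organisational metrics and ROI (steps 6 and 7) is of course the genuinely hard part, and nothing in a sketch like this solves it; it only gives you numbers to start arguing from.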

So, as I said, that is my current thinking.  I would be really interested in getting everyone’s feedback, so feel free to chime in and let me know what you think.

Primary, Secondary, Tertiary and VET education

Is Education Snobbery still alive and well in Australia?

As some of you might know one of my first posts on this blog was about Academic snobbery and the perceived value of VET qualifications, where I talked about the ‘I have a degree, why would I want a workplace (VET) qualification?’ and what it said about the perception of the value of VET sector qualifications.

This whole idea of the VET and organisational learning sectors not being as professional, rigorous, or just plain good as the ‘teaching and academic sectors’ has come up in a number of conversations I have had with people recently.  This time, however, it has been the ‘But that’s just training’ or ‘They are just a trainer, I’m a teacher/lecturer’ commentary.  What I find really interesting about this is that I almost never hear this language from people in the organisational learning and VET sectors, only from those in the teaching and university sectors.  The other thing that I find interesting is this (and I am going to generalise here, so beware);

Teachers are experts in the practice of teaching; they are not, for the most part, subject matter experts;

Lecturers and Academics are subject matter experts, and not for the most part experts in the practice of teaching;

VET and organisation learning practitioners are expected to be both, they must have subject matter knowledge and expertise and they must hold training qualifications.

Now don’t get me wrong here, I am not saying that practitioners in the VET and organisational learning sectors are better or more qualified than those in other sectors.  Over the years I have known outstanding teachers, lecturers and trainers; I have also known some, in all three sectors, who were downright awful and made me wonder how they managed to continue to be employed.

So let’s stop this petty bickering about which sector is best, applaud great talent where we find it, and work together to ensure that the people we educate get the best outcomes they can, regardless of the sector they are in.
