The Future of Learning and its effect on VET

I thought I might take a different tack with my post this week and do some crystal ball gazing: looking at how technology is going to affect the way in which we learn, and how this might in turn affect the kinds of learning that make up the VET arena.

Late in 2014 I wrote a couple of pieces on rapid skill acquisition and interface learning, a cyberpunk notion of simply jacking skills or knowledge directly into our brains through some kind of brain/machine interface.  Imagine plugging a small USB stick into your skull and downloading all the skills, knowledge and physicality of, say, how to service your car, and then, when you were finished, simply deleting it until you needed it again.  I suggested that in essence places such as YouTube already provide us with some of this, by enabling us to watch how to do some specific thing so that we can replicate that skill ourselves for that specific task, without having to learn all of the skills and knowledge which sit around it.

Since then we have seen the rise of augmented reality, virtual reality, artificial intelligence, machine learning and even robotics.  Most of these new technologies are still only being tinkered with in terms of their learning potential and, despite what a number of pundits claim, will not reach that potential in terms of how people learn and deliver learning for quite a few years yet.  They will, however, without doubt irrevocably change what human learning looks like in the future.

Augmented reality allows anyone with a smartphone to point it at an object and receive all of the information and bite-sized learning objects they require in order to perform whatever tasks are associated with that object.  A care worker who is unsure how to operate a new patient lift simply points their phone at the lift and instantly receives detailed instructions on how to operate it.

Virtual reality and robotics present a future where participants can be trained in fully immersive environments, interacting with the world around them as if it were real.  Add to this an AI-controlled population (NPCs in gaming terms) with the ability to react in both expected and random ways, ensuring that those undertaking training encounter a full range of circumstances and variations.

Online learning and MOOCs could be facilitated, moderated and assessed by AI 'teachers', with student support and assistance handled by AI chatbots.  In fact it is more than possible to imagine an entire student experience, from first contact through to graduation and the issuance of certifications, without the student at any point having to interact in real life (IRL) with another person. Enrollments can already be handled by smart website interfaces, and the addition of AI chatbots to lead the potential student through the process seems a very small step away.  Access to systems and learning platforms is already automated, at least to some extent, in most providers, with significant amounts of communication about courses, content and assessments in many cases handled through email.   Imagine shared virtual reality simulations, where students and NPCs interact with both the environment and each other, facilitated and moderated by an avatar of the AI controlling the entire system, which uses natural language processing built on machine learning to interact with students and to conduct, collate and 'mark' assessment pieces both within the simulation and external to it.

So where do directions like this leave vocational education, apprenticeships and the other educational activities we currently utilise?  Well, if you argue that there will always need to be experts, SMEs and people to provide the system with information, or that there must be practical on-the-job components, or that there will always be a need for face-to-face human interaction, you are unfortunately most likely wrong.  While we won't see these things happen overnight, we will see practical components, which were usually done on the job, moved to complex virtual simulations.  Why?  To give you an example: staff working in the community sector, even with at-risk clients, may go their entire working careers, let alone their on-the-job training phase, without ever encountering a person at immediate risk of suicide, and never know until the moment happens how they will react.  Complex simulations populated by AI characters provide a safe environment for staff to encounter situations which are rare in the workplace.  Working on car engines, dealing with electricity, building houses: all will be able to be simulated through virtual reality in such a way as to mimic the actions in the real world.  Simple economics is already moving many providers to more automated enrollment systems, and as the level of complex analysis and response available through 'bots' and other systems increases, more and more of these processes can and will be successfully automated.

But then, if other predictions are true (and they probably are), a vast array of the jobs that we currently train people for in this sector won't exist in the very near future.  However, as with a range of other industries, there may also be niches available to capitalise on the gaps left by all of this progress.  Highly skilled teachers and trainers could impart their long-held and well-developed skills, knowledge and wisdom through 'artisan' face-to-face models to those who wished that they, or their children, received their education in a 'traditional' environment, all of course for a substantial additional cost. I can see the advertising now.

Anyway, that's just what I think.




AHRI – Pulse Learning and Development Report 2013

As most of you know, I devour these reports and state-of-the-industry papers about the world of L&D, so it was with interest that I read the 2013 release of the Australian Human Resources Institute Pulse L&D report.

So what are the interesting little highlights I found when I read through the data collected?  Well, before I talk about that, it is important to note that this survey, unlike the 2010 survey, was not done in conjunction with the AITD but solely with AHRI members, which may or may not have had an effect on the results.

The first thing I found interesting was in the comments from the AHRI Chairman, where he says 'it is pleasing to note also that nearly a third of the sample group (31 per cent) report that learning and development budgets account for more than 5 per cent of revenue', but seems disappointed that 68% of the organisations surveyed had L&D budgets of less than 5% of revenue.  I find this statement a little strange and at odds with the general level of L&D investment (as a percentage of revenue) globally, and it may be a case of simple misunderstanding of wider global L&D trends.  I say this because in the 2012 ASTD State of the Industry report the average figure for direct expenditure as a percentage of revenue is around 1.2%, with most Global Fortune 500 companies averaging around 0.7% of revenue.  Now, while it is true these levels are rising globally, it would be difficult to suggest they will top 5% of revenue anytime in the next few years at least.  To me this says that investment in L&D in Australia is in very good shape when we compare it globally, and to intimate that budgets of less than 5% of revenue are disappointing is a little bit strange.
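To put those percentages in concrete dollar terms, here is a quick back-of-the-envelope comparison. The revenue figure is purely hypothetical (my own pick for illustration); the percentages are the ones quoted above:

```python
# Illustrative only: compare annual L&D spend at the benchmark percentages
# discussed above, for a hypothetical organisation with $50M revenue.
revenue = 50_000_000  # hypothetical annual revenue

benchmarks = {
    "ASTD 2012 global average": 0.012,    # ~1.2% of revenue
    "Global Fortune 500 average": 0.007,  # ~0.7% of revenue
    "AHRI 'pleasing' threshold": 0.05,    # more than 5% of revenue
}

for label, pct in benchmarks.items():
    print(f"{label}: ${revenue * pct:,.0f} per year")
```

The gap is the point: at the global average an organisation that size spends well under a million dollars a year on L&D, while the 5% threshold implies several times that.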

Still, as always, the vast majority of people in the industry are female (70%), though I would really like to see a survey done in the Australian market that looks at gender across roles within the industry, as I think, particularly if we consider senior management and executive learning roles, these figures may not be giving us the full picture.   If anyone knows of a survey like this, particularly one with data collected from organisations with a range of L&D functions, I would be interested to know about it.

Again, as we tend to see in these surveys, most L&D functions sit either solely within HR or partly within HR and partly outside it, with only 11% sitting outside of HR as a separate function.  Now, as we know, where learning should sit has been a topic of debate for a long time, but in reality it seems nothing much has actually changed.  The other thing that interested me on this page of the report was the size of L&D teams, with more than 75% of learning functions having only 1-5 staff.  I am sure this has something to do with the fact that 60% of the respondents worked for companies with less than 500 staff, and 80% for companies with less than 2,500 staff.  I also think, and this is just personal opinion, that it has a lot to do with the learning function sitting inside HR and to some extent being treated as a poor cousin to other HR functions, and with a misunderstanding of the value, in both people and monetary terms, of a well-funded, highly functioning learning unit.  But then again, I am an L&D person; I would say that.

Some of the really interesting information for me starts on page 11 of the report, where it begins to look at the mix of L&D activities within organisations.

The vast majority of L&D activities within organisations turn out to be… wait for it… internal face-to-face training. Well, who would have thought that?  Certainly not anyone who has been to any of the major conferences recently, where it almost seemed that if you talked about face-to-face training rather than informal learning and MOOCs then you were a dinosaur who needed to move out of the way.  In fact this idea is only further supported on page 12, where we see that only 8% of the learning activities provided by organisations are e-learning based, with the two largest percentages being in-house training and inductions. (Sorry, I had to say that; it is just nice to see some real figures pointing to the fact that online learning is not taking over the world, at least not inside organisations.)  The other two really interesting bits of information were that the split between formal and informal learning was about even, with informal a little ahead, nowhere near the 90:10 split we would expect to see under some models of informal learning, and that in terms of kinds of training, compliance and other training were split about 50:50 as well.

So what do people think are the most and least valuable learning and development activities?  The most valuable are clearly induction of new staff and leadership training (though I am unsure of the real value of leadership training myself), closely followed by training relating to in-house operations (surprisingly, all the stuff that organisations need their staff to know).  The least valuable, as I have always suspected to be the case, were team building activities, followed by compliance training.

So there you have it: nothing stunning, but some facts which I think shed a little light on the rhetoric of the learning pundits and evangelists out there.

As always, if you have any thoughts or comments I am more than happy to hear them.



Herding Cats – Capturing informal and social learning

Social and informal learning is taking off in leaps and bounds.  With a significant number of courses and programs available for free via various MOOCs, or through other providers of free online education, the profile of this kind of learning has increased dramatically.  Add in workplace learning, communities of practice, on-demand e-learning and corporate social media, just to name a few, and you soon realise that learning is happening all over the place.  And I have to say I think that is a fantastic thing; anything which encourages people to learn, or makes it easier for them to learn, is good for everyone.

However, with all of this learning going on, how do we as organisations know what our staff know?  How can we be comfortable that they are actually capable of doing the things they have been learning?  And is it even important that we know?

Let me answer the second half of that question first: yes, it is important that organisations know what their staff are learning, what their skills and capabilities are, and what they are and are not competent to do.  Why?  Because without that knowledge organisations cannot best plan for the future.  Without this information, decisions about workforce needs and capability cannot be accurately made, nor can we properly plan succession for the organisation's future.  It is hard to know who the next senior leaders within your organisation will be without knowing their knowledge, skills and capabilities.

The first half of the question is harder to answer, however.  How do organisations capture what their staff know and what they have learnt over six or twelve months?  How does an organisation verify that learning, and how can it be integrated into the work of the organisation?  And where the organisation is 'providing' the opportunities to learn, say through communities of practice, how does it determine the return on investment it is getting?

Even the simple act of capturing the information about staff learning can be challenging.  Do you try to capture everything?  Do you have a system where staff upload what they think is relevant information about their learning activities?  Does the organisation try to vet the information that is uploaded or captured, to ensure that the learning activity was undertaken or, where possible, that the person was deemed competent?  Or do you only accept formal qualifications as evidence of knowledge?  I guess for me it depends on the purposes for which you are looking to capture the information.  If it is being captured primarily so that staff have a record of their learning activities, formal, non-formal and informal, which they can utilise to show industry currency or professional development, then I think casting a wide net, without too much checking of competence, is fine.  If, however, the organisation is using the information as one string to its workforce planning or succession planning bow, or is looking to formally recognise the skills people have gained from informal sources, then I think the capture needs to be much stricter, perhaps even with competency-based assessment backing up the learning from these informal avenues.
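As a thought experiment, the kind of record such a capture system might hold could look something like this. It is a minimal sketch only; the `LearningRecord` structure, field names and verification levels are all my own invention for illustration, not from any particular product:

```python
from dataclasses import dataclass
from enum import Enum


class Verification(Enum):
    """How strictly the claimed learning has been checked."""
    SELF_REPORTED = "self-reported"            # staff member's own claim
    EVIDENCE_SIGHTED = "evidence sighted"      # certificate, badge, transcript
    ASSESSED_COMPETENT = "assessed competent"  # competency-based assessment


@dataclass
class LearningRecord:
    staff_id: str
    activity: str
    kind: str  # "formal", "non-formal" or "informal"
    verification: Verification = Verification.SELF_REPORTED


def usable_for_workforce_planning(record: LearningRecord) -> bool:
    # For workforce/succession planning or formal recognition, only assessed
    # competence counts; for a personal professional-development log, any
    # verification level would do.
    return record.verification is Verification.ASSESSED_COMPETENT


record = LearningRecord("E1042", "MOOC: Introduction to Astronomy", "informal")
print(usable_for_workforce_planning(record))  # self-reported only, so: False
```

The point of the sketch is the two-tier idea above: cast a wide net for the professional-development log, but gate the planning view on the verification level.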

If you are in organisational learning I would love to hear what you think or what you do in terms of capturing what your staff learn informally.


Great little blog post on what I think is one of the biggest issues facing MOOCs and the like: how does one show that one has learnt anything, or that you are competent in the material? From an organisational point of view I really don't care if you have done 100 MOOCs on a range of subjects; unless there has been proper assessment (and sorry, peer review of your 'work' just doesn't cut it for me), don't even bother putting them in your resume, as they are meaningless unless you can show that someone is willing to say you are competent.


The issue of how to mark MOOCs is a moot point at the moment.

As Europeans race to play catch up with their U.S. counterparts, (no educational body left behind!?), two glaring questions rise to the fore in many articles about these massive open online courses.

These are:

  • How do institutions make money from them?
  • Will MOOC students be able to gain credits for offline courses?

The answer to both these questions, in my view, ultimately rests on how the courses are evaluated.


I read this today and thought it was worth sharing with everyone; along with Inge's blog it makes some interesting reading.  For me the jury is still out on the real value of MOOCs and other large online courses and programs.  Why?  Well, I am just not convinced that in the long run we are actually getting any real tangible benefits from these programs.

That is not to say that I don't see value in learning new things, or in providing people with new ways to learn and interact.  However, without things like formal accreditation and assessment processes, how do we know what we, or in an organisational sense our staff, have learned?

The other thing that worries me relates directly to the ease with which we can enroll in courses, any courses really, whether or not we have the opportunity to utilise the knowledge gained from them.  I could enroll in and complete a course in astronomy, but unless I have an avenue in which to practise what I have learned, my knowledge is gradually going to fade away.  We all know that in order to keep one's skills in an area, one needs to be able to practise those skills.  So if we do all of these courses and learn lots of things, but never actually apply any of them in the real world, what is the point?  That knowledge will simply degrade over time.

I think MOOCs are a great idea, but as with a lot of these things we can get carried away with the idea and the possibility, and sometimes forget that there has to be a real-world application for the skills that we learn.

Jonathan Walsh

This link directs to a post about two big problems with online college courses. They are as follows:

1. Online college courses have a very high attrition rate. In some cases 90% of the students who enroll drop out.

2. Online college courses are inappropriate for struggling college students who make up a significant portion of college enrollment.

The estimate of a 90% drop-out rate applies to MOOCs. They are free and can be joined on a whim. Daphne Koller, in her TED talk, discusses how with a free and accessible online course there will be many who enroll due to the consequence-free nature of it. Many of these early enrollers lose interest or find they do not have time, and drop out. Those that remain are more dedicated and genuinely enthusiastic about learning. She also makes the point that even with this huge drop-out rate a MOOC can still reach considerably more…


Massive Open Online Courses (MOOCs), iTunes U, and Learning via YouTube

Does Online Learning Equal Competency?

I love learning; let's get that one out of the way right from the word go.  I love to be able to look for a solution to something by searching Google, then reading an article or watching a YouTube video on whatever I need to know.  I have iTunes U on my iPad, I watch Khan Academy videos; in essence I tend to devour learning and information from whatever source I can get it.  Do I learn things by doing this?  Yes, I think I do.  Does this kind of learning make me competent?  I am far less sure of that, and I guess this is where my headspace is with these kinds of courses and programs.  In most cases there is no real assessment of outcomes for the participants, and where there is, does it fit what employers might consider to be relevant outcomes?

Consider two applicants for a position: one who has done a degree via traditional delivery and assessment, and one who has done an equal amount of 'online', 'free' programs.  All other things being equal (and even without them being equal, in my opinion), who are you going to give the job to?  I would think, hands down, the person with the degree, and I would challenge anyone to justify to me why they would choose the other candidate.

What about recognition of prior learning, which is a core component of the Australian VET system?  Do these freely available online courses count as acceptable evidence of competence, or is there still further work that needs to be done, perhaps independent assessment of competence, before they are recognised?

I have a deeper issue, though, with this kind of learning, which is one of transfer and application of skills.  Let me give you an example of what I mean.  A number of years ago I was in a training role where, after courses had ended, clients would often contact me with a range of technical questions about some of the software that was used as part of the course (even though the course itself was not a technical course).  I quickly learnt that in most cases it was simpler for me to Google their question and give them the answer there and then, rather than say I wasn't sure and try to get back to them at a later date.  It kept them happy, it added value to what we did, and it positioned me as a technical expert in a piece of software that I actually knew very little about technically.  Was I competent?  I don't think so; I never had any background knowledge about how the system worked or why some of the things worked the way they did.  I was just following the instructions of someone else.  This is not to say that I did not learn things, I certainly did, but learning things does not, in my opinion, necessarily equate to competence.  And if I am being completely honest, most of the solutions went straight out of my head after I had given them to the client, simply because I did not need to know them.  If I got multiple clients who wanted the same or very similar information, I would bookmark the site or video so that I could simply go back to it when needed and pass the instructions on.  The other thing that I could never understand about this situation (and this is a bit of an aside) is why clients rang me in the first place, when they could have simply searched the web themselves and found the answer just as quickly as I had.

The other, and final, issue I have with all of these programs is how we integrate them into the range of informal learning within our organisations and, more importantly for me at least, how we evaluate the learning that comes from them, for both our staff and the organisation as a whole.

I would be really excited to hear any ideas that you have around this subject.
