The problem of opinion and misinformation in VET

Australian VET is a heavily regulated industry; in fact, if you spend more than five minutes talking to almost anyone in the sector, you will understand just how abundant these regulations and other associated controls are.  Given that this is the case, and given that this information is written into legislation and the other associated documents linked to and referenced in it, I am sometimes dumbfounded that it is possible not to know the answer to a lot of the questions I hear being asked in the sector, and, worse still, to give an opinion, which is wrong, on something that is clearly articulated in black and white in the sector's documentation.

Now I am the first to admit two things: one, there is a lot of documentation to look through (the Standards, the Data Definitions, funding agreements and the AQF, to name just a few), and two, I am the kind of person who loves reading and assimilating information and data.  That being said, however, how is it possible for someone to be in a position of relative power (CEO, consultant, compliance manager, whatever) and not have read, and more importantly understood, at least the very basic documents which regulate the sector?  Given the amount of misinformation and the glaringly wrong opinions offered by people who should really know better, it can only be assumed that they either haven't read or haven't understood the documents, or have simply shifted all of the responsibility for knowing the right answer to someone else.  Now to some extent I don't mind if people, say the CEO of an RTO, don't have all the answers and rely on their compliance person to understand everything and get it right; however, if this is you, then don't give an answer when someone asks a question.  If you don't know the answer, all you are going to do is muddy the water and make it more difficult for the person asking the question to get the right answer, which they probably need.

What is far more concerning to me is when people who are supposed to be senior leaders in the sector, or consultants who work with large numbers of providers, voice opinions which are clearly incorrect on subjects where opinion doesn't matter, because the answer or the definition is written clearly into some form of regulatory document.  Not only does this give whoever is asking the question, or whoever they are working with, the wrong answer, which could have catastrophic consequences for that person or organisation, but if it is said enough, gains momentum and gets passed around enough, this clearly wrong piece of misinformation becomes gospel.

One example of this came up a number of years ago at an ASQA briefing and was categorically answered, but the myth, wrong opinion, or misinformation still exists today and is still quoted by people.  At this briefing a gentleman stood up during the time allocated for questions and asked why it was the case that highly experienced industry people had to hold all of the units of competency that they were teaching.  He said that it was making it difficult for him to find trainers, because a lot of people in his industry didn't have the newest UOCs and therefore couldn't teach and assess those units.  The person from ASQA (someone who is highly regarded, highly skilled and helped draft the Standards) looked dumbfounded for a moment, then replied that the Standards didn't say that and asked him where he had learned this.  This of course brought a hue and cry from the audience, many of whom insisted that that was exactly what the Standards said.  The ASQA representative carefully explained that all the Standards said was that trainers had to have small-c vocational competencies.  They didn't have to hold the exact unit; they just had to be able to prove, if and when they were audited, that they were competent in the skill they were teaching.  Another round of discontent emerged, with a lot of people saying that TAFE had always required them to RPL the most recent units at the very least.  Again, the ASQA person reiterated that while that may be the practice of TAFE, it was simply a management decision, was not required by the Standards, and should not actually be considered best practice.  Not only was this information shared at the briefing, it was also shared through FAQs on the ASQA website and through recordings of the briefing.
Yet, much to my disbelief, I heard this very question being asked in another forum late last year, and a great many of the people who answered spouted the very same misinformation which has been debunked numerous times since that first briefing.

The real problem is that this is only one example of this kind of opinion masquerading as fact, and it is doing a substantial amount of damage to the sector.  It is no wonder that RTOs are failing audit if they are relying on opinions from so-called experts rather than actually going and reading the cold, hard, black-and-white information contained in the various acts and other documents.  The vast majority of the questions I see posed on online forums, at conferences and in general discussion aren't matters of opinion, and do in fact have definitive answers if you can just be bothered to go and read the documents that govern what we do.

So how about, before we ask or answer a question, we all go and read not just the Standards but all of the ancillary documentation associated with the sector; or, if you don't have the time or the inclination to do that (I personally think our sector would be better off if more people did), just google the question, ignore the opinions and go to the actual source documents.  We all talk about wanting the VET sector to be more professional, and I have to say, actually reading and understanding the legislation which underpins it might be a good start.


Total VET students and courses 2016 – First look

Woohooo, it's that time of the year again.  NCVER has just released the Total VET students and courses data for 2016.  Yes, I know a lot of you are now going 'you're weird', and to be fair you are probably right; however, more often than not there are some lovely little gems of information tucked away in this report.  So let's have a look at what it says and see where that takes us.

First the highlights: 4,279 providers delivered TVA training to about 4.2 million students in 2016, which represents an almost 5% rise in the number of enrolled students over the 2015 figures.  There was also an around 1% rise in the overall VET participation rate in the population aged 15-64, with participation highest among 15 to 19 year olds at 46.2%.  This is on the back of an almost 3% decrease in the amount of commonwealth or state government funded training, and while management and commerce remains the most popular field of education despite a 5.8% decline in program enrollments, the health sector saw the largest rise, increasing by 30% in 2016.  The other interesting fact is that about 10% of all listed RTOs had no enrollment activity during 2016.

So, before we dig a little deeper into the data, what do all of these highlights mean?  I have to say that realistically it seems to paint a fairly good picture of the VET sector in Australia.  We are seeing solid levels of participation across the Australian population and, far more importantly I think, we are seeing almost 50% of the 15 to 19 age group involved in some kind of VET activity in 2016.  The decline in funded programs is however a concern, and one that must be addressed by both state and federal governments.  For too long now the VET sector has been underfunded, with the amount of public money coming into the sector reducing in real terms and falling further and further behind K-12 and university funding.

So what other pieces of information can we glean from the report?  Well, Queensland had the highest overall number of providers at 1,270, with VIC close behind at 1,100 and NSW third with just over 1,000 providers.  As you would expect, the three eastern states dwarf the others, with each of them having more providers than all of the other states combined.  This trend is echoed in the total student numbers as well: NSW leads here with 1.1 million students, with both VIC and QLD coming in not far behind, each with over 900,000 students.  When we look at the student numbers, a couple of really interesting points pop out.  Firstly, female students increased by 10.5% to 2.0 million, with the proportion of female students increasing from 44.1% to 46.5%.  In addition, indigenous students increased by 20.1%, accounting for 4.0% of the total estimated VET student population, and students with a disability increased by 1.8%, accounting for 4.3% of total estimated students.  All other things being equal, this has to be a great result for the sector, with those populations which have traditionally struggled comparatively in terms of workforce participation and education becoming more engaged with vocational education.

When we consider the data around program enrollments, that is, what people studied and how, we see a trend which I think is going to continue into the future.  Across the board in 2016, compared with 2015, national training package program enrollments decreased by 4.1%, while skill set enrollments increased by 111.6%.  This is something that a significant number of commentators, including myself, have been suggesting has been occurring and will continue to occur over time.  I would suggest that in our fast-changing workplaces, both workers and employers are looking to build and improve specific skills to meet market demands quickly and effectively, thus preferring skill sets and single-unit training over full qualifications.

The largest percentage of enrollments was, as we would expect, at the Certificate III level with 26% of all enrollments, with diploma level and Certificate II qualifications coming in next, both with in excess of 16% of enrollments.  These are the core business areas for the VET sector, so the fact that they account for in excess of 60% of enrollments is no surprise.  We saw the health sector record the largest growth of all fields of education, increasing by 30.3%; troublingly, by contrast, natural and physical sciences experienced the largest decline in program enrollments, decreasing by 15.0%.  While our ageing population and the ever-growing need for health and allied health professionals is clearly driving the growth in that sector, I suspect that a severe misalignment between the training packages and the needs of the science sectors may be the underlying reason behind the decrease in enrollments there.

So that's it for a first look.  I think there are real positives that the VET sector can take from the data in this report, but there are also some things to consider quite seriously, such as the continued decrease in funding and the lack of students enrolling in science-related courses.  It will be interesting to see if anything else pops up out of the data as it is looked at more closely and read in conjunction with other data.

Anyway, that’s just my opinion.


NCVER VET provider market structures – Damn what a boring title

First off, NCVER, could you come up with some sexier titles for your work please?  I mean, VET provider market structures: history, growth and change is a very interesting read, but the title doesn't excite me to even open it, which is of course a shame given the volume of interesting things contained inside it.

Enough of that though.  A little over 18 months ago I wrote a piece entitled "Who are these private RTOs anyway?" and the response to it was enormous; in fact it became one of my most popular posts of 2015.  It simply sought to provide some perspective on the breadth of providers within the VET sector in this country and how it seemed a little unfair to simply lump all providers together into one basket, particularly when the vast majority of providers are small to medium sized businesses and not nation-spanning conglomerates.  Now NCVER has released a report on providers within the VET market and what these providers look like, and well, isn't it an interesting read.

Now I did have the pleasure of seeing some of this data earlier in the year during a presentation around it by NCVER, but I wanted to wait until the full paper was released before I made any commentary about the results.  So now that it is all out and available, let's have a look at what it says.

The first thing that is really striking in the research is that the VET provider marketplace has been fairly stable in terms of the number of providers over the last 15 years.  While, as is to be expected, there was an enormous number of applications back in around 1998 (when we first got our RTO status), much higher than at any point since, applications and overall numbers have remained relatively constant since then.  What can we take from this?  I think we can pretty safely say that the number of providers we currently have in the market is probably the number of providers that the market can support.  While providers may come and go for various reasons, having such a constant number over such a long period of time seems to suggest that the overall number of providers is appropriate.  What makes this really interesting is that over the 15 years the data covers there has been a myriad of changes to policy, funding arrangements, training packages, and, well, pretty much everything to do with the sector; however, the number of providers within the sector has not changed substantially.  Providers have obviously come and gone and new providers have replaced old, but the overall number has really not altered at all.

A lot of the other interesting information is the research which pertains to the breadth and diversity of providers within the marketplace.  Firstly, it needs to be said that this research makes no claims, nor does it seek to make any, about the levels of quality or outcomes across the various provider types; it simply looks at the number of providers and the students they service.

The really interesting thing for me was to see that around 2,000 providers, or 40%, had fewer than 100 students, and some had far, far fewer students than this, while on the flip side 50% of all students were enrolled with the largest 100 providers.  What we see from this is something that I have been suggesting for some time now: while there are a small number of very large providers with large numbers of students (and can we please stop using the term TOP, which suggests they are the best, and use the term LARGEST), that is not the norm.  In fact, about 50% of all students don't do their study with a large institution, be it public or private; they choose one of the multitude of small to medium sized providers who operate in the market.  When we look more closely at the data we find that 30 providers account for more than 1 million students (about 25% of the total number), while the next million students (25%) are serviced by more than double that number (70 providers).  The other 50% of students are looked after by around 4,500 providers, all of whom have fewer than 6,000 students; in fact, 4,000 providers have fewer than 1,000 students.
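The concentration figures above can be sketched as a quick cumulative-share calculation. This is only a back-of-envelope illustration using the rounded numbers quoted in this paragraph, not NCVER's actual data:

```python
# Market-concentration sketch using the approximate figures quoted above:
# the largest 30 providers enrol about 1 million students, the next 70
# another million, and roughly 4,500 small-to-medium providers share the
# remaining 2 million.
bands = [
    ("largest 30 providers", 30, 1_000_000),
    ("next 70 providers", 70, 1_000_000),
    ("remaining ~4,500 providers", 4_500, 2_000_000),
]

total_students = sum(students for _, _, students in bands)

cumulative = 0
for label, providers, students in bands:
    cumulative += students
    share = 100 * cumulative / total_students
    print(f"{label}: cumulative {share:.0f}% of students")
```

Run as written, this prints cumulative shares of 25%, 50% and 100%, which is the point being made: half of all students sit outside the 100 largest providers.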

So what does all of this tell us?  Well, while the research data released by NCVER doesn't make any claims about what the data might be saying, I am going to.  To me, what the data is telling us is that around half of all of the students involved in Vocational Education and Training are choosing to undertake their training with small to medium providers, most of whom are not public providers (TAFE), and the interesting question which comes out of that is why.  In a lot of cases small to medium providers tend to play in niche markets, are strongly connected to organisations either as enterprise RTOs or in some other way, or have only a small number of qualifications on scope, representing the skill sets of the people involved in the business.  They also, in a lot of cases, provide a very different learner experience, more personalised or tailored to the particular needs of the student, and tend to provide a variety of ways in which students can study and interact with them outside of standardised classroom or online learning environments.  As with most parts of the Australian economy, small to medium enterprises seem to be the foundation of the VET sector and the place where a significant number of Australians want to get their education.

I think that when we look at the overall data in this report it becomes clear that those pundits who have suggested that there needs to be a rationalisation of the VET provider marketplace are simply wrong, at least in my opinion.  The number of providers that we currently have seems, as I have said previously, to be the number of providers that the market wants, and the diversity within those providers seems appropriate as well.  It is, to me at least, a recognition that not everyone learns in the same way, in the same environment, or at the same pace, and a lot of students realise that and look for providers that allow them to engage in study in the way they want to and in the programs that they want to enroll in.

Anyway that’s just my opinion.

Total VET Reporting – Let's talk about the figures.

So, as some of you may have noticed, I have had a little break from my usual posting schedule, mainly due to spending most of the last two weeks working with an organisation to deliver some initial TAE training to a large group of their staff.  Of course, while I was having a break we saw the release of the Total VET students and courses data 2014 and a number of other documents which relate to it, including Equity groups in TVA 2014, both of which I found to be very enlightening reads.  There have already been a couple of responses to the data, most notably Rod Camm's, which to me was quite reasonable, but I thought that I might look at some of the things which jumped out at me.

The first thing that really did leap out at me as I started to look through the data was the question of what part of this data related to VET FEE-HELP and what related to everything else, and then I saw explanatory note 30: 'It is not possible to identify VET FEE-HELP assisted activity by funding.'  Now I have to admit that this let me down a little when I read it, because one of the things I was really interested in looking at was the relationship between VFH and other kinds of funding, but as we can't currently identify it there is not much that can be done.

So what are some of the figures which I found really interesting?  Firstly, the breakdown of the actual number of students: 3,908,000 students enrolled in training with 4,601 Australian providers, or 849 students per provider on average.  Let's look closer at this, however, as a lot has been made of the split between public and non-public providers and the effect that non-public providers are having on TAFE admissions, with non-public providers servicing 57% of students.  What is not often considered when we hear people talk about this is the massive disparity in the number of public vs non-public providers.  There are 57 TAFE institutes training 1,065,600 students and 2,865 non-public providers training 2,252,900 students, or 18,700 students per TAFE vs 786 students per non-public provider.  These numbers bear thinking about, at least to my mind, whenever public providers suggest that they don't have enough students to make ends meet.  Even at a figure of, say, $2,000 per student, that is over $35,000,000 in revenue on average for a TAFE as opposed to around $1,500,000 on average for a private provider.  Now I know that I am talking in averages here and that there are big, small and medium players in both parts of the sector, but I still think it is interesting to consider.
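The averages above are easy to check. A quick sketch, using the counts quoted in this post (note the $2,000 revenue-per-student figure is purely my own illustrative assumption, not anything from the data):

```python
# Back-of-envelope check of the per-provider averages quoted above.
# Student and provider counts are as quoted in this post; the $2,000
# revenue per student is an illustrative assumption only.
total_students, total_providers = 3_908_000, 4_601
tafe_students, tafe_providers = 1_065_600, 57
private_students, private_providers = 2_252_900, 2_865

overall_avg = total_students / total_providers      # ~849 students/provider
tafe_avg = tafe_students / tafe_providers           # ~18,700 students/TAFE
private_avg = private_students / private_providers  # ~786 students/provider

revenue_per_student = 2_000  # assumed, for illustration
tafe_revenue = tafe_avg * revenue_per_student       # over $35 million
private_revenue = private_avg * revenue_per_student  # around $1.5 million

print(round(overall_avg), round(tafe_avg), round(private_avg))
```

The averages work out to roughly 849, 18,700 and 786 respectively, matching the figures in the report.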

The majority of students were male and over the age of 25, which I personally found interesting because our student demographics are skewed more towards female participants. This has a lot to do with the fact that the vast majority of the training we deliver is in community services, where around 85% of the workforce is female.

What about the programs these students are undertaking?  30% of all enrollments were in Certificate III level programs, and 86% of all programs completed were at the Certificate I-IV level.  This, I think, says something very important about the system we have: at its heart it is focusing on the right thing, that is, those programs that really are going to make a difference to people's employment outcomes and their workforce participation options.  Business and commerce was the area in which most people studied, followed closely by community services.  It has been suggested that the amount of business and commerce training being undertaken relates tightly to VFH, its marketing and the perceived ease of delivery of these courses; and while we can't see from these figures how much of this training was funded using VFH, general business skills are deeply embedded in most of the things that people do, so a high percentage of people here may simply reflect the market.  The same could be said of community sector qualifications, which are the second most popular.  The community sector is one of the largest employment areas and one in which the need for workers continues to grow.  It could be suggested that if areas like these were not high on the list, that would be far more concerning than the current situation.

Another figure I found quite interesting was in the equity group data.  By far the two largest equity groups accessing VET were students from a non-English speaking background and students from rural and remote areas, with their participation rates being much higher than those of indigenous students or students with a disability.  Again, within these groups we see that the overwhelming majority of students, as with the general student population, are undertaking Certificate I-IV level programs, which as I said above is, I think, a good indicator that the heart of the system is targeted properly.  As we would also expect in a system where the vast majority of training delivered is around entry-level job roles, government funding made up around 60% of the way in which people 'paid' for their training, with fee for service making up the rest.

So are there any disturbing pieces of data in this report?  In my honest opinion, when we consider that this is the first time this data has been collected and we don't have a lot of previous data to base assumptions on, I don't think there are.  I think the big thing is that this data needs to be improved and perhaps integrated with the data collected around VFH and other programs, and then sliced and diced to give us a better picture of what is happening, as will also happen as we accumulate data sets over a number of years and can begin to make comparisons.

Anyway that’s my opinion.

Measuring outcomes in Vocational Education – Are we doing enough?

With the deadline for Total VET activity reporting fast approaching, my mind has turned to how we evaluate outcomes in Vocational Education in Australia, or at least the data that we collect nationally which seeks to provide us with information on the Vocational Education industry in this country.  Now we all know and love the Learner Questionnaire and the Employer Questionnaire, but really, what are we capturing here, and is it giving us any really useful information?  Let's be honest: when we look at the questionnaires, even the employer one, they are really nothing more than simple level-one satisfaction (smile) sheets, and anyone who has been in the L&D industry for even a few weeks knows how resoundingly useless smile sheets are if you want to convince anyone about anything that has to do with training.  Now I know that these are only one source of data, but if I had used that sort of data to convince an executive team as to why they should invest in a learning program, I would have been laughed out of the boardroom.

If we look at the well-known Kirkpatrick model of evaluation with its four levels

  1. Reaction.
  2. Learning.
  3. Behavior.
  4. Results.

realistically all we are doing here is evaluating reaction.  We are not really looking at anything that comes close to the other three levels, at least not on any consistent, regular basis.  Now of course, someone could suggest that levels 2-4, and even level 5 (ROI) if we want to include that, are something that should be done by the organisations that are sending their employees to be trained, and that is a fair point; I would suggest that most organisations with L&D units are already doing this.  However, what about all of the people who are undertaking training not at the behest of their employer?  They may be unemployed, casual, or wanting to change careers, and we seem not to be collecting and analysing any higher-level data on these people.  We also seem not to be asking those employers who are collecting robust data on the learning of their employees for their data either.  Now, admittedly, in Queensland (and this may be the case in other states as well) there is some collection being done of employment outcomes for students undertaking funded programs, but really, if we are trying to make a better system, don't we need some real data to allow us to make decisions about what is working and what isn't?  (If we are collecting this data and someone knows about it and how we are doing it, please let me know.)  I come back again to the point that if I was to try and justify an organisational L&D budget on the kinds of data that are being collected on training, I would be told to go away and come up with some real figures.

So what should we be capturing?  If we look at the Kirkpatrick model (I will talk about some other options later), we should at least be capturing whether or not the student actually learnt something: were the learning objectives met, and how much change has there been in their skills, knowledge or attitude?  Now of course, to do this properly, we need to collect data before and after training.  It could be suggested that by virtue of the person going from not having a qualification to having a qualification (becoming competent) there is an increase in knowledge, but is this actually the case; has there actually been a change in the person's skills, knowledge or attitude?  With the data we are collecting at the moment we just don't know; we are making assumptions, and without good data those assumptions could very well be wrong.
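To make the before-and-after idea concrete, here is a minimal sketch of one common way a level-2 (learning) measure could be computed: the "normalised gain", i.e. the improvement achieved as a fraction of the improvement available. The scores below are entirely hypothetical, and this is just one of several measures that could be used:

```python
# A minimal sketch of a level-2 (learning) measure: give learners the
# same assessment before and after training, then express the change as
# the fraction of the available improvement actually achieved.
# All score data here is hypothetical, for illustration only.

def normalised_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Fraction of the available improvement actually achieved."""
    if pre >= max_score:
        return 0.0  # no headroom left to improve
    return (post - pre) / (max_score - pre)

# Hypothetical pre/post scores for three learners
cohort = [(40.0, 70.0), (55.0, 80.0), (25.0, 85.0)]
gains = [normalised_gain(pre, post) for pre, post in cohort]
print([round(g, 2) for g in gains])  # [0.5, 0.56, 0.8]
```

The point is not this particular formula; it is that without a pre-training measurement there is nothing to subtract, which is exactly the gap in the data we currently collect.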

What about whether or not we are seeing any changes in behaviour?  Are the learners putting what they 'learnt' into action in the workplace; are their behaviours changing as a result of the training that was delivered to them?  This is not something that can be answered with a smile sheet at the end of training, or an employer questionnaire with general questions about perceived improvement.  This is data that needs to be collected once the learner has been in the workplace for a while, once they have had time to adjust and integrate their new learnings into their work environment.

Then of course there is the question of what the result was: did this training improve anything organisationally, at a state level, at a national level?  Was it good for the organisation, the country, the learners, the bottom line, whatever?  We need to figure out what areas we are seeking to see improvements in and then track whether or not we are seeing those improvements.  Again, this is a step up in complexity in terms of the types of data needed and how we might collect it.

Finally, of course, there is the question of return on investment.  It lets us begin to answer the question: have the billions of dollars the government has handed out through VET FEE-HELP and other programs, and the money spent by businesses and organisations on top of that, been worth it?  Have we got the return on that money that we expected, or did it just swirl away into a black hole where it didn't improve anything we would consider worth measuring?

Is all of this easy to do?  No.  Am I suggesting that the reporting requirements on RTOs change so that they are required to capture a whole range of additional data over a timeframe long after the student has finished their course?  No.  Am I suggesting that perhaps we need to do a yearly survey, akin to the kind done by ATD (what was ASTD), on the state of learning, and include employers, organisations, training providers and learners and try to ask them some serious questions?  Maybe.  Am I suggesting we need to do more with this so-called big data thing that has been talked about so much?  Maybe.  I don't have a definitive answer; I just think we need to be doing more if we really want to capture the true outcomes from our Vocational Education system.

