Investigating the Dip VET Unknown Unknowns: TAEDES501


Unknown Unknowns

If you missed the background, we have decided to take a look at what experienced training professionals do, and compare that with what the TAE50116 Dip VET says they should be doing.  You can read more about that HERE.  Following on from our first articles about five core units (TAEASS501, TAEDEL502, TAEASS502, TAELLN501 and TAEPDD501), we have now collected some descriptive data related to the final core unit from within the TAE Dip VET, TAEDES501 – Design and develop learning strategies.

The following table shows the summary of assessment outcomes in this unit.  For the sake of comparison, we will include in this table (and subsequent ones) the figures already collected for the earlier units.  For Performance Evidence, the figures represent the average percentage of benchmarks that were satisfactorily demonstrated, and the range of that percentage across all 50 sample candidates.  For the Knowledge Evidence, the figures represent the percentage of candidates who satisfactorily demonstrated all the benchmarks.
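To make the arithmetic behind those two kinds of figures concrete, here is a minimal sketch. The per-candidate values are invented for illustration only; they are not the study's actual data.

```python
# Hypothetical per-candidate data, NOT the study's actual figures.
# Each value is the percentage of Performance Evidence benchmarks
# that one candidate demonstrated satisfactorily.
performance_pct = [64, 40, 88, 72, 56]

# Performance Evidence figure: average percentage, plus the range.
avg = sum(performance_pct) / len(performance_pct)
low, high = min(performance_pct), max(performance_pct)
print(f"Average: {avg:.0f}%, range: {low}-{high}%")

# Knowledge Evidence figure: proportion of candidates who
# demonstrated ALL benchmarks (True = all benchmarks met).
knowledge_all_met = [True, False, True, False, False]  # hypothetical
pct_all_met = 100 * sum(knowledge_all_met) / len(knowledge_all_met)
print(f"Demonstrated all Knowledge benchmarks: {pct_all_met:.0f}%")
```

The distinction matters when reading the tables: the Performance Evidence figure averages partial achievement, while the Knowledge Evidence figure counts only complete achievement.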

TAEDES501 Summary data



For the unit TAEDES501, of the 50 RPL candidates whose submissions were sampled, the average percentage of Performance Evidence benchmarks demonstrated satisfactorily was 64%. This is close to the average of 68% that has emerged across the six core units. Compared with the other Core units within the Dip VET, it ranks above only two: TAEASS501 and TAELLN501.

Looking at the range, among these 50 candidates it was 40-88%, which represents the lowest upper end of all the core units. The issues surrounding trainer capacity in this particular part of their work are further foregrounded in the Knowledge Evidence area, where fewer than half of the experienced trainers/assessors satisfactorily demonstrated the required Knowledge.

Performance in the individual benchmarks

The following table lists the benchmarks used for assessing the Performance Evidence of RPL candidates; the right-hand column indicates the percentage of candidates who demonstrated each benchmark satisfactorily.

TAEDES501 Benchmark Data

We will allow those data to speak for themselves, and move to the Discussion.


While these data are very interesting, and while it may be tempting to draw inferences and conclusions from them, we must exercise caution until further analyses have been conducted. Having said that, we do notice the following three things:

  1. there appears to be a dearth of structure in the process for developing learning strategies, with fewer than half showing that their strategies were grounded in research into the possible options for the target group and their needs.
  2. perhaps not surprisingly given the above, only about half of RPL applicants included delivery and assessment strategies that were appropriate for the specific context in which the learning strategy would be implemented, and fewer than half satisfactorily demonstrated that they had reviewed their programs in the manner that the unit of competency requires.
  3. having said this, a clear majority did identify the target groups and their learning needs, and did research which qualification or other options would meet those needs. Anecdotally, however, it does seem that most applicants developed programs that were already being delivered or whose structure had already been decided.


What next for this investigation into Dip VET RPL outcomes?

We will continue to analyse data related to these Core units from within TAE50116 Diploma of Vocational Education & Training.  Ultimately, we are seeking to identify:

  1. those components that are commonly not demonstrated through an assessment based on RPL.
  2. whether there are any statistically significant differences in these ‘gaps’ between RPL candidates with more or less than 5 years’ experience, and between those performing coal-face roles and those in back-office roles.
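As a rough illustration of the kind of group comparison described in point 2, the sketch below computes Welch's t statistic for two invented groups of candidates. The group names, the sample values, and the choice of Welch's test are all assumptions for illustration; they are not the study's actual method, groupings, or data.

```python
from statistics import mean, stdev

# Hypothetical benchmark-demonstration percentages for two groups of RPL
# candidates. The experience split mirrors the planned comparison, but
# every number here is invented for illustration.
under_5_years = [52, 60, 48, 64, 56, 44]
over_5_years = [68, 72, 60, 76, 64, 80]

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    standard_error = (var_a / len(a) + var_b / len(b)) ** 0.5
    return (mean(a) - mean(b)) / standard_error

t = welch_t(under_5_years, over_5_years)
print(f"Mean gap: {mean(over_5_years) - mean(under_5_years):.1f} points, t = {t:.2f}")
```

A large |t| would suggest the gap between the groups is unlikely to be chance alone, though a real analysis would also report degrees of freedom and a p-value, and check the test's assumptions against the actual data.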

We expect this investigation to be completed by about August 2018.  Results will be published at Fortress Learning and in relevant journal/s, and you can stay up to date with our Research program by adding your details below.  You can expect to receive a Monthly Roundup, and notification of other articles that we have written.



Why is Fortress Learning conducting research, anyway?

It’s a fair question.  As a private organisation, we certainly have no obligation to spend money on conducting research.

We believe that conducting a program of empirical research will allow us to better understand what is needed within our industry and how to provide for it.  At the academic end of the spectrum, Fortress Learning’s research program will certainly lead to refereed publications and conference presentations, and the process of conducting research will build valuable and exciting relationships with others, both in Australia and internationally.

Additionally, for the coalface of RTO operations, the findings that emerge from the research process will have a trickle-down effect, with findings, questions and issues identified and shared along the way in a range of forums (such as this one). It is these, we believe, that will have the greatest impact, by:

  1. Informing what we do and how we do it to ensure we are providing the best possible service for our learners and their future employers
  2. Communicating with likeminded organisations and individuals what we are discovering, and thus contributing in some small way to a restoration of confidence in what we do
  3. Providing our students, prospective students and employers with confidence in the commitment of Fortress Learning to best practice, and from that a confidence in the integrity of a Fortress Learning qualification
  4. Nurturing strong relationships within the VET sector, both domestically and internationally, by working together to identify opportunities of mutual benefit.

Click here to learn more about our Research Program.

