Investigating the Dip VET Unknown Unknowns: TAEPDD501



Unknown Unknowns

If you missed the background, we have decided to take a look at what experienced training professionals do, and compare that with what the TAE50116 Dip VET says they should be doing.  You can read more about that HERE.  Following on from our first articles about four core units (TAEASS501, TAEDEL502, TAEASS502, TAELLN501), we have now collected descriptive data for another core unit from within the TAE Diplomas, TAEPDD501 – Maintain and enhance professional practice.

The following table shows the summary of assessment outcomes in this unit.  For the sake of comparison, we will include in this table (and subsequent ones) the data for the earlier units we have examined.  For Performance Evidence, the figures represent the average percentage of benchmarks that were satisfactorily demonstrated, and the range of that percentage across all 50 sample candidates.  For Knowledge Evidence, the figures represent the percentage of candidates who satisfactorily demonstrated all the benchmarks.

[Table: summary of assessment outcomes for TAEPDD501 and the earlier Dip VET core units]

For the unit TAEPDD501, of the 50 RPL candidates whose submissions were sampled, the average percentage of Performance Evidence benchmarks that were demonstrated satisfactorily was 70%.  This is close to the average of 68.4% that has emerged from the five units examined thus far.  Compared to the other Core units within the Dip VET, performance is stronger than in TAEASS501 and TAELLN501, but weaker than in TAEASS502 and TAEDEL502.

Looking at the range, among these 50 candidates it was 50-100%, which is closest to the data for TAEASS502.  The issues surrounding trainer capacity in this particular part of their work are further foregrounded within the Knowledge Evidence area, where less than two-thirds of experienced trainers/assessors satisfactorily demonstrated the required Knowledge.
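To make the arithmetic behind these summary figures concrete, here is a minimal sketch in Python of how such statistics can be computed.  The scores below are invented for illustration only; they are not our study data.

```python
# Minimal sketch (invented scores, not study data) of the summary figures
# described above, for a small hypothetical set of candidates.

# Each value is the percentage of Performance Evidence benchmarks that one
# hypothetical candidate demonstrated satisfactorily.
performance_scores = [50, 60, 70, 80, 100, 70, 65, 75, 90, 60]

average = sum(performance_scores) / len(performance_scores)
low, high = min(performance_scores), max(performance_scores)
print(f"Performance Evidence: average {average:.1f}%, range {low}-{high}%")

# Knowledge Evidence: True if a candidate demonstrated ALL benchmarks.
all_knowledge_met = [True, True, False, True, False, True, True, True, False, True]
pct_all_met = 100 * sum(all_knowledge_met) / len(all_knowledge_met)
print(f"Knowledge Evidence: {pct_all_met:.0f}% demonstrated all benchmarks")
```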

Performance in the individual benchmarks

The following table lists the benchmarks used for assessing the Performance Evidence of RPL candidates; the right-hand column indicates the percentage of candidates who demonstrated each one satisfactorily.

[Table: TAEPDD501 Performance Evidence benchmarks]

We will allow those data to speak for themselves, and move to the Discussion.


Discussion

While these data are very interesting, and while it may be tempting to draw inferences and conclusions from them, we must exercise caution until further analyses have been conducted.  Having said that, we do notice the following three things:

  1. in simple terms, there appears to be a great deal of discussion among peers regarding both delivery and assessment practice.  This is perhaps to be expected among a group of people who have chosen to seek RPL for this unit, since this requirement is fairly well known and understood.
  2. documentation of reflective activity surrounding assessment is much more robust than that surrounding delivery.  Anecdotally, this is possibly due to the higher regulatory emphasis on recording ‘all things assessment’.  By contrast, critical incidents related to assessment were less frequently identified than critical incidents related to delivery.  Perhaps confusing things further is that opportunities to improve were more evident with regard to delivery than to assessment practice.
  3. on average, half of those trainers/assessors did not appear to base their PD on research, consulting and networking activities, and about a third did not seem to make a connection between PD activities and broader organisational and industry needs.


What next for this investigation into Dip VET RPL outcomes?

We will continue to analyse data related to these and the other two Core units from within the TAE50116 Diploma of Vocational Education & Training.  Ultimately, we are seeking to identify:

  1. those components that are commonly not demonstrated through an assessment based on RPL.
  2. whether there exist any statistically significant differences in these ‘gaps’ between RPL candidates who have more or less than 5 years’ experience, and between those who perform coal-face roles and those who perform back-office roles (see the sketch after this list).
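For readers curious about what such a test might look like, here is a minimal sketch in Python of a chi-square test of independence, one common way to ask whether a gap rate differs between two groups.  The counts are invented for illustration; they are not our study data, and this is not necessarily the analysis we will ultimately report.

```python
# Illustrative sketch only: a chi-square test of independence comparing how
# often a benchmark 'gap' appears in two hypothetical experience groups.
# The counts below are invented for demonstration; they are not study data.
from scipy.stats import chi2_contingency

# Rows: less than 5 years' experience, 5 or more years' experience.
# Columns: benchmark demonstrated, benchmark not demonstrated.
observed = [
    [18, 7],   # hypothetical counts for less-experienced candidates
    [22, 3],   # hypothetical counts for more-experienced candidates
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")

# A p-value below a chosen threshold (e.g. 0.05) would suggest that whether
# the benchmark is demonstrated is not independent of experience level.
```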

We expect this investigation to be completed by about August 2018.  Results will be published at Fortress Learning and in relevant journals, and you can stay up to date with our Research Program by subscribing to our updates.  You can expect to receive a Monthly Roundup, and notification of other articles that we have written.


Why is Fortress Learning conducting research, anyway?

It’s a fair question.  As a private organisation, we certainly have no obligation to spend money on conducting research.

We believe that conducting a program of empirical research will allow us to better understand what is needed within our industry and how to provide for it.  At the academic end of the spectrum, Fortress Learning’s research program will lead to refereed publications and conference presentations, and the process of conducting research will build valuable and exciting relationships with others, both in Australia and internationally.

Additionally, at the coalface of RTO operations, the findings that emerge from the research process will have a trickle-down effect, with findings, questions and issues identified and shared along the way in a range of forums (such as this one).  It is these shared findings that we believe will have the greatest impact, by:

  1. Informing what we do and how we do it to ensure we are providing the best possible service for our learners and their future employers
  2. Communicating what we are discovering to like-minded organisations and individuals, and thus contributing in some small way to a restoration of confidence in what we do
  3. Providing our students, prospective students and employers with confidence in Fortress Learning’s commitment to best practice, and from that, confidence in the integrity of a Fortress Learning qualification
  4. Nurturing strong relationships within the VET sector, both domestically and internationally, by working together to identify opportunities of mutual benefit.

Click here to learn more about our Research Program.