NCALNE (Voc) Remixed: Part 2

So what’s the solution then…? Well, in the last week I’ve written, rewritten, and am in the middle of trialling a totally new delivery approach for our NCALNE (Voc) training.

It’s a hybrid… a remix of aspects of the various other delivery models that I mentioned last time.

It combines the following:

  • Workshop delivery: Two days compressed from our standard 5 or 6 days with several components removed.
  • Candidate self-assessment: This is essentially a needs analysis to determine what people already know and do, and where any gaps are in terms of the assessment criteria. This allows us to potentially tailor the two days of training to the needs of the group.
  • Facilitator/Assessor verification: This draws in part on our assessment of current competency, but rewritten from the ground up. The goal is to find out what people already know and do with regard to some aspects of the assessment criteria. There are also some things that we can teach, deliver, and assess in a very short space of time if we change the elective units and alter the nature of the assessment tasks. This relates in particular to the “describe knowledge of” part of the qualification and works best for people with some prior knowledge. The verifier checklist is something we can check off during the two-day workshop.
  • Evidence portfolio: Again, this comes from our assessment of current competency, but is also informed by what we do with people in the project work for our standard delivery. This portfolio is designed to be quite prescriptive, but it relates to the actual delivery work that should be part of tutors’ regular practice. Here I’m referring to processes like literacy and numeracy diagnostic testing, embedding and contextualising literacy and numeracy into regular training, and measuring learner progress against specific literacy and numeracy skills. The assumption is that the host organisation is already well down the path of embedding literacy and numeracy, even if the tutors are new to the organisation or the concepts.
  • Supervisor attestation: Participants’ supervisors and managers need to buy into this process and support their tutors in gaining credentials. They should also know what their tutors are doing when it comes to delivering embedded literacy and numeracy. Supervisor attestation is another kind of evidence we’re looking to use in conjunction with the teaching portfolio and our own verifier checklists. It needs some controls around it, but this work has been underway long enough that there should be at least one person in the organisation with credentials in this area who could attest to and verify some aspects of the assessment criteria.

I think we could make this approach work for smaller numbers of participants if we had strong buy-in from management and tutors had internal support to put together the required portfolios (or to put structures in place to deal with any gaps).

To the above, I probably need to add two more things:

  • Pre-workshop reading and tasks designed to address key content areas.
  • Online access to training, materials, and resources that participants can use to address gaps identified in the self-assessment as they work towards submitting their evidence portfolios.

This approach assumes that we’ll be working with tutors and trainers who are already doing some of the things that we’re looking for. It’s not all the way to a current competency model, but it’s not totally pitched at newbies either. It also assumes a high degree of support from the organisation including management.

It’s a shift towards a “credentialing” process as well, and away from training (although there is still a limited training component).

What I’d love to complement this with is a large collection of short video-based training sessions that capture the best of the live training we do. Something along the lines of Salman Khan’s flipped classroom model would work well. This might go some way towards bridging the gap between the “nice to have” training, the “must have” training, and the assessment criteria we have to work with.

What do you think…?

Author: Graeme Smith

Education, technology, design. Also making cool stuff...
