Hacking Reading Comprehension: Part 4 – Having a Go

Reporting back

Just a quick post to report back on my Reading Hacks. I had the opportunity to try out my prototype template the other day. The participants were a group of graduates from our training. They all hold the NCALNE (Voc) qualification and are involved in workplace literacy around the country. Our focus for the training day was a refresh of the big picture for embedding literacy and numeracy, as well as a specific look at what they could be doing to strengthen reading comprehension, including working with workplace texts.

I’ll do another post on the big-picture material I’m currently using, as here I only want to talk about my approach to reading comprehension. If you missed it, the template and a summary are here.

What did we do?

Very simple…

  1. The participants, who are all training facilitators, sat a paper-based version of the reading comprehension assessment generated by their organisation. This is a standardised assessment; they sat a “dummy version”, but one essentially similar to what their own learners have to sit. Part of their work is to administer this.
  2. We had some discussion around how you sell this assessment process to learners. It was a timely discussion given the compliance requirements involved. These trainers have already thought through some of the psychology behind how you talk to learners about the assessment, and they will continue this discussion internally.
  3. I then asked them to do a quick analysis of the reading test using my template and have a go at coding the questions according to the list. They worked in pairs and small groups, with some guidance from me as necessary.
  4. We reviewed the results together. There were some differences in how a few items had been coded. We agreed that it’s sometimes a subjective process, and that the analysis itself was probably less important than what they do with it afterwards.
  5. From there I handed out some workplace forms they use in their own training and asked them to generate questions in the style of the standardised test they had just sat and analysed. They did this in pairs so they could bounce ideas off each other.
  6. We finished up with a review of the question items each pair had generated, including discussion of which aspects of each question made it more or less difficult for potential readers.

The results

I don’t have copies of what these participants actually generated, as these are specific to their own texts and organisation. However, my reflection was that it worked really well. The list isn’t perfect and you have to “interpret” some of the questions creatively in places, but it’s short and did the job in a short time frame (about 2.5 hours). They came up with preliminary lists of their own question items, and I finished off by giving them some standard question wording, such as “in the text…” and “according to the article…”.

Next steps

This group agreed that they would continue with the question item generation beyond our workshop. I’m hoping that they will also internalise the question-generation process so that they can generate verbal questions “on the spot” as needs or other analysis suggest. I think they actually do this already, but my point was to make the process much more explicit. At the moment it is mostly automatic and intuitive, and the idea with the NCALNE (Voc) training is to keep shifting people towards a more explicit model of delivery when it comes to embedded literacy and numeracy.

From my side, it was a great place to start a conversation about teaching inferencing skills. When I do this again I’ll pay more attention to inferencing in particular. Inferencing is hard to teach, and it was helpful for the group to work through coding each question as either requiring inference or drawing on information stated explicitly in the text.

One thing we didn’t do was link the questions they were generating to specific steps on the Learning Progressions framework that we use in New Zealand. I deliberately avoided this as I didn’t want to get bogged down in minute details. I’ll probably have a go at this soon, but it was good to keep things simple the first time around.

Let me know what you think in the comments. How would you use the template or this approach to dealing with reading comprehension? Do you think the questions need to be directly linked to steps on the progressions? Does it actually matter?

Author: Graeme Smith

Education, technology, design. Also making cool stuff...
