Evaluation of Multi-Level Media

This post is a chance to collate a lot of complex thinking around how my multi-level media studies class ran this year. The purpose is to evaluate and reflect, with a focus on next steps for developing the course.

I proposed a multi-level media course last year, which manifested as two classes of 31-33 students, each with a fairly even mixture of Y12 and Y13 students. I developed an approach to the course by moving away from the Achievement Standards and looking for the core curriculum seeds from which assessment could grow: the core concepts and learning objectives common to both year groups. I developed a plan which saw the year split into three areas:

  • Production
  • Genre
  • Research

Each of these areas had specific curriculum links to focus on during that third of the year, and potential Achievement Standards that students could opt into.

According to Hipkins, Sheehan and Johnston, “standards are not…designed to be treated as a basis for time-bounded, sequential teaching units” (2016, p. 46). They suggest that courses structured as chunks of Achievement Standards typically suffer from fragmentation. Their suggestion is instead to compile a comprehensive picture of what is worth learning for each curriculum area and to design courses from that.

The delivery of this course was a big shift as well. My planning centred on identifying the key concepts or key learning needed for success in each standard, stripping the multi-week units I had taught in the past right back to their core. I spread these learning topics over the weeks, aiming for one idea or topic per lesson. Given that the design of this course has student agency at its heart, I never made teacher time compulsory. Learning outlines were shared, and students could opt in to a tutorial-type structure which left me working with a small group in a teacher-directed way. To support this structure, resources were developed for each learning area to guide students through the material in a self-directed way. To develop this I need to:

  • Create student opportunities to run tutorials with peers
  • Refine the approach to the breakdown of the course so that tutorial time is effective
  • Develop the self-directed resources to further emphasise learning, not assessment.

To support this, students were put into critical quartets (multi-level groups of four, sometimes five, students with a range of individual needs). Each week we would spend 10 minutes per group discussing three or four reflective questions:

  1. Share one piece of significant learning for you in the past week.
  2. Check assessment plan together. Outline what you are doing for each piece of assessment for the remainder of the year.
  3. How can your learning be supported for the rest of the year?

The purpose of such a time was to focus on the principles of the class which I regularly articulated:

  • Ako – grounded in the principle of reciprocity
  • Collaboration – learning together
  • Reflection – engaging in continuous learning
  • Whanaungatanga – positive relationships
  • Me Whakamatau – work hard to achieve together

Hand-ins for assessment did not occur as naturally as I would like, but they were an improvement on my approach in 2015. I had a go at zero deadlines last year, with mixed success. While I feel that approach achieved some deep personal learning for a number of students, I didn’t really have the data to be able to continue with it. Students we would not typically define as ‘high achieving’ struggled, and administration of the approach proved challenging.

This year I set up four deadlines across the year, and for each one a student needed to submit one assessment. This effectively reduced the number of credits in the course (students were welcome to submit additional assessments, although only seven students across the two classes took up the option). The following statistics capture the picture at the time of writing:

  • Prior to external assessment (where additional credits could be gained) the average number of credits per student was:
    • 10.4 credits – line one
    • 10.1 credits – line two
  • When outliers are removed (i.e. international students not working towards NCEA, and students who did not engage due to horrific absences), the averages were:
    • 10.7 credits – line one
    • 11.4 credits – line two
  • When broken down between Y12 and Y13, the difference is clear, reflecting the difference in uptake of the external assessment:
    • Y12 – 10.9 credits
    • Y13 – 13.2 credits

Overall, this credit attainment is lower than in previous years. Under the previous structured course design, Y12 contained 17 internal credits and Y13 contained 16 internal credits. More student choice has led to fewer credits overall. More analysis of the level of achievement gained needs to take place, as my hypothesis is that less coverage has led to deeper engagement with the content, and therefore an increase in the number of Merits and Excellences.

However, more pressing is the consideration of whether this course design has led to deeper learning in terms of the vision of the school and the front half of the curriculum. In terms of data to measure this, firstly, I have surveyed the students throughout the year, asking them to self-reflect on the development of their understanding and application of the key competencies. This data can be built on when it is gathered again next year, after implementing those key next steps.

Furthermore, student voice has been gathered which captures some of the perspectives of the class. These quotes firstly establish the positives of this approach:

Having a choice with what internals to do and when to do them by was a very important learning step for me. I feel it got me prepared for the mindset and the self motivation skills I will need next year at university. In saying that, it was fairly difficult to get into the habit of this especially since it was the first year where we really got a choice on what we do.

There shouldn’t be any boundaries with learning and I think that everyone should be able to study together, it lets people connect and share more ideas with each other no matter the age or year difference.

It worked good because being self directed meant I set more goals

I think this has worked for me in a way where I got to get into discussions with peers that I otherwise wouldn’t talk to, especially with the discussion opportunities. The classroom being an overall friendly environment that allows growth has helped me a lot with my learning and understanding.

And these perspectives offer some insight into the challenges going forward:

I thought that this was good for my learning because it meant I could do things I enjoyed and was interested in but I think I would have benefitted from some more structured lessons around how to do certain things.

I didn’t particularly enjoy having a mixed class where everyone was doing different stuff. I’m not very good at working in an environment that is not teacher directed. I did however enjoy when we did class discussions.

At first I was lost and didn’t know what I was meant to do / what I was doing. Even when I was giving help I never really understood what I was still supposed to do.

My interest here is in the difference between what I thought I was doing and how it was seen by the students. There is plenty of feedback here to keep developing my approach. I believe the core data speaks to a continuation of the principles of the class, but a refinement of the method.
