Cognitive load and practical work research – an update

Estimated reading time – 9 minutes.

As I wrote about in my recent EiC article, I am particularly interested in improving the effectiveness of practical work in my classroom. This interest goes right back to the start of my teaching career, when I completed my MEd project on the topic. Sadly, I never made the time to write it up in a peer-reviewed journal.

Cognitive load theory has resonated strongly with me since I came across it a year or so ago, as it seems to provide a useful framework in which to fit my empirical observations about the effectiveness, or otherwise, of practical work in my classroom.

I consider practical work to be important in students’ science education, but it can be hampered by a variety of factors. One of the key problems is that practicals tend to be overloaded, with too much being attempted in one activity. Students end up seeing practicals as the ‘fun’ part of the lesson, mostly because they aren’t sitting and thinking hard about difficult chemistry concepts, and it provides an opportunity to chat with their mates.

The idea of the ‘hands-on, minds-on’ practical has been around for some time. However, despite how well put together these suggested practicals are, I still too often find students not getting past ‘following the recipe’ and asking me ‘what do I do next?’

One possible reason for this, as cognitive load theory describes, is the split-attention effect, where students have to refer to two or more sources of information to understand how to proceed. In a practical session, an example would be a list of written instructions alongside a diagram of the apparatus. Because the information comes in two distinct forms, the student has to switch back and forth between them to work out what they have to do practically and then measure or observe. This raises the level of extraneous cognitive load, leaving less room in working memory for the development of understanding (germane load).

To try to combat this, I have been adapting some of the practical activities I use with various classes to integrate the diagrams and the textual instructions into ‘integrated-instructions’.

The first use of integrated-instructions was for the classic properties of halogens practical with a Year 9 group (13-14 year olds). Stage 1 was a demonstration, used to explain how the integrated-instructions worked. Stage 2 they carried out in a dimple tile. Stages 3 and 4 we didn’t get time to complete (which incidentally spurred me to get on with ordering the dropper bottles for my continued introduction of microscale chemistry to the department).

[Image: integrated-instructions for the halogen displacement practical]

I then tried out the method with a Year 7 class (11-12 year olds) doing another classic, the melting/freezing characteristics of stearic acid. While this practical is criticised in some quarters for being fairly dull for students, I find it useful for a number of reasons. I’m still training up this young group of students, so it helps to have them focus on a practical task that doesn’t require much more than safe working and measurement. I can get around the class assessing their practical competence, and spot those who aren’t controlling a natural tendency to wander away from their experimental setup.

[Image: integrated-instructions for the melting/freezing of stearic acid practical]

Next was a simple thermochemistry practical, mixing two substances and measuring the temperature change. This was a Year 10 group (14-15 year olds) with a wide range of attitudes towards chemistry. Several of them share the attitude that practical work is the ‘easy’ lesson, with plenty of opportunities to chat rather than learn new skills and focus on data collection.

[Image: integrated-instructions for the thermochemistry practical]

This session went well, with the students self-correcting their practical work in their groups and producing usable data in a reasonable time. I set them off at different points on the list of substances to work through, so we could pool their data.

Following some useful conversations on Twitter, and being awarded a small research grant from the RSC Chemical Education Research Interest Group (working with my mentor Suzanne Fergus), I have started engaging more with the literature to see what has been done before, and take lessons on how to improve my adaptations. (A big thank you to my fellow Fellow Naomi Hennah for some useful pointers).

Haslam and Hamilton (2009) investigated the use of integrated-instructions at secondary school level. They produced a practical task of setting up a power pack, bulbs and a voltmeter. The control group’s task used written instructions, and this group had the equipment available to look at before the practical activity. The experimental group had integrated-instructions, which included photos of the equipment and diagrams with the instructions integrated.

[Image: Haslam and Hamilton’s integrated-instructions for the circuit task]

The effectiveness of the integrated-instructions was assessed in various ways, including speed of task completion, self-reported ease of the task, and understanding of the underlying physics as shown by the students’ written conclusions.

Overall, the group with integrated-instructions i) completed the task more quickly; ii) found the tasks easier; iii) had a better understanding of the physics involved.

I think the self-report of how easy the students find the task will be useful. I’m not sure about the level of analysis applied to these data, however. The Likert-like scale data were treated as cardinal numbers, with means and standard deviations calculated. Beyond the excessive precision in these calculated values, the issues with treating ordinal data as cardinal numbers were well highlighted by Stewart Kirton at MICER17.
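To see the difference for myself, here is a minimal Python sketch (using made-up responses, not the study’s data) contrasting the cardinal-style summary with frequency counts, median and mode, which sit more comfortably with ordinal data:

```python
# Minimal sketch: summarising hypothetical Likert-style responses
# (1 = very hard ... 5 = very easy) in two ways.
from collections import Counter
from statistics import mean, median, mode, stdev

responses = [4, 5, 3, 4, 5, 2, 4, 5, 3, 4]  # made-up data

# Treating the codes as cardinal numbers (the approach I'm unsure about):
print(f"mean = {mean(responses):.1f}, sd = {stdev(responses):.1f}")

# Treating them as ordered categories instead:
print(f"median = {median(responses)}, mode = {mode(responses)}")
print("frequencies:", dict(sorted(Counter(responses).items())))
```

If two groups were being compared, a non-parametric test such as Mann-Whitney U would be the ordinal-friendly equivalent of a t-test.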

An earlier piece of research along similar lines was carried out by Dechsri, Jones and Heikkinen (1997). (Another thanks to my research critical friend Michael Seery for originally bringing this one to my attention.) This was another extensive study in terms of the data collected on students’ responses to integrated-instructions. The study sought to investigate achievement (cognitive outcomes), student attitudes to practical work (affective outcomes) and physical skills development (psychomotor outcomes).

[Image: example from the Dechsri, Jones and Heikkinen (1997) laboratory manual]

Four practicals in a laboratory manual were adapted in the integrated-instructions style.

In general, students with the integrated-instructions achieved better in some areas of interpretation and comprehension of practical work (including rates), had a more favourable attitude to laboratory work (although they weren’t more enthusiastic about it) and showed greater manipulative and organisational skills.

In addition to the outcomes of the experiment, the authors set out some useful guidelines that they followed in developing the integrated-instructions:

  • Characteristics of the instructions
    • Clear objectives & short introduction
    • Simple precise language
    • Directing practice as required
    • Sequencing of steps.
  • The diagrams followed these criteria
    • Pictures illustrating new equipment
    • Diagrams illustrating construction of apparatus and use of correct procedures
    • Diagrams showing procedural sequences.

I’ve also been thinking about the effectiveness of integrated-instructions across my teaching groups’ age range (11-18), and when and where they will be most effective. A lot of the work on this style of practical instruction seems to be at college level (18+). I have noted that techniques that are effective for novices can become ineffective, or even inhibitory, for experts (Paas et al., 2000; the ‘expertise reversal effect’). This is perhaps not surprising: diagrams of apparatus will potentially become redundant information, and therefore actually add to extraneous cognitive load (Cook, 2006).

Movement from novice to expert will happen at different, non-linear rates within any classroom, which has implications for using one instructional method with a whole class. Providing instructions at different levels of sophistication may be a way to allow students to work at their level of expertise. Cook makes a further point that interpreting graphical information has social aspects, and that discussion amongst students about the meaning of the graphics is an important part of forming understanding (possibly linking in nicely with Naomi’s research).

Feedback from Twitter on posted images of the integrated-instructions included adding tick boxes to each instruction so that students can keep track of what they have completed, and arranging all the instructions in sequence (clockwise or anticlockwise).

The latest data

I started to implement the ideas from the feedback and the published work with the Year 10 group, who have started their study of reaction kinetics. I wanted to train them up in the techniques used to measure rates (skills development) before introducing too much collision theory and the factors affecting rate. The intention was for them to carry out two skill-building practicals in the same lesson so they could make direct comparisons. As it turned out, the practicals were completed on different days.

The written instruction-with-diagram ‘mass loss’ practical was carried out first. The ‘disappearing cross’ integrated-instructions practical included sequencing of the steps in order (anticlockwise in this case), with tick boxes for the students to keep track of their progress.

[Image: instructions for the ‘mass loss’ and ‘disappearing cross’ rates practicals]

All students managed to complete both practicals in the time available. Anecdotally, the students sought more practical help during the ‘mass loss’ practical, struggling with the concept of weighing everything together before and after the reaction to find the mass lost (i.e. that the difference between the total mass at the start and the total mass at the end gives the mass of gas that has escaped).

The ‘disappearing cross’ practical went well, with all students engaging with the practical, and a couple were heard directing their peers back to the integrated-instructions when they started asking me the classic ‘what do I do now?’

I asked the students to write a brief evaluation of the two experiments, and I received some positive feedback:

  • “I found the second experiment easier because it told you what you needed to put in where and it was short sentences that were easier to understand.”
  • “I found myself not asking as many questions and I found that the instructions were clearer.”
  • “Easy to keep track and make notes.”
  • “It had clear steps and what order to do things in.”

So, what’s next?

My research proposal to CERG was:

I intend to apply an action-research type methodology:

  • make changes to ‘standard’ practical activities as provided from publicly available sources such as examination boards, RSC LearnChemistry and CLEAPSS;
  • gather data on the effectiveness of the practical activities in the form of
    • teacher perception of student engagement – a reflective diary written on the day of the practical
    • student progress through the practical activities – percentages completing/partially completing practical
    • ability of students to answer questions relevant to the activities – assessment of student responses to questions
    • student feedback on the activities – simple questionnaire
  • analysis of these data would then feed into the modifications made to future practical work (a rough sketch of how such data might be tabulated follows this list);
  • recruitment of another school to trial 2-3 of the practicals during the development process, to get external validation of my findings/conclusions.
  • The expected outcomes of the project would be any general rules for improving the effectiveness of practical work through ‘integrated diagrams’, and a set of modified practical activities available to other teachers.
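As a rough idea of what the quantitative side of this might look like, here is a minimal Python sketch (the class data, labels and scale are hypothetical) tallying completion percentages and questionnaire response frequencies for a single practical:

```python
# Minimal sketch with hypothetical data: percentage of groups completing,
# partially completing or not completing a practical, plus frequency counts
# for one Likert-style questionnaire item (1-5 scale).
from collections import Counter

completion = ["completed", "completed", "partial", "completed",
              "not completed", "completed", "partial", "completed"]
questionnaire = [4, 5, 4, 3, 5, 4, 2, 4]

counts = Counter(completion)
total = len(completion)
for outcome in ("completed", "partial", "not completed"):
    print(f"{outcome}: {100 * counts[outcome] / total:.0f}%")

print("questionnaire frequencies:", dict(sorted(Counter(questionnaire).items())))
```

Keeping the raw frequencies alongside the percentages should make it easier to compare the same practical across different classes, and with the partner school.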

My next practical session with the Year 10 group will be taking the disappearing cross practical further to look at the effect of concentration on rates.

[Image: practical sheet for the effect of concentration on rate, page 1 of 2]

[Image: practical sheet for the effect of concentration on rate, page 2 of 2]

I have included the developing ideas so far in the practical sheet, so I can start to more systematically collect data on the effectiveness of the method of instruction.

If you’d be interested in discussing this further, do drop me a line via Twitter @dave2004b.
