A quick review of the latest edition of Impact

A chemical problem solving metacognition prompt sheet.


And the review…




Ben Rogers’ “Big Ideas in Physics” – a review

I came across Ben Rogers in the last couple of years on Twitter. He has written some interesting blogs in this time, with a strong focus on Physics teaching and, latterly, on aspects of maths in Physics. His writing is always readable, and my interest was particularly piqued when I saw his story map on Physics last August and the mention that he was writing a book for novice Physics teachers.

Having taught some GCSE and A-level Physics in the last few years, I don’t consider myself a novice. However, there was always the feeling that I hadn’t spent enough time doing it to really know the stories of the subject well, or the really good demonstrations and explanations which can help students learn the subject.

When Ben’s book, “The Big Ideas in Physics and How to Teach Them”, was finally published, I ordered right away and managed to finish it over this weekend. It was definitely worth the wait, and the few hours of reading and thinking time.

Chapter Zero (a nice allusion to the Zeroth Law of Thermodynamics, I’m guessing) is a clear, succinct and useful summary of the current zeitgeist around how cognitive science can help students learn. I’ve been working my way into these ideas over the last couple of years, so while it wasn’t all new to me, the clear exemplars and discussion make it a useful chapter by itself.

There then follow five chapters, each covering one of the Big Ideas of Physics: electricity; forces at a distance; energy; particles; the universe. The format is the same for each chapter (reducing the reader’s cognitive load, perhaps). The important stories are told of the development of our understanding of the concepts, highlighting the key players and also the conflict that makes scientific advances as much a ‘soap opera’ as many other parts of our human development.

Next comes a discussion of the key misconceptions students will probably hold. Knowledge of these is always useful before teaching a subject for the first time, or after some time away. While the misconceptions tend to surface anyway within teaching, having them explicit at the start helps shortcut some of the difficulties encountered. I like how Ben’s pragmatism really shines through in these sections. For example, in his discussion of how students’ misconceptions may never really go away – they are too well embedded in long-term memory. Rather, our job is to ‘crowd out’ the misconceptions through exhaustive exposure to problems and examples of the accepted understanding of the physics. Also his line on understanding energy: ‘If Richard Feynman says we don’t understand it, we don’t understand it.’ (This also reminded me of one of my favourite Feynman interviews, on magnets.)

Ideas about what and how to teach are presented, with key experiments and demonstrations. Many of these are familiar to anyone who has taught some physics, and will be commonly found in the commercial schemes of work that are widely available. However, there are a few I hadn’t come across, making for a fun hour with my kids at home on a Sunday afternoon.


A range of archetypal questions is presented which can provide the foundation of the bulk of the work students will need to do to gain confidence and competence in the concepts. Suggested class activities draw strongly on ideas from cognitive science/psychology, focusing on those that will have the most impact on student understanding. ‘Similar/different’, goal-free and elaboration tasks are commonly suggested in each section – again, the repetition helps with our cognitive load!

As a chemist who teaches some physics, this book has a dual use for me. It will be my go-to recap guide before I launch into teaching a particular topic. More than that, four of the five concept chapters underlie the teaching of concepts in chemistry, particularly energy and particles. While teaching physics is inevitably more challenging than teaching chemistry for me, I think it has improved my chemistry teaching over the years. I have a deeper appreciation of the concepts the students are encountering in their physics lessons, and it helps me to build on these in their chemistry lessons. I think I have gained, and continue to gain, a deeper understanding of how to teach chemistry effectively through this ‘cross-fertilisation’ of ideas.

In summary – this is a great introductory text for those new to teaching 11-18 physics. However, as with any well-written pragmatic text, it is also useful to those who have been teaching physics for a while but sometimes don’t feel that they quite get how to teach it well. The zeroth chapter is of use to anyone in the classroom, regardless of how long they have been there, and the stories are concise and useful signposts for further reading. Finally, the exemplified learning activities are well crafted and can be lifted directly into the classroom. Overall, a well-written and useful text which I’m happy is part of my collection.


Feedback on the integrated-instructions sheets

I’ve had a chance to look at the data I collected from the four practical activities I’ve run with three groups using integrated-instructions.

The emerging themes are:

  • all students completed the activities and none found them hard;
  • most students found the clarity of the instructions helpful in understanding what they were doing;
  • some students found the ability to visualise what they were doing helped with their confidence;
  • there was a mixed picture on how well students understood what they had done / observed.

At this stage, I’d be interested to get some feedback from other teachers in different schools about how their students find the instruction sheets – any general themes, and also any comments about the sheets themselves.

Below are links to the four practical sheets I have used, and links to the original practicals.

Anything you do share will be fully anonymised in future communications/publications. Please check that your students consent to you sharing any specific comments with me beforehand.

If you do have feedback, please contact me via djpaterson@aldenham.com


Have we lost ‘scientific literacy’?

Reading time – 4 minutes

I have been re-reading some old texts recently to try and help clarify a vague sense of disquiet I have about the new GCSEs. These have been James Williams’ ‘How Science Works’ and the Beyond 2000 report.

I trained in 2006-7 and completed my NQT year at a school that was one of the pilot schools for the 21st Century Science course. I don’t think I fully appreciated how the course was structured during that year. I then went on to teach AQA Science for six years, then 21st Century again for a year before taking a break from teaching. I sat my A-levels in 1996, following the first iteration (I think) of the Salters Chemistry course, and then taught the third iteration of the course for six years. The 21st Century Science GCSEs were strongly driven by the Beyond 2000 report, and Salters Chemistry can be considered a descendant of the Nuffield courses. I also had a hand in producing resources for the new 21st Century Chemistry course while I was working at OCR. All of this is to say, I have had some experience on both sides of the whiteboard in learning and teaching context-led courses.

I think my main concern stems from Recommendation 2 from the Beyond 2000 report:

At Key Stage 4, the structure of the science curriculum needs to differentiate more explicitly between those elements designed to enhance ‘scientific literacy’, and those designed as the early stages of a specialist training in science, so that the requirement for the latter does not come to distort the former.

This seemed an eminently sensible recommendation when I first read it years ago, and it still seems sensible now.

I will admit that I sometimes found the 21st Century course frustrating and hard to teach. The ‘Core’ science course (the ‘scientific literacy’ part) seemed to mark an interruption in the teaching of science from KS3, and then onto KS4 (in Year 10 with the ‘Additional’ then ‘Further Additional’ Science). However, I think the way we taught it (probably driven a lot by how the assessments worked in the days of modular exams) wasn’t optimal. Re-reading ‘Beyond 2000’, it seems the intention was for the ‘scientific literacy’ course to be taught in parallel with an academic or vocational course (Additional Science or Additional Applied Science etc), rather than one after the other.

The parts of the 21st Century courses that made a lot of sense were the explanatory stories. These were interesting, captivating for the students, and made the teaching of the wider aspects of science (Ideas About Science) much more authentic and understandable to the students. The provision of fully resourced packages (text-books and related and coherent resources) helped here. The assessment, while sometimes cumbersome with the Case Study / Data Analysis / Investigation, at least provided some focus for these Ideas About Science. I know well the arguments for the removal of this Controlled Assessment, but I worry the assessment pendulum may have swung too far by going to fully external written assessments.

Similar arguments can be made with the Salters A Level Chemistry course, prior to the latest iteration. It was a coherent, spiral curriculum, well resourced, with a significant and well respected Individual Investigation, which the students gained a huge amount from.

So where are we now? Admittedly, I’m teaching different courses now (AQA Sciences and OCR A Chemistry A Level), so the comparison isn’t exact. However, with the 100% defined content of the GCSE, I imagine there are similar issues with teaching the new OCR GCSE Sciences. I’ve been teaching the new course for less than a year, while others have been at it since its first-teach in 2016 (and probably before, given the reality of the three-year GCSE these days). However, it has all seemed so very content-heavy. I recognise Working Scientifically is written into the specification, and there are links made to specification statements. However, I can’t see much coherence yet in how I will teach and assess the full range of Working Scientifically across the years. I also recognise there remain opportunities to incorporate the ‘explanatory stories’ throughout the course. However, it all feels too pushed for time just to ‘get through’ the specification. (The recent innovation of stories+resources from EiC is proving a useful model of resourcing.)

Perhaps this will all work itself through over the next couple of years. Any curriculum reform will inevitably lead to extra work and a period of uncertainty while we get used to the new requirements. However, I remain worried we may have lost something important. The new GCSEs will probably be fine for those carrying on to post-16 science study. For the rest, the majority of our pupils, I’m not yet convinced that the new GCSEs as they are currently set up will be helpful in their journey towards being scientifically literate citizens.

I would be interested in your thoughts.


#RSCPoster #RSCEdu

An online poster competition from the RSC.

I entered the fourth annual #RSCPoster competition last week, and am very proud to say that I won the Secondary and Further Education section.

20180228 RSCPoster MICER Poster v2

This is an international online competition, where delegates post a poster on their research projects under a range of categories, including Chemical Education, Nanoscience and Chemical Biology.

I designed a poster on my ‘integrated-instructions’ practical research project. I posted at the start of the ‘conference’ which ran from 9am Tuesday 6th March to 9am Wednesday 7th March 2018.

During the day, I checked in on my poster tweet to see who had left comments, and to reply to these and any questions. There was also a little time to look at other people’s posters, alongside a pretty full teaching day! It was very gratifying to receive so many Likes, Retweets and comments.


After the conference, the judges conferred and picked the winners. In the Chemistry Education category, there was one winner for Secondary/Further Education, and one for Higher Education.

The whole process was very interesting. I’ve only done one other online poster before, for MICER17 last year. I used some of the feedback from that to try and improve for this poster. Suzanne Fergus and Michael Seery, my CERG Research Fellowship mentors, gave me some very useful feedback, which I think improved the finished article.

I think among the most useful advice was on what to put in, and what to leave out. There is a balance to be struck between making an online (and paper) poster readable and accessible, but also including enough detail to allow others to ask relevant and probing questions.

Overall, a very worthwhile event, and one I’ll hopefully take part in again in future. I would recommend others who are carrying out any level of classroom research to think about entering as well.


Integrated instructions – templates

The integrated instructions practical equipment templates.

This half term is the main data collection half term for my project. I’m running practicals with my two Year 9 classes and my Year 10 class. The students seem to be responding well to the instructions, and they seem confident in what they are doing in the lab.

Another teacher has asked for the templates so they could make their own practicals. I’m happy to share these and the file is linked below.

Integrated instructions – templates – David Paterson – v0.01.pptx


The instructions are all made in Powerpoint so they’re not fancy, but they do the job!

Please note these are released under a Creative Commons licence (Attribution-NonCommercial-ShareAlike).


Cognitive load and practical work research – an update

An update on my thinking and research into improving the effectiveness of practical work through integrated-instructions.

Estimated reading time – 9 minutes.

As I wrote about in my recent EiC article, I am particularly interested in improving the effectiveness of practical work in my classroom. This interest actually goes back right to the start of my teaching career, and I completed my MEd project on this. Sadly, I never made the time to write this up in a peer-reviewed journal.

Cognitive load theory has resonated strongly with me since I came across it a year or so ago, as it seems to provide a useful framework in which to fit my empirical observations about the effectiveness, or not, of practical work in my classroom.

I consider practical work to be important in students’ science education, but it can be hampered by a variety of factors. One of the key reasons is that practicals tend to be overloaded, with too much being attempted in one activity. Students end up seeing practicals as the ‘fun’ part of the lesson, mostly because they aren’t sitting having to think hard about difficult chemistry concepts, and it can provide an opportunity to chat with their mates.

The idea of ‘hands-on, minds-on’ practical work has been around for some time. However, despite how well put together these suggested practicals are, I still find the students too often not getting past ‘following the recipe’ and asking me ‘what do I do next?’

One possible reason for this, as cognitive load theory discusses, is the split-attention effect, where students have to refer to two or more sources of information to understand how to proceed. In a practical session, an example would be a list of written instructions alongside a diagram of the apparatus. By providing information in two distinct forms, the student has to switch back and forth between the two to develop an understanding of what they have to do practically and then measure/observe. This raises the level of extraneous cognitive load, leaving less room in working memory for the development of understanding (germane load).

To try to combat this, I have been adapting some of the practical activities I use with various classes to integrate the diagrams and the textual instructions into ‘integrated-instructions’.

The first use of integrated-instructions was for the classic properties-of-halogens practical with a Year 9 group (13-14 year olds). Stage 1 was a demonstration, used to explain how the integrated-instructions worked. Stage 2 they carried out in a dimple tile. Stages 3 and 4 we didn’t get time to complete (this incidentally spurred me to get on with ordering the dropper bottles for my continued introduction of microscale to the department).


I then tried out the method with a Year 7 class (11-12 year olds) doing another classic, the melting/freezing characteristics of stearic acid. While this practical is criticised in parts for being fairly dull for students, I find it useful for a number of reasons. I’m still training up this young group of students, so I find it useful to have them focus on a practical task that doesn’t require much more than safe working and measurement. I can get around the class assessing their practical competence, and spot those who aren’t controlling a natural tendency to wander away from their experimental setup.


Next was a simple thermochemistry practical, mixing two substances and measuring temperature change. This was a Year 10 group (14-15 year olds) with a wide range of attitudes towards Chemistry. There is a common attitude in several of them that practical work is the ‘easy’ lesson where they have plenty of opportunities to chat rather than learn new skills and focus on data collection.


This session went well, with the students self-correcting their practical work in their groups, and producing useable data in a reasonable time. I set them off at different stages on the list of substances to get through, so we shared their data.

Following some useful conversations on Twitter, and being awarded a small research grant from the RSC Chemical Education Research Interest Group (working with my mentor Suzanne Fergus), I have started engaging more with the literature to see what has been done before, and take lessons on how to improve my adaptations. (A big thank you to my fellow Fellow Naomi Hennah for some useful pointers).

Haslam and Hamilton (2009) investigated the use of integrated-instructions at secondary school level. They produced a practical task of setting up a power pack, bulbs and voltmeter. The control group had written instructions, and had the equipment available to look at prior to the practical activity. The experimental group had integrated-instructions, which included photos of the equipment and diagrams with the instructions integrated.


The effectiveness of the integrated-instructions was assessed in various ways, including speed of task completion, self-reporting of ease of the task, and understanding of the underlying physics as shown by the students’ written conclusions.

Overall, the group with integrated instructions i) completed the task quicker; ii) found the tasks easier; iii) had a better understanding of the physics involved.

I think the self-report of how easy the students find the task will be useful. I’m not sure about the level of analysis applied to these data, however. The Likert-like scale data were treated as cardinal numbers, with calculated means and standard deviations. Beyond the excessive precision in these calculated values, the issues with treating ordinal numbers as cardinal numbers were well highlighted by Stewart Kirton at MICER17.
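A quick sketch of the issue (with made-up responses, not data from the study): averaging the ordinal codes produces a value between scale points that no student could actually select, whereas the median and mode stay meaningful for ordinal data.

```python
import statistics

# Hypothetical Likert responses: 1 = "very hard" ... 5 = "very easy"
responses = [1, 2, 2, 3, 5, 5, 5, 5]

# Treating the ordinal codes as cardinal numbers gives a mean of 3.5 –
# a value between two scale points that no student could choose.
print(statistics.mean(responses))    # 3.5

# The median and mode are the conventional summaries for ordinal data.
print(statistics.median(responses))  # 4.0
print(statistics.mode(responses))    # 5
```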

An earlier piece of research was carried out by Dechsri, Jones and Heikkinen (1997) along similar lines. (Another thanks to my research critical friend Michael Seery for originally bringing this one to my attention.) This was another extensive study in terms of the data collected on students’ responses to integrated-instructions. The study sought to investigate achievement (cognitive outcomes), student attitudes to practical work (affective outcomes) and physical skills development (psychomotor functions).


Four practicals in a laboratory manual were adapted in the integrated instructions manner.

In general, students with the integrated-instructions achieved better in some areas of interpretation and comprehension of practical work (including rates), had a more favourable attitude to laboratory work (although they weren’t more enthusiastic about it) and showed greater manipulative and organisational skills.

In addition to the outcomes of the experiment, the authors produced some useful guidelines, drawn from how they developed the integrated-instructions:

  • Characteristics of the instructions
    • Clear objectives & short introduction
    • Simple precise language
    • Directing practice as required
    • Sequencing of steps.
  • The diagrams followed criteria
    • Pictures illustrating new equipment
    • Diagrams illustrating construction of apparatus and use of correct procedures
    • Diagrams showing procedural sequences.

I’ve also been thinking about the effectiveness of integrated-instructions across my teaching groups’ age range (11-18), and when and where they will be most effective. A lot of work on this style of practical work seems to be at college level (18+). I have noted that techniques that are effective for novices can become ineffective, or even inhibitory, for experts – the ‘expertise reversal effect’ (Paas et al., 2000). This is perhaps not surprising. Diagrams of apparatus will potentially become redundant information, thereby actually adding to extraneous cognitive load (Cook, 2006). Movement from novice to expert will happen at different, non-linear rates within any classroom context. This has implications for the use of one instructional method for a whole class. The availability of instructions at different levels of sophistication may be a way to allow students to work at their level of expertise. Cook makes a further point that interpretation of graphical information has social aspects, and discussion amongst students of the meaning of the graphics is an important part of forming understanding (possibly linking in nicely with Naomi’s research).

Feedback from Twitter on posted images of the integrated-instructions included the idea of adding tick boxes on each instruction to allow the students to keep track of what they had completed, and having all the instructions in sequence (clockwise or anticlockwise).

The latest data

I started to implement the ideas from feedback and published work with the Year 10 group, who have started their study of reaction kinetics. I wanted to train them up in the techniques used to measure rates (skills development) before introducing too much collision theory and factors affecting rate. The intention was for them to carry out two skill-building practicals in the same lesson so they could make direct comparisons. As it turned out, they were completed on different days.

The written instruction-with-diagram ‘mass loss’ practical was carried out first. The ‘disappearing cross’ integrated-instructions practical included sequencing of the steps in order (anticlockwise in this case), with tick boxes for the students to keep track of their progress.


All students managed to complete both practicals in the time available. Anecdotally, the students sought more practical help during the ‘mass loss’ practical, struggling with the concept of weighing everything together before and after the reaction to find the mass lost.

The ‘disappearing cross’ practical went well, with all students engaging with the practical, and a couple were heard directing their peers back to the integrated-instructions when they started asking me the classic ‘what do I do now?’

I asked the students to write a brief evaluation of the two experiments, and I received some positive feedback:

  • “I found the second experiment easier because it told you what you needed to put in where and it was short sentences that were easier to understand.”
  • “I found myself not asking as many questions and I found that the instructions were clearer.”
  • “Easy to keep track and make notes.”
  • “It had clear steps and what order to do things in.”

So, what’s next?

My research proposal to CERG was:

I intend to apply an action-research type methodology:

  • make changes to ‘standard’ practical activities as provided from publicly available sources such as examination boards, RSC LearnChemistry and CLEAPSS;
  • gather data on the effectiveness of the practical activities in the form of
    • teacher perception of student engagement – a reflective diary written on the day of the practical
    • student progress through the practical activities – percentages completing/partially completing practical
    • ability of students to answer questions relevant to the activities – assessment of student responses to questions
    • student feedback on the activities – simple questionnaire
  • analysis of these data would then feed into the modifications made to future practical work;
  • recruitment of another school to trial 2-3 of the practicals during the development process to get external validation of my findings/conclusions.

The expected outcomes of the project would be any general rules for improving the effectiveness of practical work through ‘integrated diagrams’, and a set of modified practical activities available to other teachers.

My next practical session with the Year 10 group will be taking the disappearing cross practical further to look at the effect of concentration on rates.



I have included the developing ideas so far in the practical sheet, so I can start to more systematically collect data on the effectiveness of the method of instruction.

If you’d be interested in discussing this further, do drop me a line via Twitter @dave2004b.


Using microscale practicals in the classroom

Reflections on introducing microscale chemistry practicals in a new school

Reading time: 6-7 minutes

Having spent six weeks in the new school, and trying out microscale practicals with a variety of different year groups, I thought it was time to put some words down on my experiences so far. I have written previously about this in Education in Chemistry with a more theoretical and personal historical perspective, so this post is more on the actuality of introducing a new style of (or just new) practicals to a school and my classroom. I had made some use of microscale in previous teaching jobs, and spent a lot of time at OCR thinking and writing about it. I am now keen to give microscale a good go at the new school for several reasons.

Firstly, I think the more limited equipment and smaller required field of observation can reduce the extraneous cognitive load on the students, giving them more free working memory to allow for interpretation of observations at the time they are making them.

Secondly, my current school has 45-minute lessons, and I get very few double lessons over the fortnight timetable. Having taught nine years of hour-long lessons, I’m still adjusting to this significant reduction in the teaching period. I need to be more organised about the preparation the students do before the practical session, e.g. having them read practical sheets / watch videos of the techniques etc.

Thirdly, I advocated microscale strongly while at OCR, so am very keen to see the benefits across a wide number of groups. While I’m not teaching either the Gateway or 21st Century specifications, I’m making use of these practicals nevertheless. I think one of the major advantages of the new specifications is the freedom teachers have in choosing which practicals they use throughout the course, so I’m grasping the opportunity with both hands!

Lastly, it fulfils a personal interest in maintaining a toe in the waters of research. I presented some intentions at MICER17, and now that I’m back in school I can do something about investigating the effectiveness or otherwise of microscale, rather than just theorising about it.

So far, I’ve tried out five microscale practicals:

1. Chromatography of leaf chloroplasts. I did this with two Year 9 groups, a higher ability and a lower ability group. The chromatograms ran relatively successfully, but the main problem was the size of the spots – most students made these far too large, so any distinction between the components in the mixture was lost. I found extraction of the chloroplasts from the leaves took much longer than I’d predicted, so I suggested to a colleague who also tried it out that he do the extraction prior to the lesson and then have the students just spot and run the chromatograms – this led to a more efficient practical session in his class.


2. Microscale synthesis of copper sulfate – this was partially successful: half the group managed to follow the written instructions without additional help and got to the copper sulfate solution quickly and efficiently. The other half (the same group as the titration, below) seemed very hesitant to read instructions carefully, or more than once, and were persistent in asking me what to do next. We did a larger-scale version as well, as a context for crystallising solutions over a water bath and filtration with fluted filter papers, and got some lovely crystals.


3. Electrolysis of copper sulfate to aid consolidation of the discussion of industrial copper production. I replaced the CuCl2 with CuSO4, so we didn’t have to deal with chlorine production, and as such the potassium halide and litmus paper weren’t necessary. The majority of the students set up the electrolysis apparatus quickly and got to relevant observations within about 5 minutes of the ‘go’. I was also using this as a test practical to see how practically competent the Year 11 group were (having just taken them over), and how well trained they were in cooperative working. Some of the students were spontaneously talking about redox when observing the copper and oxygen production, and this led to a fuller discussion of electrolysis than I’d planned.


4. Metal displacement reactions to consolidate work on the reactivity series, with Year 8 students. I didn’t give the students the written instructions for this activity, rather demonstrated the setup with an IPEVO Point to View camera (a great investment I think for any teacher). I used the drop-setup sheet in a plastic wallet and the majority of the students managed to set up accurately after the projected on-screen demonstration. There were some who managed to get the drops wrong the first time, despite the printed chemical formulae on the sheets, but the small scale made clearing the sheet quick and easy, and they didn’t lose too much time. Most made good observations, and some managed to articulate the link with the reactivity series immediately, pointing to the live observations as evidence when questioned.


5. Gravimetric titration – I used ‘The Vinegar Dilemma’ as a consolidation activity for my Year 10 Combined Science group. While they aren’t required to know titration theory, the practical links directly with the required understanding of concentration. They had also had a string of rather dry calculations lessons previously, and they and I needed some hands-on learning. There was a distinct split in this group. We read through the instructions together, I demonstrated the setup and then ran the first titration with them observing (12 in the group). Half of the group then got on with no further requests for help and gained fairly accurate results. The other half were persistently asking ‘what do I do now?’, ‘what’s next?’, even while holding the instruction sheet in their hands. This group is still relatively new to me, so we’re still building the class relationship, and I’m working hard to instil some more self-confidence in their abilities. I think this latter half were slowly getting the message that all the necessary instructions were to be found on the printed sheet, but were frustrated with what they perceived as me ‘not helping them’. I’ve taught groups like this in the past, and I’m sure their self-reliance will improve with time.


I think the most significant thing I’ve confirmed for myself so far is the importance of trialling new resources before widespread use in the classroom. There are already adaptations I’ll be making to some of these practicals before using them again. Lots of resources are produced by many individuals and organisations, but live testing with students must be critical to their development and long-term success.

So what next? Well, I’ve ordered myself some dropper bottles to make up the boxes for ion tests as a start. We’ve got a set of the small glass vials which have been useful. I’ve got plenty of other practicals I’d like to try out, but I’m feeling a little swamped by the process of starting in a new school, establishing a reputation with colleagues and the students, and taking on adult volunteering in the school’s CCF RAF section.

I intend to be more systematic in my evaluation of the microscale practicals I use before I start rolling them out across the department. I’d be interested to hear how other people have done this in their departments, and the barriers they faced in uptake by colleagues.


Quizzing and random question selector

UPDATE 8/10/17 A 10-question version is available here:



Reposting here so there is a permanent page for my quizzing Excel file and any updates over the year.

I would be interested in your feedback on how it works out in the classroom.

Good luck all for the new academic year.




This spreadsheet helps to organise questions and generate quizzes.

The intention is that for each lesson a quiz of six questions is used: three new questions based on the last lesson’s learning, and three random questions from all previous lessons.

For each new lesson, type in the next three numbers in column E, e.g. 013, 014 and 015 for Lesson 3. Then move to the ‘Questions’ tab and type in your questions and answers in the appropriate row.

You can pre-populate your questions in the ‘Questions’ tab – as long as you haven’t referenced future questions in your Planner (column E), they won’t be selected in the random question generator.

To select the three random questions, press F9, which will generate three random question numbers (yellow box) (up to the most recently used question in your planner). Type, or copy, these into the remaining three rows for the lesson. (Note: if you copy/paste from K2-4, you’ll need to use Paste Special/Values.)

Note – the random question selection includes an aspect of weighting. All questions are weighted such that the more often and more recently a question has been used, the less likely it is to be selected in the random question generator. This should help improve the coverage of all the questions over the year.
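The workbook’s exact weighting formula isn’t given, but the idea can be sketched in Python. The penalty function below is my own illustration of the described behaviour (more frequent and more recent use lowers a question’s chance of selection), not the formula actually coded in the spreadsheet:

```python
import random

def pick_random_questions(history, n_available, k=3):
    """Pick k distinct question numbers from 1..n_available.

    history: question numbers in the order they were used (most recent last).
    Each past use of a question reduces its weight, with recent uses
    penalised more heavily, so little-used questions surface more often.
    """
    weights = []
    for q in range(1, n_available + 1):
        uses = [i for i, used in enumerate(history) if used == q]
        # Illustrative penalty: recent uses (high index) count for more.
        penalty = sum((i + 1) / len(history) for i in uses) if history else 0
        weights.append(1.0 / (1.0 + penalty))
    picked = []
    while len(picked) < k:
        q = random.choices(range(1, n_available + 1), weights=weights)[0]
        if q not in picked:  # no duplicates within one quiz
            picked.append(q)
    return picked
```

For example, `pick_random_questions([1, 2, 3, 1, 2], 10)` returns three distinct numbers from 1–10, with questions 1 and 2 (used twice, recently) the least likely to reappear.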

The ‘LessonQs’ and ‘LessonQ&A’ tabs will auto-update with the relevant questions and answers (i.e. the last six in Column E). This can then be displayed to the class.

Separately, a completely random six question quiz can be made with the ‘6RandomQs’ and ‘6RandomQ&A’ tabs – just press F9.

Separately, you can generate a quiz based on six selected questions by filling in the ‘Six selected question’ cells (orange); the ‘6SelectedQs’ and ‘6SelectedQ&A’ tabs will auto-update.

(Note for any Excel-aficionado – the coding is a massive bodge-job – please don’t judge too harshly! I will tidy it up in a later version if this proves useful.)


A response to Wellcome’s concerns about practical work assessment


TES has reported that the Wellcome Trust says

‘removing science practicals from A levels could challenge ‘authenticity’ of qualification’.

I can’t find a press release for these comments, so I’m assuming that this was a direct conversation between Wellcome and TES. Wellcome Trust has well-known concerns over the removal of controlled assessment/coursework (for example here and here, cited about half way down this article, amongst others).

The discussion in the TES article on how the new regime works is pretty accurate: demonstration of competency over 12 practicals (although neglecting that this is a minimum), noting that 15% of marks in the written exam are based on ‘theory and application of practical skills’ (again not noting this is a minimum percentage), and the concerns that there were over the validity and reliability of the previous coursework regime.

The report then runs through the set of concerns that Wellcome has. I find myself disagreeing with a lot of what is said.

‘breaking the link between experiments and grades could cause students to focus exclusively on the assessed aspects of their course, reducing their motivation for practical work’

I find this a rather reductionist view of education, which assumes students are only interested in the final grade and will neglect everything else. I have certainly had some students who are only interested in the final grade (a means to an end, usually those for whom chemistry was the ‘third’ subject), but the majority of my students have studied chemistry broadly and enthusiastically, and were well engaged with their practical work, regardless of whether it was part of a formal assessment.

‘In a time when we know there’s a lot of pressure on the delivery of science, it may be that teachers are less able to prioritise the delivery of practical science – even though they may very much want to do so.’

I’ve only been involved in teaching for a little over a decade and there hasn’t really been a time when there wasn’t pressure on my departments to ‘produce results’ amid ‘tightening budgets’. The very high stakes nature of controlled assessment (at GCSE) and coursework (at A level) often skewed how practical work was used, leaving it overly focused on what was assessed rather than spread broadly across the course. My experience of the new A-level course is that it has allowed teachers to use practical work more authentically in their lessons, and in some cases given them the ‘ammunition’ needed to use more of it. The assessment of practical work throughout the two-year course, and it is a DIRECT assessment of students’ practical skills, is ongoing and can be used formatively. If students are struggling with a practical during an assessed task, teachers can now intervene and re-teach/support/instruct, and then re-assess the students later on in the course. In the old days of coursework, this wouldn’t have been possible as it was all carried out under high-control conditions.

In terms of the actual marks that went towards the final grade, it has always been a small amount. Looking at the last year of coursework tasks for OCR A Level Chemistry A, about 4.5% of the final mark was based on actual practical skills. While the coursework modules accounted for 20% of the final marks, the majority of the questions asked within these papers were analysis and evaluation questions based on the observations/measurements taken during the two practical tasks. (I’m assuming the percentages are broadly similar for other qualifications.) These papers were carried out under exam conditions, just like in the new qualification exams, but internally marked, moderated, then externally moderated, with all the administrative burden and issues with reliability this entails.

We now have the practical endorsement – a holistic judgment of the competence of a student in a wide variety of practical skills, apparatus, and techniques, carried out over the two years of the course. 99% of students have been awarded the endorsement this year, and this is a massive credit to the students, and to their teachers in adapting to a new assessment regime and in delivering what I think is a much more authentic practical experience.

‘downgrading the significance of practicals at A level could make it more difficult for students to decide whether they wanted to go on to pursue science at university’

For all the reasons above, I just don’t believe that the significance of practical work has been downgraded, and again the assumption that if it doesn’t count towards a grade it isn’t significant. For example, the science trips I have taken students on (for example to the Diamond Light Source in Oxfordshire) have had just as much impact on students’ decision to pursue science at university, and this had no direct relevance to their final grade.

A quote from the Wellcome Trust report from February 2017 (https://wellcome.ac.uk/sites/default/files/science-education-tracker-report-feb17.pdf) is given:

‘35% were encouraged to learn science because of practical work’.

This isn’t quite what the report says – the actual statement was ‘enjoying practical work’ as being the second most important factor in encouraging young people to learn science. I think there is a subtle distinction here – enjoyment can encourage learning, but enjoying practical work may mean they like being in science lessons (more than other subjects?) but not necessarily that they are learning from the practical work itself – that’s a whole other topic in itself.

‘Stripping out marked practicals could also challenge the “authenticity of the grade of the A level and whether it’s really reflecting how good you are at doing science”’.

Again, the majority of the coursework marks were based on analysis/evaluation, which can just as well be assessed in the exam hall as under high control in the science lab, with only about 5% of the final mark based on actual practical skills. Having a whole endorsement that is a result of the assessment of practical skills over two years seems a better, more authentic experience. I’ve spoken to quite a few undergraduate lecturers over the last couple of years, and there are plenty of examples of high-grade students who don’t do well in first-year labs, and lower-grade students who excel in the labs. The correlation between practical skills and the ability to attain high grades isn’t clear-cut.

The TES article then concludes with commentary about the variability of the requirements universities place on students to pass the endorsement. There is quite a lot of variation out there – some universities have a blanket requirement on all STEM degrees, some on some courses, and some universities are ‘waiting and seeing’. This is probably not surprising given this large change to the assessment regime, and no doubt universities will reflect on their students’ abilities this year and decide on whether or not to make it a requirement.

I am with the majority of teachers cited in the Ofqual initial research who see the reform as being positive for teaching and learning of practical skills. The new way of assessment is in line with how I have taught science over the years, and it allows students to build up (assessed) competence over time with ongoing support, rather than having to perform under the artificial constraints of coursework/controlled assessment tasks. I think we need to trust that teachers will do their best in whatever circumstances they are in to give their students a broad and balanced science education, including practical work, and not allow the assessment-tail to completely wag the teaching-and-learning dog.


Late to the party – Core knowledge booklets

Core knowledge booklets for Chemistry and Physics

As I’m going back to teaching after two years working at one of the UK exam boards, I’ve started riffling through my folders to get myself back in the swing of things. I came across these booklets that I put together four years ago now for the legacy AQA Chemistry and Physics GCSEs – only C1, C2, P1 and P2 by the looks of things.


I was interested to see what I wrote on the cover of these booklets:

This is the core knowledge that you need to know in this unit. Knowing the information in this booklet will help you to understand the more complex chemistry that you will study, and to express yourself clearly in class, in assessments and in exams. Knowing the information below is an essential FIRST STEP to gaining top grades – but it is not sufficient. Only regular study, discussion and practice of the chemistry will consistently lead to the top grades.

This seems a good summary of what I expected from my students, and what I encouraged the teachers in my departments to expect as well.

The students were issued the booklets at the start of the year, and expected to keep them in their exercise books for reference and regular use. As and when students had completed work in class, I directed them to spend time self- and peer-quizzing using these booklets, and they made for useful resources for homework and revision. A lot of this seems similar to what some others are advocating.

With the rise of Knowledge Organisers and the like, I thought I’d share them again. Please feel free to adapt and use. As and when I update them for my new classes and the new qualifications, I’ll repost.

AQA Chemistry C1 (2011 Core Science)

AQA Chemistry C2 (2011 Additional Science)

AQA Physics P1 (2011 Core Science)

AQA Physics P2 (2011 Additional Science)


Complications and errors in teaching

Commonalities in teaching and medicine; learning from our mistakes and always aiming for perfection.

Estimated reading time: 6-7 minutes


I finished another book by one of my favourite popular-expert authors recently, ‘Complications’ by Atul Gawande. I originally heard about Atul’s work on the BBC radio show Desert Island Discs back in 2015. His ‘Checklist Manifesto’ describes the use of checklists in highly complex professional arenas such as piloting airliners and surgery. In brief, he discusses how many of the things humans do are so complex that no one can hope to complete all that is required without making mistakes, such as forgetting important steps. Aircraft test pilots developed checklists in the 1950s to help them fly increasingly complex planes: as the planes grew more complex, the test pilots kept crashing them because they forgot critical steps, not through lack of expertise but through the limitations of human memory.

The quest for perfection

‘Complications’ discusses Atul’s progression through his medical training, and how the inevitable errors made by doctors are dealt with. Much of what he writes chimes with my own development as a teacher, and in supporting others through their careers.

“… This is the uncomfortable truth about teaching. By traditional ethics and public insistence… a patient’s right to the best care possible must trump the objective of training novices. We want perfection without practice. Yet everyone is harmed if no one is trained for the future.” (page 24)

Here Atul discusses the tension between what we want from our doctors and what the system requires. As individual patients, we want the best possible care from the best doctors. However, a medical system with a sole focus on individual outcomes could never function. The system needs continually to train new doctors. Part of this training is treating individual patients, and the training doctors will make mistakes. New members of the profession need to learn, need to practice, need space to improve.

This has direct corollaries with teaching. As parents, we want the best possible teachers teaching our children. As a teaching profession, we always need new teachers coming in, and they will need to be educated, and they will need to practice. In the process of their development, they will make mistakes and inevitably the education of the children will not be as good as if they were taught by more effective teachers.

The problem with choice

“If learning is necessary but causes harm, then above all it ought to apply to everyone alike. Given a choice, people wriggle out, and those choices are not offered equally…. If choice cannot go to everyone, maybe it is better when it is not allowed at all.” (page 32)

Here Atul discusses how, when his child was in hospital, he insisted on having the attending physician (senior doctor) treat his child, not the resident (junior doctor) who was assigned to his child’s case. He knows the medical system well and so could argue to get the best possible treatment for his child. Other members of the public, who don’t know the system so well, are more likely to accept the medical treatments as given.

As a former head of department, I was most acutely aware of these tensions when assigning teaching groups. Which classes are the NQTs (newly qualified teachers) assigned to; the most experienced teachers; the exam groups (Year 11, Year 13); those most likely to be getting the grades which have the greatest effects on accountability measures (C/D grade borderline groups)? My own solution, and what I think was the most equitable, tended to be that all teachers taught across the ability range and year groups where possible, with support provided to less experienced teachers as required. My thinking ran in line with Atul’s – we can only provide the service (be it medical or educational) based on the skills and abilities of those professionals we have available at the time. The majority of science departments are likely to have teachers with a range of experience, expertise and enthusiasm. If we can’t have all of the students in our departments taught by the most experienced, enthusiastic experts, then the allocation of teachers should be on the most equitable basis possible, without the possibility of ‘special pleading’ for particular children or groups of children. Inevitably, the power of a head of department is not absolute, and some decisions were overruled by senior management who may have had different priorities to mine.

How to deal with mistakes

One of the most powerful passages in the book was the discussion of the Morbidity and Mortality conference (M&M), a place where mistakes in patient care are discussed in open forum – everyone attends from the most junior doctor to the head of department.

“In its way, the M & M is an impressively sophisticated and human institution. Unlike the courts or the media, it recognizes that human error is not generally something that can be deterred by punishment. The M & M sees avoiding error as largely a matter of will – of staying sufficiently informed and alert to anticipate the myriad ways that something can go wrong and then trying to head off each potential problem before it happens. It isn’t damnable that an error occurs, but there is more shame to it. In fact, the M & M’s ethos can seem paradoxical. On the one hand, it reinforces the very American idea that error is intolerable. On the other hand, the very existence of the M & M, its place on the weekly schedule, amounts to an acknowledgment that mistakes are an inevitable part of medicine.” (page 73)

Again, this chimes with the teaching of students. We must always seek to avoid error in what we do when we teach, but have to accept that errors do occur. By exposing these in a safe professional environment, we can all learn from the situation and improve everyone’s ongoing practice. As professionals we should always be seeking to improve our practice, and do the best we possibly can for our students, and recognise that the mistakes we make are learning opportunities not just for our own practice but for that of others. How high stakes accountability, for example graded lesson observations and performance-related pay, can skew this ideology is a topic for another time. Suffice it to say I always had strong reservations about the helpfulness of these techniques in driving good professional improvement.

How do we continually improve?

“No matter what measures are taken, doctors will sometimes falter, and it isn’t reasonable to ask that we achieve perfection. What is reasonable is to ask that we never cease to aim for it.” (page 74)

We are human. We make mistakes. This is inevitable. No set of resources, force of will, system of incentives or sanction will ever get us to a place where teachers teach perfectly and all students learn to their optimum ability. What we can insist on is that all teachers recognise that there is always more to learn, always improvements to be made. And the system needs to recognise that teachers need access to forums to allow them to share and learn from each other.

And this access to forums, I think, is one of the biggest problems facing the teaching profession at the moment. I have delivered training and support across England over the last couple of years, and attendance at events is significantly down on what it was only a few years back. Schools are finding it very hard to release teachers to attend CPD events, and it is quite common now for teachers to use their own free time to attend events such as TeachMeets. I am a big fan of such events, but am always troubled by a system that seems to have normalised an expectation that teacher improvement should come in teachers’ own time.

Can we make improvements? Yes – at a system level, only government can change the funding situation that has driven the reduction in capacity in schools to allow teachers to have time away from the classroom. Government policy can be influenced by engaging with professional bodies like the Chartered College of Teaching, unions and learned societies such as the Royal Society of Chemistry. Yes – at school level, department and school meetings can schedule significant time for discussing good practice and learning from mistakes in a safe professional environment. Yes – at an individual level, by recognising that we aren’t perfect, we won’t always get it right, but that the aim for perfection is a worthy goal, and one we can all work towards across a career.


Teaching shortcuts and when they can trip you up…

Reading time: 4-5 minutes

Along with others of late, including Kristy Turner, Niki Kaiser and Adam Boxer, I have been mulling misconceptions and teaching. Cognitive Load Theory is also being discussed a lot and I’ll be attending Niki’s conference soon to hash out more of these ideas and how to apply them in the classroom. Much of what I have read resonates with my previous teaching, and I’ve written about it in relation to practical work.


What prompted this particular post was a session with Steve Barnes and David Read at the Wessex Group conference a couple of weeks back, and a Twitter chat recently. Both were related to aspects of equilibrium, a concept many students find hard, especially when questions are a bit different from what they have seen before. With the increased demand in the new A level papers, and the increased emphasis on applying knowledge to unknown situations, this seems like a timely issue.

Eric Scerri noted a couple of years back the problems with Le Chatelier – it works sometimes, but can break down quite quickly, and can potentially stop us from thinking too hard about the details of the context. Somewhat like Kristy’s SEABODI, students tend to go for a stock answer to the almost inevitable NH3 or SO3 production questions, but aren’t necessarily thinking deeply about their understanding.

At the Wessex Group conference, Steve used a set of equilibrium questions, drawing on work by Juan Quilez, to highlight these problems. We also discussed the nitrogen dioxide-dinitrogen tetroxide equilibrium. When applying Le Chatelier to this equilibrium, and looking at changing temperature, the expected changes to equilibrium position and hence observations are borne out, as seen in numerous videos. Increasing the pressure by compression is more complex. The expected shift in the position of equilibrium is to the right to decrease the pressure. A not uncommon prediction of the observation would be that the mixture lightens, as NO2 is converted into N2O4. The problem is that as the total volume has decreased, the mixture actually darkens initially as the NO2 becomes more concentrated, then the colour lightens as the NO2 is converted to N2O4. So the shift in equilibrium position may be correctly predicted, but the predicted observation may be wrong (or at least incomplete) because the full system was not considered or the question is not carefully phrased.
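The nitrogen dioxide argument can be checked with a little algebra. For 2 NO2 ⇌ N2O4, Kc = [N2O4]/[NO2]², and nitrogen is conserved, so [NO2] + 2[N2O4] equals the total nitrogen-atom concentration. That gives a quadratic for [NO2], which a short Python sketch can solve. The Kc value and amounts below are illustrative numbers of my own choosing, not real data for this system:

```python
def equilibrium_no2(total_n, volume, kc):
    """Equilibrium [NO2] for 2 NO2 <=> N2O4 (Kc = [N2O4]/[NO2]^2).

    Nitrogen balance: [NO2] + 2*[N2O4] = total_n / volume, so with
    [N2O4] = kc*[NO2]^2 we solve 2*kc*x^2 + x - c = 0 for x = [NO2].
    """
    c = total_n / volume
    return (-1 + (1 + 8 * kc * c) ** 0.5) / (4 * kc)
```

With Kc = 10 and one unit of nitrogen in unit volume, [NO2] at equilibrium is 0.2. Halving the volume instantly doubles it to 0.4 (the mixture darkens); the new equilibrium value is about 0.29 – lower than 0.4 as the equilibrium shifts right, but still above the original 0.2, matching the darken-then-lighten observation.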

I recognise that at times I have come to rely on shortcuts, and when the questions become more complex, the shortcuts can break down. Now, Le Chatelier is a model like any other – we tend to use the simplest model that will allow us to explain the observations. When the model breaks, we use or develop a more sophisticated one. This issue ties in with other conversations recently on what is meant by ‘mastery’ in science – is it mastering the concepts at the appropriate level, or introducing more sophisticated concepts earlier on so we aren’t ‘lying’ to the students? I’m not sure it would be appropriate to introduce GCSE and/or A level models directly in KS3. I may be able to get the students to repeat back the facts, but I doubt they’d be able to use them confidently or competently. Of course, this opens a whole can of worms on assessing understanding – for another time perhaps.

What finally prompted this post was a resource I was reviewing on fuel cells. I have always found electrochemistry one of the harder topics to teach effectively, and tend to take more of a pause before launching into it. I have taught electrolysis plenty at GCSE, but for whatever reason never really gone into galvanic cells and fuel cells in detail. I had developed shortcuts for electrolysis along the lines of ‘it’s the reverse of normal chemical reactions’ and ‘cathodes are negative as cations are attracted to them’. I got myself into a muddle with the hydrogen fuel cell when working out the polarity of the electrodes, not helped by some vaguely written resources. A quick shout out to Twitter set me straight (thanks to Peter Hoare and Adrian Dingle), but it was a useful reminder of the need to check my understanding of the fundamentals from time to time.

Does this have any wider relevance? Certainly for me, going back to teaching after a couple of years out. For others – perhaps. I think it points to the importance of subject knowledge CPD. I’m a strong advocate of pedagogical content knowledge CPD, but spending time deepening my personal understanding of the content knowledge is probably worthwhile from time to time. At this point in my teaching career, I would put depth ahead of breadth. I have sufficient breadth of chemistry to teach my students effectively at the level I’m teaching at, but I think increasing the depth of my understanding as the years pass can only be a good thing. There are some great resources out there. I’m a particular fan of Knockhardy and Chemguide. I also have a copy of Chemistry3 close by – I’ve consigned my other university books to the lab shelf – I think one general undergraduate chemistry text is sufficient for what I need for now.

Any thoughts? When was the last time a student asked you a question that you couldn’t quite answer to your satisfaction? What resources do you use to support your depth of understanding?


Review and reflections on #MICER17

A review and some personal reflections on the MICER 2017 conference.


At the sumptuous RSC Library at Burlington House, we gathered for Methods in Chemistry Education Research 2017, a day of lectures, activities and catching up with friends and colleagues. From school teachers to a Professor Emeritus, we came together with a common purpose – to spend a day thinking about methods in chemical education research.

The day started with Dr Suzanne Fergus (@suzannefergus), Principal Lecturer in Pharmaceutical Chemistry at the University of Hertfordshire (also 2016 RSC Award Winner for Higher Education Teaching). Through the context of her journey into ChemEdRes, Suzanne discussed the difference between anecdote about what works in our own teaching situation and what constitutes genuine research. Critical features included i) contextualisation within the current literature, ii) robust data collection and evaluation, and iii) novelty of work. While replication of others’ work in our own contexts can help increase the generalisability of ideas, the new learning from such replication needs to be made explicit.

We worked through an exercise in formulating a RESEARCH QUESTION, central to ensuring high-quality research, and ultimately to getting our studies published. In my previous teaching of A-level sciences, I have come across research questions in Biology fieldwork, but their use in Chemistry research is not common. The worksheet proved a useful structure to start the challenging process of formulating high-quality, usable research questions. Benefits of starting the research process with the research question include: i) helping connect with the literature; ii) influencing the methods used; iii) focusing the presentation of the work; and iv) focusing the discussion of the conclusions.

One of Suzanne’s papers (DOI: 10.1021/ed2004966) was highlighted as a useful example of how ChemEdRes can be written. The ‘New Directions’ journal was also suggested as a good starting point for those looking to get into academic publishing. Suzanne also suggested other less formal (more ‘social’) ways of publishing to help build confidence in sharing our thoughts with a wider community. These included speaking at TeachMeets and small conferences, engaging in Twitter conversations, writing personal and professional blogs, and writing for institutional publications. On a personal level, Suzanne’s talk gave me that last little push to start a personal blog!

Suzanne’s colleague Dr Stewart Kirton (@skirtonUH), Head of Pharmaceutical Chemistry, University of Hertfordshire, then took us through the use of Likert scales in providing an assessment of the impact of our interventions. While analysis of attainment in assessments is a major source of such information, surveying students’ perceptions is an increasingly used source of information. I used such surveys throughout my time in secondary teaching, and their use is becoming more common at university level with the ‘Teaching Excellence Framework’.

Stewart took us through a process for developing valid questions and Likert-scale responses, including:

  • the importance of trialling the questions with peers
  • trialling with the subjects of the questions (usually your students)
  • ensuring each question examines only one idea
  • avoiding jargon
  • thinking carefully about the possible responses – including ‘Don’t Know’ is acceptable
  • phrasing questions positively (if possible)
  • sticking to around eight questions – with many more than this, the students will likely run out of steam!

Our activity involved drafting some questions to help evaluate a programme run by final-year students to help second-year students prepare for interviews for industry years. A particularly useful online app (www.mentimeter.com) was used to share our ideas – a virtual notice board where you can send in your responses via smartphone or laptop.

Stewart finished with a clear exhortation NOT to take the average of responses when using numerical responses on Likert scales (e.g. 1 = strongly agree to 5 = strongly disagree). Simply put, these numbers are not interval data, where the differences between successive values are identical and meaningful; rather they are ordinal data, i.e. they can be ordered but the differences between them are meaningless. Stewart’s suggestion was to present the relative proportion of each response to each question and analyse pre- and post-intervention where appropriate.
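That suggestion is easy to make concrete. A minimal Python sketch, assuming responses coded 1–5 with the usual agree/disagree labels (my own illustration, not anything Stewart presented):

```python
from collections import Counter

LABELS = ["Strongly agree", "Agree", "Neutral", "Disagree", "Strongly disagree"]

def response_proportions(responses):
    """Summarise Likert responses (coded 1-5) as the proportion of
    respondents giving each answer, rather than a meaningless mean."""
    counts = Counter(responses)
    total = len(responses)
    return {LABELS[i - 1]: counts.get(i, 0) / total for i in range(1, 6)}
```

For example, `response_proportions([1, 1, 2, 5])` reports 50% ‘Strongly agree’, 25% ‘Agree’ and 25% ‘Strongly disagree’ – a profile you can compare pre- and post-intervention, which an average of 2.25 would obscure.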

After coffee, and meeting up with some friends I have made on Twitter over the last year, Dr Orla Kelly (@orlakelly5), Senior Lecturer in Social, Environmental and Science Education, Dublin City University, discussed the evaluation of classroom practice, with a focus on ‘Classroom Action Research’. Orla started with a definition from the Open University: ‘systematic and collaborative collection of evidence on which to base reflection’. She summarised the cycle of action research as ‘Plan / Act / Observe / Reflect’. Orla’s extensive use of Problem-Based Learning in undergraduate labs provided a context for the talk, and had strong resonances with Suzanne’s earlier talk.

Prof Graham Scott (@grahamscott14), Professor of Bioscience Education at the University of Hull, then expanded the speaker repertoire beyond chemists, bringing a biologist’s perspective from a closely related field. Graham’s key message was the advantage of moving away from our ‘science comfort blanket’ and embracing the discomfiture of collecting and using the more qualitative data derived from interviews. Graham took us through his research journey of using interviews in a variety of areas, from an analysis of students’ and teachers’ perceptions of a course, to the barriers to using biological fieldwork in primary schools.

Key ideas in making effective use of interviews included i) establishing a suitable dynamic between the interviewer and interviewee (location, time available, consideration of any prior professional relationship); ii) clearly constructed questions that will elicit the information required (including the use of trialling) and iii) the importance of audio/video recording and the processes of transcribing and analysing the data.

I have used interviews in previous research and as part of evaluating the effectiveness of my previous school departments. While I experienced many of the problems that Graham described, I wholeheartedly agree with Graham that the quality of information you can derive makes them well worth the effort.

After a much-needed lunch (energy levels were flagging by 1.10pm!) we had time to chat and look at posters. This was a nice aspect of the conference: the posters had been shared online a good month before the get-together, so we could get feedback on our ideas in advance. My particular interest right now is in how microscale chemistry can be integrated into my teaching, and whether it has a sustained benefit for students’ learning.



Prof Keith Taber (@DrKeithSTaber) took us on a tour of ethics in educational research. It has been many years since my MEd days with Keith as my supervisor, but his erudite and rigorous style continues to shine through and it was a pleasure to be part of the audience.

Starting with a brief tour of various ethical frameworks, including deontology and utilitarianism, we discussed the importance of voluntary informed consent from the subjects of our research, and the responsibilities we bear as researchers. These include reporting our findings as completely and fairly as possible, avoiding selective reporting, and highlighting the known limitations. We discussed the issues around anonymity and confidentiality, and the particular problems that easy access to worldwide information via the internet can pose. We discussed the particular cases of the Milgram studies and the Tuskegee Syphilis experiment, highlighting areas of real contention in the ethics of research.

The key message from this double session was that while the rules of ethics can be relatively easily stated, the actual decisions we have to make as researchers can be very nuanced and rightly deserve careful consideration before, during and after our studies.

The day finished with Prof Georgios Tsaparlis, Professor Emeritus of Science Education in the Department of Chemistry at the University of Ioannina, Greece, and winner of the 2016 RSC Education Award. Georgios’ work on problem-solving has spanned decades, and a whistle-stop tour of it was presented in this final session. The limitations of our cognitive architecture, with a particular focus on working memory, were discussed. The importance of scaffolding and practice, as well as success for students, in developing problem-solving skill was clearly emphasised.

MICER17 proved to be all I had been looking forward to, and a great venue to meet new people, make connections and expand my professional network. Most of all, it has helped put a human face on the world of ChemEdRes. Reading articles in CERP or J Chem Educ can be a little daunting to those new to ChemEdRes, and the barrier to entry can seem impossibly high. However, it really is an inclusive and welcoming community, one that I look forward to contributing to in the coming years. Many thanks to Michael Seery (@seerymk) and Claire McDonnell (@clairemcdonndit) for all the hard work in bringing this together.