Author Archives: Austin Booth

About Austin Booth

I'm a Physics and Chemistry teacher, passionate about using assessment to drive improvement for my students. I also like running and cake!

researchED Rugby 2017

Here are the slides from my researchED Rugby talk – click here to download or view below.

In the questions there was a good comment from Chris White (@onechriswhite) about an easy way of getting the QLA marks data off the paper and into a spreadsheet. He doesn’t mark on the scripts at all and only records marks on a spreadsheet as he’s marking. Great idea!

I’ll also be writing a series of blog articles from the talk, so watch out for those coming soon!

Teaching pupils how to revise

For some time now I’ve been interested in how findings from cognitive science can be used to improve teaching and learning.  Whilst this has shaped what I currently do in the classroom, until now I’ve never shared the findings with my pupils, mainly because I’ve been unsure how to communicate them simply.

Then last year I discovered the Learning Scientists.  They produce lots of free materials (from videos to bookmarks) that turn the latest cognitive science into simple explanations of how to study effectively.  So after asking one of my year 11 classes if anyone had ever taught them how to revise for exams (to which there was a resounding ‘no!’) I decided to use the Learning Scientists resources to do this.

If you look at the Learning Scientists website you’ll see they talk about 6 different study strategies.  I didn’t want to overwhelm my pupils by covering all 6, so I decided to focus on what I thought were the two most important: retrieval practice (aka quizzing) and spaced practice.  I also wanted to give my pupils some activities to put the strategies into practice.

To help my pupils see the effectiveness of retrieval practice I decided to give them a short quiz on a Chemistry topic they had all done badly on in the last mock: ethanol production.  They did the quiz, followed by some retrieval practice exercises, before trying the quiz again.

All students could answer more questions, and in more detail, after the retrieval practice exercises.  Most importantly, all pupils enjoyed the activities and felt that the strategy had been effective in helping them to learn.

Now I know there are problems with giving the students the same questions twice: the fact that they could answer more the second time round doesn’t prove they learned more, or that they won’t forget the information.  However, I wasn’t doing a research study here, just trying to get my pupils to buy into the technique so that they will hopefully use it outside of lessons.

The retrieval practice exercises were very straightforward.  I let the pupils choose whether they wanted to quiz themselves (using the cover-write-check method or flashcards they had made) or each other (either with flashcards they had made or just by making questions up using the information sheet).

I’m a big fan of flashcards, and they certainly helped me throughout my degree course.  Interestingly at my school there is a big push for pupils to create ‘revision cards’, which are short content summaries.  So I asked my pupils if they made or used them.  Almost everyone put their hand up.  Then when I asked them how they used them, almost every pupil said they read them.

I (briefly!) talked about how to use flashcards: they are much more effective when you create them with a question on one side and the answer on the other, and actually try to answer the question fully before looking at the answer.  I also described how they could sort their cards into three piles (cards they could answer fully, cards they could partially answer and cards they couldn’t answer) and quiz themselves more on the last two piles than the first.  I stressed that they still needed to quiz themselves with cards they could previously answer, so they didn’t forget the information, just not as often as the others.
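As an aside, if you like to think about it systematically, the three-pile method is really just a weighted quizzing loop: you draw more often from the piles you know least well. Here’s a minimal sketch in Python to make the idea concrete (the cards and weights are entirely made up, purely for illustration):

```python
import random

# Three piles: cards the pupil can answer fully, partially, or not at all.
# Hypothetical cards for illustration.
piles = {
    "full": [("What is the formula for ethanol?", "C2H5OH")],
    "partial": [("Name two ways to produce ethanol.", "Fermentation; hydration of ethene")],
    "none": [("What conditions are needed for fermentation?", "Yeast, ~30 degrees C, no oxygen")],
}

# Quiz the 'partial' and 'none' piles more often than the 'full' pile,
# but never drop the 'full' pile entirely (so answers aren't forgotten).
weights = {"full": 1, "partial": 3, "none": 5}

def draw_card(piles, weights):
    """Pick a pile in proportion to its weight, then a random card from it."""
    names = [name for name in piles if piles[name]]
    pile = random.choices(names, weights=[weights[n] for n in names])[0]
    return pile, random.choice(piles[pile])

for _ in range(3):
    pile, (question, answer) = draw_card(piles, weights)
    print(f"[{pile}] {question} -> {answer}")
```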

Going forward I want to try to incorporate retrieval practice into my lessons regularly.  Making flashcards and quizzing each other could be a good plenary activity.  And if done regularly, when it comes to revision time pupils will already have made their flashcards with which to quiz themselves.

It’s important to note that one or two pupils in the classes I did this with didn’t like it.  One girl said to me ‘people learn in different ways and this won’t work for me’.  This resistance could be because creating questions from information (either to ask verbally or for flashcards) is harder than just copying information.  However, it could also be that some pupils believe they have a particular learning style.

I think the best response to this is that quizzing can be done in different ways: on your own or with others; using the cover-write-check method or flashcards; answering questions (verbally or in writing) from the revision guide or past papers; redrawing mind maps and diagrams from memory; etc.  So once pupils have tested out all the different quizzing methods (and given them a good go) they can use the method which best suits them.

Below are links to the files I used for the lesson.  Feel free to download and use.

Files:

Teaching calculation skills better and more creatively

In GCSE Science students often struggle to remember how to do calculations properly (when to convert units, rearrange equations, use principles such as conservation of momentum, etc).  When I have given students several calculations to do (after going through a worked example) I have often ended up working harder than the students, going around helping a lot of them while others wait for help.  And this is with a culture of students trying all the questions before asking for help, and asking the other students on their table first. This has made me reflect on how I can make sure students get the feedback and guidance they need, just at the time they need it, without requiring me to deliver it personally. Below are two strategies I’ve used to try to do this.

My first strategy was to give the students the answers to the questions, but in the wrong order. This allowed students to check their answers and get immediate feedback on whether they were right. However, if students went wrong, they now knew they were wrong but not why. To help students identify their own mistakes I next got them to check their work against these common mistakes:

  • Did you copy down the equation or any of the numbers wrong?
  • Did you write down the rearranged equation? Write it out and then underneath replace each symbol with the number it represents.
  • Did you rearrange the equation properly? Check with your table partner.
  • Have you converted units? E.g. km to m, kJ to J, g to kg
  • Did you make a mistake when pressing the buttons on the calculator? Try it again. [I call this ‘fat finger syndrome’!]

This helped and cut the number of questions from students by more than half.
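To make the unit-conversion point concrete, here’s a tiny worked example of the kind of slip the checklist catches (the numbers are made up):

```python
# Worked example (made-up numbers): speed = distance / time.
# The distance is given in km, so it must be converted to metres first.
distance_km = 1.5
time_s = 60.0

distance_m = distance_km * 1000   # km -> m (the checklist step)
speed = distance_m / time_s       # speed in m/s

print(f"speed = {speed} m/s")     # 25.0 m/s; forgetting the conversion gives 0.025!
```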

Now this is fine for fairly simple calculations (such as using speed = distance/time), but not for multi-step calculations, e.g. conservation-of-momentum questions.  For these types of questions students need to have a ‘strategy’ and know where they’re headed. In my experience this is difficult for students even when they are fine with all the individual steps – it’s knowing how to put them together that they often struggle with.

At the time I was thinking about this, I discovered this video by Dan Meyer about making maths lessons more engaging. In the video Dan quoted Daniel Willingham: “Sometimes I think that we, as teachers, are so eager to get to the answers that we do not devote sufficient time to developing the question”. He goes on to discuss video games and argues that they are engaging because they allow choice in how tasks and puzzles are completed and solved, and there is a low cost of failure and error (you can re-try something without having to start the level over again, etc) [he discusses other reasons as well, which are beyond the scope of this post]. Also worth watching is this earlier Dan Meyer video where he discusses some of these ideas to create three-act maths problems.

I applied these ideas when I next taught conservation of momentum (strategy #2). Instead of opening with a video (as Dan Meyer does) I used a Newton’s cradle and discussed with the class what was happening, drawing out that momentum was conserved. I then asked the class what information we would need to find the velocity of the last ball after the first one has hit, allowing students to discuss this in their groups first. After this I gave them the information they had suggested (such as the masses of the balls and the initial velocity) and also gave them the final answer they were aiming for, but rounded to the nearest whole number, asking them to work with their partner to try to work out the velocity to two decimal places. The students responded well to this: they could instantly tell if their answers were right or wrong, and felt that if they got a wrong answer they could simply try again and adjust their method, allowing me to circulate and listen to their conversations.
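For reference, the calculation itself is just conservation of momentum: total momentum before the collision equals total momentum after. Here’s a minimal sketch with hypothetical numbers (these are not the actual figures from my lesson):

```python
# Conservation of momentum: momentum before = momentum after.
# Hypothetical Newton's cradle figures: the first ball (mass m1, velocity u1)
# stops on impact and the last ball (mass m2) moves off with velocity v2.
m1 = 0.110  # kg (made up)
u1 = 0.85   # m/s (made up)
m2 = 0.105  # kg (made up)

momentum_before = m1 * u1     # kg m/s
v2 = momentum_before / m2     # rearranged: v2 = (m1 * u1) / m2

print(f"v2 = {v2:.2f} m/s")   # to two decimal places, as the students were asked
```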

Through this process most students were able to calculate the correct velocity. After getting a couple of students to demonstrate their solutions, I then got each student to write a method as a set of steps for how to do the calculation. A lot of the students simply stated the numbers in their methods, so after they had written it I got them to replace the numbers with what they represented (e.g. momentum before collision, velocity after the collision, etc).

The students then applied their method to a simplified problem involving cars, before moving onto a past paper question. This worked well and at the end of the (double) lesson the students had applied their methods to answering the past paper question correctly.

Both of these strategies were successful in that they met my objective of reducing the amount of reacting I was doing in the lesson, freeing me up to circulate and assess how the students were getting on. With the second strategy the students seemed to enjoy producing their own method, and they definitely did more thinking, so they should remember more.

Update: Since doing this I came across this excellent idea of having all the calculations projected on the board; if someone is stuck on a problem they write their name on a post-it and stick it on that question on the board, allowing you to see who’s stuck on what and whether you need to go over something with an individual, a group or the whole class. I’ll definitely be trying it out!

 

How can I simply implement cognitive science findings to improve learning?

Over the last year I’ve read three books which have made me think deeply about cognitive science and how to improve learning: (i) The Hidden Lives of Learners, (ii) Make it Stick, and (iii) Why Don’t Students Like School? Two very good summaries of the principles in these books can be found here and here.

For me, the most important principles from these books are:

  1. Students should meet any concept at least three different times for it to pass into long-term memory.
  2. Instruction and practice of a concept should be interleaved with a different (but not totally unrelated) concept. So instead of teaching topic 1 followed by topic 2, teach them ‘together’ (i.e. teach one or two lessons on the first topic then switch to topic 2, etc). This point also suggests that concepts that are very similar (e.g. mean, mode and median or conduction, convection and radiation) should not be taught together.
  3. Distributed (or spaced) practice – instead of practicing a skill during a single large block of time (massed practice, aka cramming) distribute the time spent practicing over a longer duration.
  4. The testing effect – students learn more from regular low-stakes testing, whether that’s self-testing (using flashcards, etc), being tested by peers or testing in the classroom. So with a ‘study, test, test, final test’ pattern students learn more than with a ‘study, study, study, final test’ pattern.

All of this is great, but a little overwhelming. How can I implement these suggestions simply, without re-planning the entire curriculum or adding excessively to my already substantial workload? Below are some suggestions based on what I’ve been trying in my classroom recently.

  • Using starters to review knowledge and skills from previous lessons, with students answering a series of questions, drawing and labelling a diagram, or doing some other activity. Often I’ll ask students to answer questions from topics we studied not just last lesson but over the last few months, to make sure they can remember the key content. If students can’t remember how to do something they can ask the other pupils on their table for help. This combines low-stakes testing with distributed practice.
  • If consecutive lessons in a unit will teach cognitively similar concepts (e.g. conduction, convection and radiation) then mix up the order of the lessons so these are separated by another not so similar concept.
  • Use quizzing regularly (once a week or every two weeks). This is a longer set of questions than I’d use in a starter (approx. 20-25 instead of 4-6). I almost always use multiple choice questions. This is ideal to set as homework and can easily be marked electronically using systems such as Google Forms with Flubaroo, or Quick Key (a minimal sketch of this kind of automatic marking follows this list). Not only does this make use of the testing effect but it also allows you to assess students’ learning against individual concepts. Again, I ask questions not just from recent lessons but from two or more months ago to aid retention of key content.
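For anyone curious what the electronic marking amounts to, here’s a minimal sketch of the general idea behind tools like Flubaroo (this is not their actual code, and the answer key and responses are invented):

```python
# Score each student's multiple choice answers against a key, then work out
# how many students got each question right (so you can spot weak concepts).
# Illustrative data only.
answer_key = {"Q1": "B", "Q2": "D", "Q3": "A"}

responses = {
    "Student 1": {"Q1": "B", "Q2": "C", "Q3": "A"},
    "Student 2": {"Q1": "B", "Q2": "D", "Q3": "C"},
}

# Per-student scores.
for student, answers in responses.items():
    score = sum(answers.get(q) == correct for q, correct in answer_key.items())
    print(f"{student}: {score}/{len(answer_key)}")

# Per-question facility: the fraction of students answering each question correctly.
for q, correct in answer_key.items():
    right = sum(r.get(q) == correct for r in responses.values())
    print(f"{q}: {right}/{len(responses)} correct")
```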

Sources of multiple choice questions

As you can see above, I’m advocating using multiple choice questions. But where can you get good-quality questions from? Below are the sources I’ve used.

  1. It might sound obvious, but search Google for your topic and add ‘multiple choice questions’. So far I’ve been able to find some questions for every topic I’ve searched for on the net.
  2. The tests on the BBC Bitesize website.
  3. Make your own – I’d recommend that for every lesson you write 2 or 3 questions assessing the learning outcomes, which you can then use later on. See this blog post for some guidance on writing good multiple choice questions.
  4. Get students to create their own (e.g. during plenaries, revision sessions or for homework). Students can be supported with a questioning grid. Something I plan to introduce is Peerwise, which allows students to compete with each other in writing multiple choice questions.

If you teach maths you can use the Diagnostic Questions (DQ) website, created by Craig Barton and his team. It’s a free tool that allows teachers to share and download multiple choice questions and allows students to answer them online. Recently it’s been opened up to allow questions for other subjects to be created, and at the time of writing there are a few hundred Science and MFL questions.

A really nice feature of the DQ website is that when students answer questions on the website they can add a sentence or two to explain their reasoning, and you as the teacher can review that. Students can also look at other students’ explanations for correctly answered questions.

However one disadvantage of the DQ website is that questions are created, stored and downloaded as images only. So if you want to export the questions to a Word document, Google Form or other program, you have to type them out. This is a real shame because you can’t easily edit or adapt a question. Multiple choice questions can be modified in several ways, including turning them into two-tier questions and confidence grids, so the inability to do this easily is very disappointing. A good York Science publication about this can be found at the following link – Developing formative assessment in practice.

Improving Student Performance – Interventions

This well-overdue blog post is about the hard part of formative assessment – using the information I’d gained about each student to intervene in order to improve their performance.

Before I delve into the details of the different strategies I employed, I want to outline a few of my own opinions on intervention. An intervention should never be used, or thought of, as a substitute for quality teaching during the regular school day. And it’s important that pupils and parents never form this opinion (e.g. “it’s ok to take Johnny out of class to go on holiday because there’ll be a revision session later on”). That’s why I’ve always refused to do after-school revision sessions close to the exams, which only promote an exam ‘cramming’ culture rather than one of hard work over time (i.e. a growth mindset).  That said, interventions can be a powerful tool, as long as they are used throughout the year and not just near exams. I also believe that it’s up to every individual teacher to intervene with their classes on a regular basis in order to help students improve their knowledge and understanding, and become more reflective about their learning. This year my department has introduced the idea of dedicating one lesson every two weeks to doing just that, and so far it’s been very positive.

Ok, so onto the intervention strategies. I tried 6:

1. Starters on different topics – while I was teaching the next unit, I used starters based on the topics that most of the class had performed badly in. However, it was hard to keep them to 10-15 minutes, and some took over the entire lesson (such as when I tried to go over standard index form in 10 minutes!). I think my mistake was trying to teach a concept in this time rather than practise one, so this approach is probably best for ‘amber’ concepts, where most students already have some understanding of the topic.

2. Homeworks on topics – generally involving using BBC Bitesize and online videos (e.g. my-gcsescience.com). I had some success with this, but it was a battle getting students to complete it (I had to call a lot of parents). So this was quite time-intensive.

3. Some 1-2-1 after school sessions for a couple of students with low confidence in the subject. This was very successful, but again time-intensive.

4. I’d recently bought ‘Talk-Less Teaching’ and used ‘Teach me, tell me (and then tell me more!)’ on p156. This strategy involves students questioning each other using cards with two different questions on them – one knowledge-based, the other application-based. I got the students to write the questions for a random topic (picked out of a ‘hat’). They did ok in writing their own questions, although some needed quite a lot of support. However the students really enjoyed questioning and explaining the answers to each other.

Since then I’ve found that this is also a Kagan structure (Quiz-Quiz-Trade), and for Science teachers there are question cards already prepared on Daria Kohls’ excellent blog (listed as ‘Quiz, Quiz, Trade’ cards – these have one question per card and not two as in ‘Talk-Less Teaching’).

5. Multiple choice quiz for the whole topic – I used the questions from the BBC Bitesize website for the topic along with a few of my own. After completing it, students collected a mark scheme to mark it, and then filled in an analysis sheet to record their mistakes and the reasons for them (e.g. ‘didn’t understand the question’, ‘didn’t know the science’, etc). Following this they then created a revision sheet for what they needed to revise.

What surprised me about this was how the students responded – they were completely focused for the whole lesson and really seemed to enjoy it. I think this may be because (a) what they were doing was completely personalised to them, and (b) they received instant feedback on the test by marking it themselves, and then used that to determine their next step (so, in a sense, it was gamified). Hopefully this also motivated them to use the revision sheet they created to revise at home for the second test.

6. For the second test, I told the students that instead of them getting their results individually, I would publish them colour coded and in rank order on my classroom door window, so the whole class, and everybody else that passed by, could see them.  This visibly got the attention of most of the students! Hopefully it gave them a little extra incentive to revise for the test. That said, I only did this because they are a higher ability group and want to do well. I’d think very carefully about using it with a lower ability class where there may be a certain amount of kudos in underperforming.

Out of all the strategies, I’ll definitely be using the ‘quiz-trade’ card activity (#4) as well as my multiple choice test and mark activity (#5). I’ll also continue to use the starters to revisit old topics (#1), but only those that students have some understanding of and require more practice.

This post is by no means comprehensive in its review of intervention/revision strategies. What do you do? What have you found particularly successful (or not!)? Leave a comment below.

#YorkTU 2014

Yesterday I attended the #YorkTU Science TeachMeet.  It was a great event with a wide range of presentations and discussions, one of the best CPD events I’ve ever attended!  Embedded below is the presentation I gave about the work I’ve recently done on analysing assessments.

A great reflective blog post about the whole day by Helen Rogerson can be found here.

After the event we went to the pub and even won the pub quiz!


Making Better Use of Assessments – Part 2

A few months ago I wrote here about how I’d analysed the results from a year 10 class test with a spreadsheet to give me a breakdown of every student’s performance against every specification objective/concept.

Since then I’ve been experimenting with how I could use the data to improve my students’ performance.  In this blog post I’ll give details of the students’ performance after those interventions (I’ll write another blog post about the specific intervention approaches I tried).

Interventions complete, I tested the students again with the same test as before (so I could use the same spreadsheet).  It had been approximately 3 months since they’d last seen the test, so it was highly unlikely that they’d remember much about the questions.

Below are two screenshots of the spreadsheet breakdown of the test results for every spec objective, before and after interventions (the student numbers are not the same as in the previous blog post as some students were not present for the second test, so have been removed).  The RAG thresholds used were: 0-39% red, 40-69% amber, 70-100% green.

Breakdown of test 1 against objectives (before interventions).

Breakdown of test 2 against objectives (after interventions).


It can clearly be seen that all students have made many improvements.  The screenshot for test 2, as well as showing the individual strengths and weaknesses of the students, shows that most students still struggle with certain areas, mainly to do with red-shift and the universe (although some improvements have been made there too).
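If you want to reproduce this kind of breakdown outside a spreadsheet, the RAG mapping is straightforward. Here’s a minimal sketch using the thresholds above (the objectives and marks are invented; my actual analysis was all done in the spreadsheet):

```python
def rag(percentage):
    """Map a percentage score to a RAG rating (0-39% red, 40-69% amber, 70-100% green)."""
    if percentage < 40:
        return "red"
    if percentage < 70:
        return "amber"
    return "green"

# Hypothetical marks for one student: objective -> (marks gained, marks available).
marks = {
    "use the wave equation": (2, 6),
    "describe red-shift": (1, 4),
    "explain reflection": (3, 4),
}

for objective, (gained, available) in marks.items():
    pct = 100 * gained / available
    print(f"{objective}: {pct:.0f}% -> {rag(pct)}")
```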

Previously I’d identified the ‘use of the wave equation’ objective as a priority for intervention, as the first test had shown that few students could use the equation, and those who could had trouble when given data in standard index form.  This objective increased overall from 21% in test 1 to 51% in test 2.  Interrogation of the marks showed that most students could now use the wave equation correctly, but they still struggled with numbers in standard index form.  So this is something that requires ongoing work.

To quantify the improvement, the number of red, amber and green objectives was totalled for both tests and the percentage change calculated (shown below).

Percentage changes in number of red, amber and green objectives following interventions.

The rows highlighted in orange are for those students who joined my class from another just before I administered the first test but hadn’t been taught the Waves topic.

These numbers show that on average the entire class increased their proficiency in a third of the objectives.  The students who had not been taught the topic previously (and so had only been exposed to the material during the interventions) unsurprisingly showed a bigger increase – 52% on average.  The students who had been taught the topic previously increased by just under a third.  However some individual students made much bigger gains (83% and 50%).  This gives me valuable evidence of not only what I have done with the students, but also of how individual students have responded to the interventions, and so how they are likely to respond to interventions in the future.
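In case it’s useful, here’s how the change figures could be computed. I should stress this sketch is my reconstruction: it assumes the figure is the percentage of objectives whose RAG rating went up between the two tests, and all the data is invented:

```python
# Assumption: the 'percentage change' is the share of objectives whose RAG
# rating improved between test 1 and test 2. Illustrative data only.
RAG_ORDER = {"red": 0, "amber": 1, "green": 2}

def percent_improved(before, after):
    """Percentage of objectives whose RAG rating rose between the two tests."""
    improved = sum(RAG_ORDER[after[obj]] > RAG_ORDER[before[obj]] for obj in before)
    return 100 * improved / len(before)

before = {"wave equation": "red", "red-shift": "red", "reflection": "amber"}
after  = {"wave equation": "amber", "red-shift": "red", "reflection": "green"}

print(f"{percent_improved(before, after):.0f}% of objectives improved")  # 67%
```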

My spreadsheet also calculated the percentage scores for multiple choice and non-multiple choice questions, as well as higher and foundation tier questions, and the overall percentage for the entire test.  These numbers are shown below for both tests (again the students not taught the topic prior to test 1 are highlighted).


Overall figures for different question types and the whole test.

From these figures I calculated the percentage changes between tests, which are shown below.


Percentage change in the overall figures for different question types and the whole test.

This shows that students improved in all areas of the test, but the increase was smaller for the higher tier questions, so this is an area in need of further work.

From this exercise I’ve learned not only the individual strengths and weaknesses of students, but also which parts of the topic students struggle with most (and so require further work), and how individual students respond to interventions.  It also gives me clear evidence of what I’ve done in the classroom and the results of my interventions.  In my next blog post I’ll write about the intervention strategies I tried and how successful they were.