They didn’t (always) behave for me

A conversation with one of my younger tutees this week reminded me just how toxic classroom disruption can be. While rueing his poor performance in a recent test, the boy expressed real frustration about the situation in his Latin class. “Some kids just see it as their job to mess around,” he said. He even reported that the situation had brought his teacher to tears in the past.

Even at its most minor level, any form of classroom disruption is an issue for all learners. Children who may be struggling with the material go unsupported because their teacher’s attention is taken by the disruptors in the class (who may, of course, be struggling themselves). Schools which have not yet faced up to the inescapable fact that impeccable behaviour is the central, non-negotiable foundation on which all teaching and learning is built will continue to let young learners down.

One of the things I find most puzzling about the teaching profession is that we cannot seem to agree on how to manage behaviour. Debates continue to rage about schools which set the bar high, with cries from numerous educators claiming that vulnerable students and/or SEND students cannot handle such a high bar and that clear boundaries such as the use of SLANT in classrooms and the insistence on silent corridors are oppressive and stifling. I find this baffling, not to mention an insult to the children with those needs. As someone who has worked in schools rated Good or Outstanding for behaviour, I can tell you that there were times when I was frightened in the corridors. There were times when I felt pushed around and intimidated by some students. There were times when I felt humiliated. What this all translates to in schools with behaviour that ends up being classified below Good I cannot even begin to imagine. Moreover, if I as a middle-aged adult felt like this in the school corridor, how did our most vulnerable students feel?

A recent survey on Teacher Tapp, a daily survey app for classroom teachers, highlighted the ever-increasing use of ear defenders by some students in our schools. As I pointed out in response to the discussion, I find their necessity deeply depressing. How did we get to the point where we simply accept that some school environments are too noisy and overwhelming for some of our students? Like that’s ok? And like noisy, boisterous environments aren’t actually a negative for all learners? How on earth did we end up in a situation in which the kind of equipment required by men on building sites using machines to break up concrete becomes a necessity to protect our students from the environment in our schools?

Let me tell you about one of my own experiences in the classroom. I was sent to an expensive girls’ boarding school (although I didn’t board, I was one of a small percentage of day pupils). In Year 9 (or the Upper Fourth, as it was called, would you believe) I was part of a Classical Civilisation class run by a young female teacher whom I shall call Miss Jones. Poor Miss Jones was a sweet, kind and well-meaning woman, who no doubt went into teaching because she cared about her subject and wanted to share it with the world. I suspect she had no training, because in a private school in the 1980s teacher training was considered very much optional and barely even desirable. The school was tiny, consisting of 400 girls in total, and had a pretty strict regime – for example, silent corridors. The Head was terrifying – genuinely so. But poor Miss Jones, with her reticent nature, her lack of training and her lack of experience, had no control over our class. One girl was particularly disruptive. I shall call her Millie. Millie was taller and looked older than most of us. She terrified many of us and was a merciless bully to some. That included Miss Jones. Millie refused to cooperate with the class, to the extent that she would not sit where she was told, she would not participate in the class in any way, she would not even unpack her bag. She would lay her head on the desk in a flagrant show of disdain. Miss Jones’s methodology was to ignore her and try to teach around her, but behaviour in general was so poor that we all learnt very little. She never received any support or help with the situation and did not last long in the job.

I share this to illustrate the fact that issues with poor behaviour occur in all schools. Another recent survey from Teacher Tapp, carried out just this week, indicates that student behaviour, alongside workload, is now the overwhelming reason why teachers are leaving the profession in their thousands. There is much talk about “challenging” schools and understandably so, because getting behaviour right in such places has very real safeguarding implications, as explained in this brilliant blog post which I have cited many times before. Yet I would like to highlight the fact that behaviour that is disruptive enough to impact on teaching and learning goes on everywhere – in schools rated Good or Outstanding, in grammar schools and in private schools. Some of what I hear from my tutees would not be out of place in a chapter of William Golding’s Lord of the Flies – and these are the sorts of schools with Latin on their timetable.

While I do not wish to promote panic or cause any pearl-clutching, I do believe that disruptive behaviour in our schools is an issue that nobody wants to face up to. Nobody – whether they are a parent or a teacher – wants to believe that our children’s education is being hampered by disruption in the classroom. It is hard for all of us to accept. While I was writing this blog post, a memory from close to a decade ago came back to me with a jolt. It is a comment made by a boy in one of my past Forms, a boy who was one of the most disruptive members of the class (and indeed the school). “Your PSHE lessons are like watching a YouTube video with crap internet, Miss: you keep buffering.” I recall being somewhat nonplussed by this rude remark, one which was called out across the class and interrupted the flow of the lesson in exactly the way he was describing. Out of the mouths of our not-so-innocent babes can come the real truth: my ability to share information was being constantly put on pause, meaning that the flow of explanation was consistently and endlessly interrupted. This was painfully obvious, even to the members of the class who were causing most of the interruptions, a fact to which we should perhaps give some thought. I remember being further stunned when an out-of-control student expressed his desire to join the army; as I picked my jaw up off the floor and used it to point out to him that he would have to behave in the army, he said, “Yeah. That’s the point.” I’ve never forgotten the fact that he knew he needed more discipline than we were providing for him. We let him down. Badly.

So, back to my tutee, who was complaining about the behaviour in his Latin class. He described exactly the kind of intermittent “buffering” that the lovely Liam pointed out to me a decade ago, so it sounded all too familiar, but this week it really hit me just how truly appalling the situation is for so many young learners and just how many of them have come to accept it as part of their school experience. “Just as I think I’m starting to get something,” he said, “the teacher has to stop and then I’ve lost it all over again.” That’s when my heart broke a little.

It’s hard to know who needs to hear this but I suspect it’s all of us: classroom teachers, parents and senior leaders all need to face up to the problem for what it is and reassert our right and our responsibility to be the adults in the room. Disruption – low-level or otherwise – is kryptonite to every child’s understanding and progress. To ignore this is to let all of our children down.

Image generated by AI

Nobody said it would be this hard

Why does Latin have the reputation of being so difficult? Everybody thinks that it’s difficult and to some extent it is – but so is any language, once you get past, “Bonjour, je m’appelle Emma”.

Grammar is tricky and it’s still not taught in our own language to the degree that it is in most other countries. To listen to educators, writers and commentators report on the increased level of rigour in the teaching of literacy in primary schools, you’d think that the problem was solved. In truth, the level to which grammar is taught discretely in English schools is still woeful by comparison with schools in other countries. To a certain extent, this is a self-perpetuating problem caused by failures in the system over the last couple of generations. Many current teachers admit that they struggle to teach concepts that they themselves were never taught in school, and if I had £1 for every English teacher who has come to me for help with basic English grammar, I’d have enough for a slap-up meal.

Let’s take a closer look at why some children struggle so much with Latin over and above their other subjects and – specifically – more than any other language they might be learning in school. One obvious reason, I think, is the unfamiliar territory which this dead language presents to family and friends. Many parents and guardians feel able to offer support to their children in other subjects, certainly in the early years. I work with many families who are really involved with their children’s homework and study and children certainly do benefit from this kind of proactive and interested support at home. Lots of families employ me because they care about their children’s studies but they themselves feel ill-equipped to support them in Latin due to their own lack of knowledge; with only around 2.5% of state schools currently offering Latin on their timetable, I don’t anticipate that situation changing in a hurry. As a result of the fact that so few people have experience of Latin as a subject, it maintains a kind of mystique, and that all feeds into its reputation as an inaccessible and challenging subject.

Furthermore, and at the risk of stating the obvious, Latin is an ancient language and a dead one. What does it mean to say that a language is dead? Quite simply, that nobody speaks it any more. As a result, the content of what children are asked to translate will often seem very obscure. The ancient world was very different from ours and much of what went on – even in the most mundane aspects of daily life – can seem unfamiliar or even bizarre. Add to this the fact that a lot of the time students will be looking at stories from ancient myths or founding legends and we’re then into a whole new world of weirdness. The thing is, children generally like the weirdness – and indeed the darkness – of these ancient tales; if you think that children don’t appreciate the darkness of the world then explain the thundering success of a children’s author such as Patrick Ness. Children are not necessarily put off by the puzzling nature of what they are translating, but it can certainly contribute to their belief that the material is obscure.

The realities of learning an ancient language compared to a modern one are summed up by this absolutely hilarious snippet which has been doing the rounds on the internet for donkey’s years:

So, we’ve dealt with Latin’s reputation and we’ve explored the fact that it is an ancient, dead language, which can make it difficult to access. On top of that lies the truth that Latin as a language is very different from our own and indeed from any others we are likely to be taught in UK schools.

The most important thing to understand is that Latin is a heavily inflected language. What that means is that word-formation matters: we’re not just talking about spelling here, because if you look at a word that is wrongly spelled in English, you will still more than likely be able to recognise it in context and thus understand the sentence. However, in inflected languages, words are modified to express different grammatical categories such as tense, voice, number, gender and mood. The inflection of verbs is called conjugation and this will be familiar to students of all languages, but in Latin (and in other heavily-inflected languages such as German) nouns are inflected too (as are adjectives, participles, pronouns and some numerals). So, words change and therefore become difficult to recognise. What blows students’ minds most in my experience is how this inflection translates into English and how the rendering of that translation can be confusing. For example, ad feminam in Latin means “to the woman” in the sense of “towards the woman”, so I might use the phrase in a sentence such as “the boy ran over to the woman”. However, as well as ad feminam, the word feminae, with that different ending and no preposition, can also mean “to the woman”, but this time in the sense of “giving something to”. I would therefore use feminae in a sentence such as “I gave a gift to the woman”. Using ad feminam in that context would be completely wrong. Trying to unpick why two grammatically different phrases sound the same in English is just one tiny example of the myriad misconceptions and misunderstandings that children can acquire and that can cause problems further down the line. What’s great about one-to-one tutoring, of course, is that these kinds of misconceptions can be uncovered, unpicked and rectified.
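For readers who like to see the point laid out schematically, here is a tiny, purely illustrative sketch in Python (the glosses are mine) of the idea that the ending, not the word order, carries the grammatical role – and of how two quite different constructions can both come out as “to the woman” in English:

```python
# A toy lookup, purely illustrative: each form of "femina" ("woman") carries
# its grammatical role in its ending rather than in its position.
FORMS = {
    "femina":     {"role": "nominative (subject)",     "gloss": "the woman (doing the action)"},
    "feminam":    {"role": "accusative (object)",      "gloss": "the woman (receiving the action)"},
    "ad feminam": {"role": "preposition + accusative", "gloss": "to/towards the woman (motion towards)"},
    "feminae":    {"role": "dative (indirect object)", "gloss": "to/for the woman (giving something to)"},
    # NB: "feminae" can also be genitive singular or nominative plural;
    # only the dative sense discussed above is shown here.
}

for form, info in FORMS.items():
    print(f"{form:<12} {info['role']:<28} {info['gloss']}")
```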

Due to its inflection, many Latin words become extremely difficult to recognise as they decline or conjugate. This brings us to what many students find the most disheartening thing about the subject, which is vocabulary learning. If a student has worked hard to learn the meaning of a list of words, imagine their disappointment and frustration when this effort bears no fruit for them when it comes to translating. A child may have learned that do means “give” but will they recognise dant, dabamus or dederunt, which are all versions of that same verb? Well, without explicit instruction, lots of practice and a huge amount of support, probably not. This can be really depressing for students and can lead to them wanting to give up altogether, which is where a tutor comes in.
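Purely as an illustration of that gap between the headword a student learns and the forms they actually meet, the same idea can be sketched as a minimal lookup table (the forms and meanings are the ones mentioned above):

```python
# A minimal "lemmatisation" table: every inflected form points back to the
# dictionary headword the student actually learned.
LEMMA = {
    "do":       ("do", "I give"),
    "dant":     ("do", "they give"),
    "dabamus":  ("do", "we were giving"),
    "dederunt": ("do", "they gave / have given"),
}

def look_up(word: str) -> str:
    headword, meaning = LEMMA.get(word, ("?", "unrecognised form"))
    return f"{word} -> headword '{headword}': {meaning}"

for w in ("dant", "dabamus", "dederunt"):
    print(look_up(w))
```

Without something playing the role of that table in their heads, the hard-won vocabulary list simply fails to fire.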

Another consequence of the fact that Latin is inflected is that a Latin sentence has to be decoded – you can’t just read it from left to right. Breaking the habit of reading from left to right is one of the biggest challenges that we face when trying to teach students how to succeed in Latin. Even when a child has worked hard to learn all of their noun endings and all of their verb endings, they still need a huge amount of support and scaffolding to show them how to process these and map them onto the sentences in front of them. Most Latin teachers really underestimate the amount of time, effort and repetition that it takes to help them to break this habit. Once again, this is where one-to-one tuition can be really powerful: working with a child to model the process is key.

A picture is worth a thousand words

One of the most disquieting things about the world in which we find ourselves is that our eyes and ears can be deceived. In my younger years, I never would have believed this possible. I was raised on movies such as Clash of the Titans and One Million Years BC, so the idea that special effects would ever become genuinely convincing seemed ludicrous to me. Yet fast forward a few years to the advent of CGI and I was witnessing films such as Jurassic Park, so I guess I should have seen the next stage coming.

The advent of AI is genuinely unsettling. We already inhabit a world in which it is possible to make anyone say anything. Take an image of someone famous, take their voice, feed it into the right kind of software managed by someone with decent skills and bingo – you’ve got Sadiq Khan saying “I control the Met police and they will do as the London Mayor orders” and suggesting that the Armistice Day memorial service be moved in order to make way for a pro-Palestinian march to take place. Even Sadiq Khan himself agreed that it sounded exactly like him. It kind of was him. Only he never said it. So this is where we are.

One of the earliest cases of mass hysteria over a fake photograph took place in the early 1900s, when nine-year-old Frances Griffiths and her mother – both newly arrived in the UK from South Africa – were staying with Frances’s aunt Polly and cousin Elsie, in the village of Cottingley in West Yorkshire. Elsie and Frances played together in the valley at the bottom of the garden, and said that they went there to see the fairies. To prove it, Elsie borrowed her father’s camera and dark room and produced a series of photographs of the two of them … well … playing with fairies. While Elsie’s father immediately dismissed the photographs as a fake and a prank on the part of the girls, Elsie’s mother was convinced by them. The pictures came to the attention of writer Sir Arthur Conan Doyle, who used them to illustrate an article on fairies he had been commissioned to write for the Christmas 1920 edition of The Strand Magazine. As a spiritualist, like many thinking men of his time, Doyle was convinced by the photographs, and interpreted them as clear and visible evidence of psychic phenomena. He was not alone.

One of five photographs taken by Elsie Wright (1901–1988) and Frances Griffiths (1907–1986)

This week’s horrendously doctored photograph of the Princess of Wales and her three children was an extraordinary example of incompetence on the part of the Royal family’s comms team. Into a world which is already somewhat alive with gossip about the welfare and whereabouts of the princess, they released a photograph so badly edited that it was immediately withdrawn by major news outlets such as Reuters and the Associated Press. Reporting in the mainstream media has remained broadly philosophical and seems to accept claims by the palace that Kate herself had simply been messing about with Photoshop, that the photograph is a poorly-executed mash-up of several frames. Those of us that hang around on the internet, however, will know that the incident has sent the world into meltdown, with theories as to the whereabouts and welfare of the princess going wild. Many people believe that the botched photograph is a complete fake that proves Kate is currently either unwilling or unable to be photographed for real. Some have even convinced themselves that she is dead.

In a world where trust in authorities is becoming more and more eroded, I wonder whether the advent of AI will make us more and more afraid. I am a little afraid myself. Photographs used to be damning evidence and fake versions of them so obvious that they held no sway in a court of law or in the court of (reasoned) public opinion. These days, not only can convincing photographs be easily faked, but this fact opens up what is perhaps an even more frightening prospect: that anyone will be able to get away with anything, simply by claiming that the evidence as to their guilt is faked. Caught me with my hands in the till? It’s a fake. Caught me on camera with the secretary, darling? It’s a fake. Caught me attacking that person? It’s a fake. I find myself wondering how any of us will ever be sure of anything in the future.

The reluctant Luddite

I am anything but a Luddite. Technology is remarkable and wonderful and I could not be luckier to have been born in the late 20th century and have the privilege of seeing our access to the written word proliferate thanks to the digital world.

As someone cursed with poor (and increasingly deteriorating) eyesight, I thank my lucky stars on a daily basis for the advent of smart screens, giving me the power to choose the nature, size and resolution of fonts, not to mention the simply glorious dawn of the audiobook. The younger among you will not recall, but the reading options for people with poor eyesight even just 20 years ago were dismal: a vanishingly small number of books were put onto audio CD and very few places stocked them. These days, the best actors are squabbling over the reading rights to books. Not long ago, I listened to a simply perfect narration of The Dutch House by Ann Patchett, read by some chap called Tom Hanks. In a world where current research seems to indicate a worrying downturn in children reading for pleasure, I support any and all routes for them to access stories and tales, by whatever means.

As a result of all this, I always feel slightly uncomfortable when I find myself making a case against digital technology. I am the last person to criticise, for I acknowledge and appreciate the huge benefits that the advent of the internet and digital technology have brought to me. Not only could I not do my job without them, but my life would also be infinitely poorer and less diverse. Yet one must always be cautious of what one is throwing away, and when it comes to children’s development of literacy we should be particularly so. First and foremost, we should be hyper-focused on the best ways of helping children to learn to read and write.

In January, the Guardian highlighted that “a ground-breaking study shows kids learn better on paper than on screen,” but the truth is that this information has been out there for at least two decades. Modern cognitive science shows that motor and sensory aspects of our behaviour have a far-reaching impact on our knowledge and recall. Of course they do. Our brain is an embodied phenomenon that makes sense of the world through the physical data it receives. In a study carried out way back in 2005, subjects were shown a series of words and asked to indicate whether each word was positive or negative by moving a joystick. Half of the subjects were told to indicate that a word was positive or “good” by pulling the joystick towards their bodies, while the other half were told to indicate “good” by pushing it away. A consistent correlation was observed between meaning and movement: the quickest, most accurate and most confident responses were produced by the subjects who were told to indicate “good” by pulling the joystick towards themselves, and to indicate “bad” by pushing it away. The hypothesis is that this relates to our natural embodied state – what’s “good” is something we naturally draw towards ourselves, what’s “bad” is something we naturally push away. This direct and inherent involvement of the body and senses in our cognitive processes helps to explain how writing by hand (as opposed to on a keyboard or a tablet) helps us to learn letters and words most efficiently. The fact that forming letters by hand is superior to doing so with the use of technology is well accepted among cognitive scientists and literacy specialists.

Furthermore, it is not just the early-years essentials of learning to write that are supported by the process of hand-writing. A study in 2021 compared subjects’ recall of words learned either by typing or writing by hand and found that recall was better when words had been learned using a pen and paper. In another study, a small group of adults learned symbols from an unfamiliar language that they then had to reproduce with either a pen or a keyboard. When they had finished learning the symbols, there were no differences in recall between the two methods, but the keyboard users forgot a significant amount of what they had learned as time passed. In other words, the process of handwriting the symbols was much more effective for long-term recall. Evidence for the effectiveness of handwriting over typing when it comes to learning is now pretty overwhelming and neuroscientists suggest that learning with a pen and paper is better because it is more “embodied,” meaning that it involves more complex sensory-motor feedback for each letter as it is written down. This complexity leaves a more distinctive blueprint in our memories and hence makes things easier to memorise and recall.

I have written before on a methodology I teach to help students to learn their set texts off by heart. The process involves writing down the first letter of each word and works only if students do so by hand. The effectiveness of the method is increased hugely if the student can be persuaded to say the whole word aloud as they write the letter. So, to learn the opening line of Portia’s speech to Shylock in The Merchant of Venice, students would say out loud “The quality of mercy is not strained” while writing the letters “T q o m i n s” in time with their articulation of the words. The physicality of the process and the immersive nature of writing, saying and repeating is quite remarkably powerful and I have never had a student fail to learn the texts using this method.
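The first-letter reduction itself is mechanical enough to be sketched in a few lines of code – though it is the handwriting and the speaking aloud that do the real work. A minimal, purely illustrative sketch in Python:

```python
import re

def first_letters(line: str) -> str:
    """Reduce a line of text to the initial letter of each word,
    as in the memorisation method described above."""
    words = re.findall(r"[A-Za-z']+", line)
    return " ".join(word[0] for word in words)

print(first_letters("The quality of mercy is not strained"))
# -> T q o m i n s
```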

The data and current research on the importance of physical texts and handwriting have not gone unnoticed. Sweden, a country often cited as superior to ours when it comes to education, experienced a downtrend in literacy levels from 2016 onwards and is now back-pedalling wildly on its roll-out of digital technology in schools, returning to a focus on physical books and handwriting. What’s worrying for me is that the trend may be going in the opposite direction in the UK. Perhaps most worrying of all, the major examination boards have all indicated their desire to move towards digital examinations, despite the overwhelming chorus of dismay from Headteachers across the country who know that they simply do not have the infrastructure to support such a move. It is unsurprising that examination boards want to push the digital model, as the current process of collecting and digitising examination scripts no doubt costs them a fortune; but beyond the logistical nightmare for schools that the digitisation of examinations will present, I genuinely fear for the impact on students’ literacy and understanding. A move towards digital examinations will push schools further down the road of letting students do everything on screen (many private schools and well-funded academies are already there) and the effect on their learning will be catastrophic. Some of the students I work with are already in this position and their grasp of the texts they are learning is woeful; their teachers allow them access to a simply overwhelming number of documents, all of which they are expected to have the skills to access and draw information from, when in reality they have little to no idea what’s actually in front of them and how that relates to what they need to commit to memory.

So I find myself a somewhat reluctant Luddite, telling my students to reach for a notepad and pen and encouraging them to form letters on a page by hand. The irony in the fact that I am doing so over Zoom is not lost on me, but here’s the thing: technology is incredible, it is life-changing, it is illuminating, it is wonderfully democratic and a great leveller for those of us with physical disabilities. We must, however, be circumspect with how we use it and thus ensure that we do not unwittingly lose more than we gain.

Julius Caesar and the longest Leap Year in history

When it came to taking charge of chaotic situations, Julius Caesar did not mess about. The stories surrounding his courage on the battlefield, his talent for strategic thinking and his downright tenacity are countless, but did you know that tackling the hopelessly disorganised Roman calendar and introducing the concept of the Leap Year was also among Caesar’s claims to fame?

Picture the scene. You’re a farmer in the 1st century BC and – according to the calendar, which revolved around the rituals of state religion – you ought to be doling out ripe vegetables ready for the festivals of plenty. Yet to you and any of your slave-labourers, for whom the passage of the seasons is essential, it is clear that those harvests are months away from fruition. How did this end up happening? Well, the Roman calendar had become so out of sync with astronomical reality that annual festivals were starting to bear little resemblance to what was going on in the real world. Something had to be done.

Julius Caesar wanted to fix the mess but this was no mean feat: it meant shifting the entire Roman empire and all its provinces onto a calendar that was properly aligned with both the rotation of Earth on its axis (one day) and its orbit of the Sun (a year). Caesar’s solution not only created the longest year in history, adding months to the calendar during that year; it also anchored the calendar to the seasons and brought us the leap year. It was a phenomenal task. We are in 46BC, otherwise known as “the year of confusion”.

Centuries prior to Caesar’s intervention, the early Roman calendar was drawn up according to the cycles of the Moon and the agricultural year. Because its origins were agricultural, the calendar had only 10 months in it, starting in spring, with the tenth and final month roughly equivalent to what we now know as December. Six of the months had 30 days, and four had 31 days, giving a total of 304 days. So what about the rest? Well, this is where it gets really weird. For the two “months” of the year when there was no work being done in the fields, those days were simply not counted. The Sun continued to rise and set but – according to the early Roman calendar – no “days” officially passed. As far back as 731BC people realised that this was a little unhinged, and King Numa, the second King of Rome, tried to improve the situation by introducing two extra months to cover that dead winter period. He added 51 days to the calendar, creating what we now call January and February, and this extension brought the calendar year up to 355 days.

If you think that 355 days seems like an odd number, you’d be right. The number took its starting point from the lunar year (12 lunar months), which is 354 days long. However, due to Roman superstitions about even numbers being unlucky, an additional day was added to make a nice non-threatening 355. At the same time, and for the same reason, the months of the year were arranged in such a way that they all had odd numbers of days, except for February, which had 28. February, as a result, was considered to be unlucky and became a time during which the dead were honoured as well as a time of ritual purification.

This all looks like good progress, but it was a situation that still left the Romans around ten days adrift of the solar year, and even with all the improvements made, it remained inevitable that the calendar would gradually become more and more out of sync with the seasons, which are controlled by the Earth’s position in relation to the Sun. By the 2nd century BC things had got so bad that a near-total eclipse of the Sun was observed in Rome in what we would now consider to be mid-March, but it was recorded as having taken place on 11th July.

Increasingly unable to escape the problem, the College of Pontiffs in Rome resorted to inserting an additional month called Mercedonius on an ad-hoc basis to try to realign the calendar. This did not go well, since public officials tended to pop the month in whenever it suited them best politically, without sufficient focus on the goal of re-aligning the calendar with the seasons. According to Suetonius, if anything it made the situation worse: “the negligence of the Pontiffs had disordered the calendar for so long through their privilege of adding months or days at pleasure, that the harvest festivals did not come in summer nor those of the vintage in the autumn”.

During 46BC – Caesar’s year of confusion – there was already an extra Mercedonius month planned for that year. But Caesar’s Egyptian astronomical advisor, Sosigenes, warned that Mercedonius wasn’t going to be enough this time and that something more drastic was required. On the astronomer’s advice, Caesar therefore added another two extra months to the year, one of 33 days and one of 34, to bring the calendar in line with the Sun. These additions created the longest year in history: 15 months, lasting 445 days. Caesar’s drastic intervention brought the calendar back in line with the seasons, meaning that the practice of the ad hoc extra month of Mercedonius could be abandoned.

Of course, getting the calendar to line up with the Sun is one thing; keeping it that way is quite another. As an astronomer, Sosigenes was well aware of the problem. The issue arises from the inconvenient fact that there isn’t a nice round number of days (i.e. Earth rotations) per year (i.e. Earth orbits of the Sun). The number of Earth rotations on each of its trips around the Sun is – I am reliably informed – roughly 365.2421897. Hence the problem and hence the need for a leap year. The Earth fits in almost an extra quarter-turn every time it does a full orbit of the Sun. Sosigenes therefore calculated that adding an extra day every four years – in February – would help to fix the mismatch. It doesn’t completely solve the problem forever, but it was a jolly good stop-gap.
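For anyone who enjoys checking the arithmetic, here is a quick back-of-the-envelope sketch in Python using only the figures quoted above (the roughly 23-day length of the planned Mercedonius is inferred from the 445-day total rather than stated explicitly):

```python
# Back-of-the-envelope checks on the calendar figures quoted above.

TROPICAL_YEAR = 365.2421897   # Earth rotations per orbit of the Sun
OLD_ROMAN_YEAR = 355          # days in the pre-Julian calendar year

# How far the old 355-day calendar drifted against the seasons each year
print(f"Annual drift of the old calendar: {TROPICAL_YEAR - OLD_ROMAN_YEAR:.2f} days")

# Caesar's one-off correction in 46 BC: the ordinary 355 days, plus the
# Mercedonius already planned (about 23 days, given the 445-day total),
# plus the two extra months of 33 and 34 days.
year_of_confusion = 355 + 23 + 33 + 34
print(f"Length of the 'year of confusion': {year_of_confusion} days")   # 445

# The Julian fix: the leftover fraction of a day accumulates to roughly
# one whole day every four years, hence a leap day in February.
leftover = TROPICAL_YEAR - 365
print(f"Leftover per year: {leftover:.4f} days; over four years: {leftover * 4:.2f} days")
```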

Photo by Dan Meyers on Unsplash

Pyramid schemes

For every dubious claim in education, there’s a pyramid. Educationalists love them. Whether it be Bloom’s taxonomy, Maslow’s hierarchy of needs or Dale’s cone of experience (otherwise known as the learning pyramid), it’s got to be presented in that shape, preferably one with a rainbow of colours. A pyramid diagram means it must be true.

Quite how anyone could ever be convinced by statements such as “we recall only 10% of what we read” is fascinating to me. Think about it. We recall only 10% of what we read?! That’s demonstrably ridiculous. This is not the only verifiably false claim I have had presented to me during my 21-year career in the classroom. I’ve listened to countless dubious assertions about how the brain works, made by people who probably struggled to pass their biology O level. I’ve sat through demonstrations of “Brain Gym”, during which I was told that waggling your head back and forward “oxygenates your frontal cortex”. I’ve been told that mind-map diagrams are the best and only way to present information to students because they look a bit like the branch-like structure of brain cells under a microscope. I’ve been told that some children’s brains work better on the left than they do on the right, and that whether they are “left-brained or right-brained” will influence their learning outcomes. These are the kinds of mind-bogglingly ridiculous assertions that were made in schools all over the country while exhausted teachers sat on plastic chairs in draughty halls and listened to them. The insult to our intelligence, never mind the sorry waste of taxpayers’ money on this drivel, makes me feel quite ill.

Yesterday I attended an online presentation given by John Nichols, the President of The Tutors’ Association and someone I worked with when I was a member of the Board of Directors of that Association a few years ago. John is an intelligent man of great integrity and has an excellent working knowledge of educational theory in all its glorious mutations, not all of them for the good. He took us on a whistlestop tour of some enduring ideas from psychologists in the 1950s, through the persistent neuromyths that have been debunked a thousand times but just won’t die, right up to the useful stuff at last being brought to us by neuroscientists about working memory, cognitive load and schema theory. It is truly heartening to know that this kind of information is being shared with tutors who are members of the Association and with luck it will start to filter through and influence the way people work.

Teachers are a cynical bunch and it would be easy for those of us who have spent years drowning in a tsunami of nonsense to be cynical about the more recent developments in educational theory. I am not and here’s why: they’re applicable to learning at a practical level and they work. When you apply the key principles of retrieval practice and spaced learning, you see an immediate and dramatic improvement in learning outcomes for your students. When you bear in mind cognitive load and attempt to reduce the pressure on students’ working memory in the classroom, you likewise see results. None of this was true of the old stuff, which caused nothing but obfuscation and distraction in the classroom. Even when I first joined the profession as a rookie and was regrettably at my most susceptible, there was a little voice in my head telling me that this stuff was – to borrow the phrase of my old Classics master – a load of old hooey.

A part of me wishes that I’d listened to that voice sooner, but I should not be too hard on my former self, I think: it is difficult to stand against a tidal wave of so-called information when your bosses are telling you it’s all real and are also telling you that you’ll be marked down as a bad teacher if you don’t dance to their tune. When I think about the wasted hours I spent in my career trying to apply principles that were clearly nonsense because I was told to, I could weep. All of that time could have been so much better spent.

Happily, nobody now dictates to me how I work. I apply the principles that are evidence-based and work for my students. The overwhelming majority of them respond readily. For some, the simplest of techniques can feel like a revelation or a miracle, which only serves to show how far some schools have yet to go in getting this information through to their frontline teachers. To be honest, I am sympathetic to schools who remain suspicious about advice on how children learn. You can only try and sell people so many pyramid schemes before they develop a pretty cynical attitude towards any kind of salesman.

Photo by Gaurav D Lathiya on Unsplash

First, do no harm

primum non nocere: first, do no harm.

A central tenet of the Hippocratic oath

As Tom Bennett OBE wrote on the platform formerly known as Twitter this week, “Even qualified practitioners are bound to ‘do no harm’. But the desire to support children leads many schools to well-meant but potentially damaging mental health ‘interventions’.”

This week I have listened to a quite horrifying piece of investigative journalism by the Financial Times into Goenka mindfulness retreats, at which attendees are encouraged to practise an extreme kind of meditation known as Vipassana. People on the retreat are not allowed to speak and strongly discouraged from leaving for 10 days. They are awakened at 4.00am, deprived of food and taught to meditate for multiple hours per day. Anyone who struggles with the process or becomes confused or distressed is encouraged to keep meditating. For those of you with even the most basic grasp of mental health and wellbeing, it will not come as a massive shock to discover that some people are affected very negatively by this process. I recommend you listen to the podcast but please be aware that it does not shy away from some very difficult material: there are people who have lost their loved ones to this process.

Human beings are social animals. We have evolved to live in groups and we know that extreme social isolation and withdrawal have a very negative effect on mental health and wellbeing in an extremely short time. The dangerous impact of solitary confinement is well-documented and has caused neuroscientists to campaign against its prolonged use in the penal system. Even good old-fashioned and ever-familiar loneliness has been proved to have a significant impact on a person’s health and longevity, never mind their psychological well-being. It should not surprise us in the least to discover that a process which demands people shut themselves off from each other and concentrate entirely and exclusively on what’s inside their own heads carries the risk of a psychotic break.

As part of my studies during my degree in Classics I did a course on the rise of Christianity in the Roman world. I recall reading an account of the life of St Antony by the Bishop Athanasius and being particularly struck by a passage that reports upon his demeanour when leaving a fortress in which he had shut himself for 20 years in order to commune with God and battle his demons. It reads as follows:

“Antony, as from a shrine, came forth initiated in the mysteries and filled with the spirit of God. Then for the first time he was seen outside the fort by those who came to see him. And they, when they saw him, wondered at the sight, for he had the same habit of body as before … but his soul was free from blemish, for it was neither contracted as if by grief, nor relaxed by pleasure, nor possessed by laughter or dejection. For he was not troubled when he beheld the crowd, nor overjoyed at being saluted by so many.”

While I do not wish to mock or offend anyone’s deeply-held beliefs, it seems pretty clear to me that this is a description of someone who has completely detached from other human beings and is suffering from the psychological effects of that process. While the religiously-minded among you may see this as an account of someone in touch with the holy spirit, I see it as an account of someone who is suffering from a psychotic break. Antony is described as being unmoved by and disconnected from the people around him, in possession of a strange kind of detachment. Given that he had spent 20 years in isolation while – in his mind – battling between good and evil, this is not greatly surprising.

During my final few years in mainstream education there was a big push on “mindfulness” for all students. This was what Tom Bennett was referring to in the Tweet I quoted at the start of this blog and I share his concerns about this growing trend. The mental health of young people is a painful and emotive issue and has been brought into sharp relief once again with calls from a grieving mother asking for mindfulness to be rolled out across all state schools (although it is already being promoted and practised in many). As Daniel Bundred wrote on the same platform as Tom a few months ago, “Schools probably shouldn’t do mindfulness, because most teachers are fundamentally unqualified to lead mindfulness, and entirely unequipped to deal with the potential outcomes of it.” As he puts it, “Mindfulness strikes me as being very similar to guided meditation in approach and potentially outcome; how many teachers could handle a student experiencing ego-death in their classroom? Ego-death is a potential outcome of successful meditation, it’s not desirable in tutor time.” Daniel here is referencing exactly the kind of experience undergone by the young people who suffered a psychotic break at the Goenka retreats. This is of course the worst-case scenario and, while not widespread, it is crucially important to consider if we are to stick to the concept of “do no harm”; the advocates of the Goenka retreat point to the many people who say that meditation has helped them, as if the handful of attributable deaths are therefore irrelevant. It is essential to remember that teachers (like the volunteers at the Goenka retreats) are not mental health experts; fiddling about with something as potentially profound and intimate as mindfulness or meditation is profoundly dangerous and goes way beyond the remit of educators.

Beyond the enormous risk of potential harm to a student who may have experienced past trauma or may simply not be an appropriate candidate for mindfulness for a variety of reasons, there is an increasing amount of evidence indicating that mindfulness in schools does no good for anybody. A recent study revealed no tangible positive outcomes, which places the profound risk of harm to some in an even more alarming context. Why are we doing something with risks attached to it when there are no estimable benefits anyway? Beyond this, why are we demanding that teachers expend their time and energy on something unproven and valueless?

Tom Bennett is right. As he puts it: “The best way to support children’s mental health in a school environment? Provide a culture that is safe, calm and dignified. With purposeful activities.” In our desperation to support the most vulnerable of children, we must never forget the simple power of providing routine, stability and boundaries for those whose personal and emotional lives may well (for all we know) be dominated by chaos, trauma and distress. The more we acknowledge that some children face the most horrifying of circumstances, the more essential the security of our education system becomes. School and the reassurance that its stability provides is a lifeline for many of our children. This is what we should be providing for them.

Photo by Colton Sturgeon on Unsplash

False judgements

Emotions got a bad rap from ancient philosophers. Most agreed that the ideal state was a kind of calmness that the Hellenistic philosophers (most famously the Epicureans and the Stoics) called ataraxia. There was even talk of apatheia – a detachment from the chaos of feelings and overwhelm. This is perhaps unsurprising if you understand the birth of western philosophy; if you’re trying to formulate, define and distil the key to the perfect life and the perfect society (which is what the early founders of western philosophy were trying to do) then it probably doesn’t include your citizens experiencing a rollercoaster of emotions. Once you’ve admitted that emotions are a bit of a distraction and often cause issues both on a personal level and for society, it’s not much of an overreach to find yourself arguing for a state of detachment.

The term “stoic” these days is synonymous with having a “stiff upper lip” but this is based on a crucial misunderstanding of the Stoic position. The Stoics did not advocate for iron-clad self-control or suppressing your feelings. Rather, they believed that all emotions were what they called “false judgements”, which meant that they were based on a misunderstanding: if you’re feeling them, you’re still getting it wrong. In the ideal philosophical life that they strove for, a person would have such a great understanding of himself, the world and his place within it that he would not suffer at the slings and arrows of outrageous fortune: he would simply nod and know the right thing to do. One example given is that a Stoic would run into a burning building in order to attempt to save a child because that is the right thing to do; they also argued, however, that a true Stoic would feel no distress when his mission failed. Weird, isn’t it? Interesting, though.

One of the frustrating things about this period of philosophy is that much of the writing that we have consists of general “sayings”, snippets or purported quotations which appear in the works of later authors, usually writing in Latin rather than in Greek, and reporting on what a particular thinker or school of thinkers believed. The reality of this, of course, is that they may be wrong. For example, there is a famous quotation attributed to Epicurus that states “the wise man is happy on the rack”. Quite how this works within a school of philosophy that was dedicated to the avoidance of pain is puzzling. If the quotation is correct, our best guess is that the Epicureans certainly spent a lot of their time considering the correct attitude towards unavoidable pain, for this was one of the biggest challenges to their philosophical position; presumably the “wise man” – someone at the pinnacle of philosophical endeavour – would know how to cope with pain in extremis.

Most people see Epicureanism and Stoicism as polar opposites and they were indeed rival schools of philosophy at the time. As so often, however, there was more that united them than divided them. Both schools were arguing and aiming for the perfect life and the state of detachment that philosophers before them had explored; both schools were concerned with how to manage our responses to pain and distress. Perhaps the biggest difference is that the Stoics believed in proactive, conscious and deliberate involvement in society and its structures, whereas the Epicureans were a bit more lethargic about the whole idea – getting involved with politics is painful and distressing, so is it really rational to bother?

One philosopher, writing before the Stoics and the Epicureans, was unusual in his take on emotions. Aristotle argued that emotions were appropriate and necessary: the trick was understanding when and how you should be feeling them and what to do with them. He spoke of “righteous anger” and argued that a good philosopher would indeed feel such a thing. It is difficult to convey how truly radical this position was, given that the philosophical movement as a whole was drifting towards ataraxia and apatheia. Aristotle also smashed through the Socratic idea that philosophical ideals such as “courage” and “justice” could be defined in one way and that if one could not do so then one lacked an understanding of them. Aristotle argued that there were multiple forms of “courage” and “justice” and that nobody could define them in one simple way nor apply their principles in individual cases without discussion, debate and compromise. What a genius he was.

Why the hell am I writing about this? Well, I spoke to a friend yesterday who has taken a decision about which she feels guilty. I cannot divulge the details of this decision as I do not want to betray her confidence. Suffice to say that it was a professional decision, the right decision and one which the people affected will hopefully benefit from in the long-run. There is no doubt – in my mind and even in hers – that the decision was right and good. Yet she still feels what she describes as “guilty” about it.

This reminded me yet again of The Greeks and the Irrational by ER Dodds, a book written in the 1950s, which I mentioned in another blog a few weeks ago. One of the chapters in the book argues that the Athenian world was a “shame culture” and that later ancient societies – the Hellenistic world and the Roman worlds – began the shift towards a “guilt culture”. I have thought about this on and off all of my life. The very thought that the nature of one’s emotions can be dictated by the society in which one grows up is fascinating to me. Dodds argues (rightly, I think) that modern society is more person-centric and hence feelings such as guilt can be internalised; in Athens, one’s personal standing and engagement with society was more relevant (a symptom perhaps of living in a small and emergent city-state) and therefore a sense of shame before others was more powerful than any kind of internalised guilt.

As I listened to my friend who left me some WhatsApp voice messages (I love them – it’s like receiving a personalised podcast!) I found myself wondering whether the Stoics had it right. Sometimes emotions truly are false judgements. My friend has no reason to feel guilty about her actions and she should strive to release herself from the false state of mind in which this feeling distresses her. According to the Stoic ideal she has prevailed in her actions but has not yet achieved the ideal state of detachment. So how should she achieve this goal? Well, I guess it depends on your approach to these things. A Stoic would advocate for rigorous rational analysis and say that this will eventually lead to release from one’s feelings. This is not, in fact, a million miles away from cognitive behavioural therapy, the therapy model supported by psychiatrists and many psychologists, who would say that she needs to question why she feels guilty and challenge her reasons for doing so. A psychologist with leanings towards the psychodynamic model would argue that she needs to explore where her feelings might stem from – does the situation remind her of experiences in her past, during which she has been made to feel or to carry guilt that perhaps should not have been hers? (Pretty sure the Stoics wouldn’t have been up for that one).

Whatever the answer in this particular circumstance, personally I find myself returning to the Stoics time and again. They were a fascinating turning point in philosophical history and paved the way – I believe – towards modern psychiatry. After all, what is the difference between sanity and insanity if not the difference between the rational and the irrational, the true and the untrue, the controlled and the uncontrolled? I will leave you with the Stoic image of how the individual should relate to society – not because I advocate for it, necessarily, but because it’s a classic and a model I have never stopped thinking about since I first learned about it in the 1990s. The Stoics believed that individuals could not control fate but they also argued that individuals had free will. So an individual person is like a dog tied to the back of a wagon. Whatever the dog’s actions, the wagon will go on its way. So how does the dog have free will? Well, he can resist the wagon and be dragged along, impeding the wagon’s progress and damaging himself along the way. Alternatively, he can trot along like a good dog and help the wagon to proceed smoothly.

This incredible photo is by Jaseel T on Unsplash.
It was taken in the Museum of the Future in Dubai