Julius Caesar and the longest Leap Year in history

When it came to taking charge of chaotic situations, Julius Caesar did not mess about. The stories surrounding his courage on the battlefield, his talent for strategic thinking and his downright tenacity are countless, but did you know that tackling the hopelessly disorganised Roman calendar and introducing the concept of the Leap Year was also among Caesar’s claims to fame?

Picture the scene. You’re a farmer in the 1st century BC and – according to the calendar, which revolved around the rituals of state religion – you ought to be doling out ripe vegetables ready for the festivals of plenty. Yet to you and any of your slave-labourers, for whom the passage of the seasons is essential, it is clear that those harvests are months away from fruition. How did this end up happening? Well, the Roman calendar had become so out of sync with astronomical reality that annual festivals were starting to bear little resemblance to what was going on in the real world. Something had to be done.

Julius Caesar wanted to fix the mess, but this was no mean feat: to shift the entire Roman empire and all its provinces onto a calendar that was properly aligned with both the rotation of the Earth on its axis (one day) and its orbit of the Sun (a year). Caesar’s solution not only created the longest year in history, adding extra months to that single year; it also anchored the calendar to the seasons and brought us the leap year. It was a phenomenal task. We are in 46BC, otherwise known as “the year of confusion”.

Centuries prior to Caesar’s intervention, the early Roman calendar was drawn up according to the cycles of the Moon and the agricultural year. Because its origins were agricultural, the calendar had only 10 months, starting in spring, with the tenth and final month of the year roughly equivalent to what we now know as December. Six of the months had 30 days, and four had 31 days, giving a total of 304 days. So what about the rest? Well, this is where it gets really weird. For the two “months” of the year when there was no work being done in the fields, those days were simply not counted. The Sun continued to rise and set but – according to the early Roman calendar – no “days” officially passed. As far back as 731BC people realised that this was a little unhinged, and King Numa, the second King of Rome, tried to improve the situation by introducing two extra months to cover that dead winter period. He added 51 days to the calendar, creating what we now call January and February, and this extension brought the calendar year up to 355 days.

If you think that 355 days seems like an odd number, you’d be right. The number took its starting point from the lunar year (12 lunar months), which is 354 days long. However, due to Roman superstitions about even numbers being unlucky, an additional day was added to make a nice non-threatening 355. At the same time, and for the same reason, the months of the year were arranged in such a way that they all had odd numbers of days, except for February, which had 28. February, as a result, was considered to be unlucky and became a time during which the dead were honoured as well as a time of ritual purification.

This all looks like good progress, but it was a situation that still left the Romans around 10 days short of the solar year, and even with all the improvements made, it remained inevitable that the calendar would gradually become more and more out of sync with the seasons, which are controlled by the Earth’s position in relation to the Sun. By the 2nd century BC things had got so bad that a near-total eclipse of the Sun was observed in Rome in what we would now consider to be mid-March, but it was recorded as having taken place on 11th July.

Increasingly unable to escape the problem, the College of Pontiffs in Rome resorted to inserting an additional month called Mercedonius on an ad-hoc basis to try to realign the calendar. This did not go well, since public officials tended to pop the month in whenever it suited them best politically, without sufficient focus on the goal of re-aligning the calendar with the seasons. According to Suetonius, if anything it made the situation worse: “the negligence of the Pontiffs had disordered the calendar for so long through their privilege of adding months or days at pleasure, that the harvest festivals did not come in summer nor those of the vintage in the autumn”.

During 46BC – Caesar’s year of confusion – there was already an extra Mercedonius month planned for that year. But Caesar’s Egyptian astronomical advisor, Sosigenes, warned that Mercedonius wasn’t going to be enough this time and that drastic action was needed. On the astronomer’s advice, Caesar therefore added another two extra months to the year, one of 33 days and one of 34, to bring the calendar in line with the Sun. These additions created the longest year in history: 15 months, lasting 445 days. Caesar’s drastic intervention brought the calendar back in line with the seasons, meaning that the practice of the ad hoc extra month of Mercedonius could be abandoned.

Of course, getting the calendar to line up with the Sun is one thing; keeping it that way is quite another. As an astronomer, Sosigenes was well aware of the problem. The issue arises from the inconvenient fact that there isn’t a nice round number of days (i.e. Earth rotations) per year (i.e. Earth orbits of the Sun). The number of Earth rotations on each of its trips around the Sun is – I am reliably informed – roughly 365.2421897. Hence the problem and hence the need for a leap year. The Earth fits in almost an extra quarter-turn every time it does a full orbit of the Sun. Sosigenes therefore calculated that adding an extra day every four years – in February – would help to fix the mismatch. It doesn’t completely solve the problem forever, but it was a jolly good stop-gap.
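If you fancy seeing the arithmetic behind that stop-gap, here is a minimal sketch (in Python, purely my own illustration rather than anything handed down from Sosigenes) of how the leftover 0.2422 of a day accumulates if you ignore it, and how the Julian rule of one leap day every four years keeps the drift small without removing it entirely.

```python
# Approximate length of the tropical year in days: Earth rotations per orbit of the Sun.
TROPICAL_YEAR = 365.2421897

def drift_after(years, leap_every=0):
    """Days by which a 365-day calendar lags the seasons after `years` years.

    leap_every=0 means no leap days at all; leap_every=4 is the Julian rule
    of one extra day every four years.
    """
    calendar_days = 365 * years
    if leap_every:
        calendar_days += years // leap_every  # one extra day per leap cycle
    return years * TROPICAL_YEAR - calendar_days

print(round(drift_after(100), 2))     # 24.22  -> over three weeks adrift per century
print(round(drift_after(100, 4), 2))  # -0.78  -> the Julian rule slightly overcorrects
```

Without the leap day the calendar loses more than three weeks a century; with it, the error is under a day per century – though that small overcorrection is also why, many centuries later, the Gregorian reform had to trim the rule a little further.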

Photo by Dan Meyers on Unsplash

Pyramid schemes

For every dubious claim in education, there’s a pyramid. Educationalists love them. Whether it be Bloom’s taxonomy, Maslow’s hierarchy of needs or Dale’s cone of experience (otherwise known as the learning pyramid), it’s got to be presented in that shape, preferably one with a rainbow of colours. A pyramid diagram means it must be true.

Quite how anyone could ever be convinced by statements such as “we recall only 10% of what we read” is fascinating to me. Think about it. We recall only 10% of what we read?! That’s demonstrably ridiculous. This is not the only verifiably false claim I have had presented to me during my 21-year career in the classroom. I’ve listened to countless dubious assertions about how the brain works, made by people who probably struggled to pass their biology O level. I’ve sat through demonstrations of “Brain Gym”, during which I was told that waggling your head back and forward “oxygenates your frontal cortex”. I’ve been told that mind-map diagrams are the best and only way to present information to students because they look a bit like the branch-like structure of brain cells under a microscope. I’ve been told that some children’s brains work better on the left than they do on the right, and that whether they are “left-brained or right-brained” will influence their learning outcomes. These are the kinds of mind-bogglingly ridiculous assertions that were made in schools all over the country while exhausted teachers sat on plastic chairs in draughty halls and listened to them. The insult to our intelligence, never mind the sorry waste of taxpayers’ money on this drivel, makes me feel quite ill.

Yesterday I attended an online presentation given by John Nichols, the President of The Tutors’ Association and someone I worked with when I was a member of the Board of Directors of that Association a few years ago. John is an intelligent man of great integrity and has an excellent working knowledge of educational theory in all its glorious mutations, not all of them for the good. He took us on a whistlestop tour of some enduring ideas from psychologists in the 1950s, through the persistent neuromyths that have been debunked a thousand times but just won’t die, right up to the useful stuff at last being brought to us by neuroscientists about working memory, cognitive load and schema theory. It is truly heartening to know that this kind of information is being shared with tutors who are members of the Association and with luck it will start to filter through and influence the way people work.

Teachers are a cynical bunch and it would be easy for those of us who have been swept along by the tsunami of nonsense over the years to be cynical about the more recent developments in educational theory. I am not, and here’s why: they’re applicable to learning at a practical level and they work. When you apply the key principles of retrieval practice and spaced learning, you see an immediate and dramatic improvement in learning outcomes for your students. When you bear in mind cognitive load and attempt to reduce the pressure on students’ working memory in the classroom, you likewise see results. None of this was true of the old stuff, which caused nothing but obfuscation and distraction in the classroom. Even when I first joined the profession as a rookie and was regrettably at my most susceptible, there was a little voice in my head telling me that this stuff was – to borrow the phrase of my old Classics master – a load of old hooey.

A part of me wishes that I’d listened to that voice sooner, but I should not be too hard on my former self, I think: it is difficult to stand against a tidal wave of so-called information when your bosses are telling you it’s all real and are also telling you that you’ll be marked down as a bad teacher if you don’t dance to their tune. When I think about the wasted hours I spent in my career trying to apply principles that were clearly nonsense because I was told to, I could weep. All of that time could have been so much better spent.

Happily, nobody now dictates to me how I work. I apply the principles that are evidence-based and work for my students. The overwhelming majority of them respond readily. For some, the simplest of techniques can feel like a revelation or a miracle, which only serves to show how far some schools have yet to go in distilling this information to their frontline teachers. To be honest, I am sympathetic to schools that remain suspicious about advice on how children learn. You can only try and sell people so many pyramid schemes before they develop a pretty cynical attitude towards any kind of salesman.

Photo by Gaurav D Lathiya on Unsplash

First, do no harm

primum non nocere: first, do no harm.

A central tenet of the Hippocratic oath

As Tom Bennet OBE wrote on the platform formerly known as Twitter this week, “Even qualified practitioners are bound to ‘do no harm’. But the desire to support children leads many schools to well-meant but potentially damaging mental health ‘interventions’.”

This week I have listened to a quite horrifying piece of investigative journalism by the Financial Times into Goenka mindfulness retreats, at which attendees are encouraged to practise an extreme kind of meditation known as Vipassana. People on the retreat are not allowed to speak and strongly discouraged from leaving for 10 days. They are awakened at 4.00am, deprived of food and taught to meditate for multiple hours per day. Anyone who struggles with the process or becomes confused or distressed is encouraged to keep meditating. For those of you with even the most basic grasp of mental health and wellbeing, it will not come as a massive shock to discover that some people are affected very negatively by this process. I recommend you listen to the podcast but please be aware that it does not shy away from some very difficult material: there are people who have lost their loved ones to this process.

Human beings are social animals. We have evolved to live in groups and we know that extreme social isolation and withdrawal have a very negative effect on mental health and wellbeing in an extremely short time. The dangerous impact of solitary confinement is well-documented and has caused neuroscientists to campaign against its prolonged use in the penal system. Even good old-fashioned and ever-familiar loneliness has been proved to have a significant impact on a person’s health and longevity, never mind their psychological well-being. It should not surprise us in the least to discover that a process which demands people shut themselves off from each other and concentrate entirely and exclusively on what’s inside their own head carries the risk of a psychotic break.

As part of my studies during my degree in Classics I did a course on the rise of Christianity in the Roman world. I recall reading an account of the life of St Antony by the Bishop Athanasius and being particularly struck by a passage that reports upon his demeanour when leaving a fortress in which he had shut himself for 20 years in order to commune with God and battle his demons. It reads as follows:

“Antony, as from a shrine, came forth initiated in the mysteries and filled with the spirit of God. Then for the first time he was seen outside the fort by those who came to see him. And they, when they saw him, wondered at the sight, for he had the same habit of body as before … but his soul was free from blemish, for it was neither contracted as if by grief, nor relaxed by pleasure, nor possessed by laughter or dejection. For he was not troubled when he beheld the crowd, nor overjoyed at being saluted by so many.”

While I do not wish to mock or offend anyone’s deeply-held beliefs, it seems pretty clear to me that this is a description of someone who has completely detached from other human beings and is suffering from the psychological effects of that process. While the religiously-minded among you may see this as an account of someone in touch with the holy spirit, I see it as an account of someone who is suffering from a psychotic break. Antony is described as being unmoved by and disconnected from the people around him, in possession of a strange kind of detachment. Given that he had spent 20 years in isolation while – in his mind – battling between good and evil, this is not greatly surprising.

During my final few years in mainstream education there was a big push on “mindfulness” for all students. This was what Tom Bennet was referring to in the Tweet I quoted at the start of this blog and I share his concerns about this growing trend. The mental health of young people is a painful and emotive issue and has been brought into sharp relief once again with calls from a grieving mother asking for mindfulness to be rolled out across all state schools (although it is already being promoted and practised in many). As Daniel Bundred wrote on the same platform as Tom a few months ago, “Schools probably shouldn’t do mindfulness, because most teachers are fundamentally unqualified to lead mindfulness, and entirely unequipped to deal with the potential outcomes of it.” As he puts it, “Mindfulness strikes me as being very similar to guided meditation in approach and potentially outcome; how many teachers could handle a student experiencing ego-death in their classroom? Ego-death is a potential outcome of successful meditation, it’s not desirable in tutor time.” Daniel here is referencing exactly the kind of experiences suffered by the young people who underwent a psychotic break at the Goenka retreats. This is of course the worst-case scenario and, while not widespread, it is crucially important to consider if we are to stick to the concept of “do no harm”; the advocates of the Goenka retreat point to the many people who say that meditation has helped them, as if the handful of attributable deaths are therefore irrelevant. It is essential to remember that teachers (like the volunteers at the Goenka retreats) are not mental health experts; fiddling about with something as potentially profound and intimate as mindfulness or meditation is profoundly dangerous and goes way beyond the remit of educators.

Beyond the enormous risk of potential harm to a student who may have experienced past trauma or may simply not be an appropriate candidate for mindfulness for a variety of reasons, there is an increasing amount of evidence indicating that mindfulness in schools does no good for anybody. A recent study revealed no tangible positive outcomes, which places the profound risk of harm to some in an even more alarming context. Why are we doing something with risks attached to it when there are no demonstrable benefits anyway? Beyond this, why are we demanding that teachers expend their time and energy on something unproven and valueless?

Tom Bennet is right. As he puts it: “The best way to support children’s mental health in a school environment? Provide a culture that is safe, calm and dignified. With purposeful activities.” In our desperation to support the most vulnerable of children, we must never forget the simple power of providing routine, stability and boundaries for those whose personal and emotional lives may well (for all we know) be dominated by chaos, trauma and distress. The more we acknowledge that some children face the most horrifying of circumstances, the more essential the security of our education system becomes. School and the reassurance that its stability provides is a lifeline for many of our children. This is what we should be providing for them.

Photo by Colton Sturgeon on Unsplash

False judgements

Emotions got a bad rap from ancient philosophers. Most agreed that the ideal state was a kind of calmness that the Hellenistic philosophers (most famously the Epicureans and the Stoics) called ataraxia. There was even talk of apatheia – a detachment from the chaos of feelings and overwhelm. This is perhaps unsurprising if you understand the birth of western philosophy; if you’re trying to formulate, define and distil the key to the perfect life and the perfect society (which is what the early founders of western philosophy were trying to do) then it probably doesn’t include your citizens experiencing a rollercoaster of emotions. Once you’ve admitted that emotions are a bit of a distraction and often cause issues both on a personal level and for society, it’s not much of an overreach to find yourself arguing for a state of detachment.

The term “stoic” these days is synonymous with having a “stiff upper lip” but this is based on a crucial misunderstanding of the Stoic position. The Stoics did not advocate for iron-clad self-control or suppressing your feelings. Rather, they believed that all emotions were what they called “false judgements”, which meant that they were based on a misunderstanding: if you’re feeling them, you’re still getting it wrong. In the ideal philosophical life that they strove for, a person would have such a great understanding of himself, the world and his place within it that he would not suffer at the slings and arrows of outrageous fortune: he would simply nod and know the right thing to do. One example given is that a Stoic would run into a burning building in order to attempt to save a child because that is the right thing to do; they also argued, however, that a true Stoic would feel no distress when his mission failed. Weird, isn’t it? Interesting, though.

One of the frustrating things about this period of philosophy is that much of the writing that we have consists of general “sayings”, snippets or purported quotations which appear in the works of later authors, usually writing in Latin rather than in Greek, and reporting on what a particular thinker or school of thinkers believed. The reality of this of course is that they may be wrong. For example, there is a famous quotation attributed to Epicurus that states “the wise man is happy on the rack”. Quite how this works within a school of philosophy that was dedicated to the avoidance of pain is puzzling. If the quotation is genuine, our best guess is this: the Epicureans spent a lot of their time considering the correct attitude towards unavoidable pain, for this was one of the biggest challenges to their philosophical position; presumably the “wise man” – someone at the pinnacle of philosophical endeavour – would know how to cope with pain in extremis.

Most people see Epicureanism and Stoicism as polar opposites and they were indeed rival schools of philosophy at the time. As so often, however, there was more that united them than divided them. Both schools were arguing and aiming for the perfect life and the state of detachment that philosophers before them had explored; both schools were concerned with how to manage our responses to pain and distress. Perhaps the biggest difference is that the Stoics believed in proactive, conscious and deliberate involvement in society and its structures, whereas the Epicureans were a bit more lethargic about the whole idea – getting involved with politics is painful and distressing, so is it really rational to bother?

One philosopher, writing before the Stoics and the Epicureans, was unusual in his take on emotions. Aristotle argued that emotions were appropriate and necessary: the trick was understanding when and how you should be feeling them and what to do with them. He spoke of “righteous anger” and argued that a good philosopher would indeed feel such a thing. It is difficult to explain how truly radical this position was, when the philosophical movement was drifting towards ataraxia and apatheia. Aristotle also smashed through the Socratic idea that philosophical ideals such as “courage” and “justice” could be defined in one way and that if one could not do so then one lacked an understanding of them. Aristotle argued that there were multiple forms of “courage” and “justice” and that nobody could define them in one simple way nor apply their principles in individual cases without discussion, debate and compromise. What a genius he was.

Why the hell am I writing about this? Well, I spoke to a friend yesterday who has taken a decision about which she feels guilty. I cannot divulge the details of this decision as I do not want to betray her confidence. Suffice to say that it was a professional decision, the right decision and one which the people affected will hopefully benefit from in the long run. There is no doubt – in my mind and even in hers – that the decision was right and good. Yet she still feels what she describes as “guilty” about it.

This reminded me yet again of The Greeks and the Irrational by ER Dodds, a book written in the 1950s, which I mentioned in another blog a few weeks ago. One of the chapters in the book argues that the Athenian world was a “shame culture” and that later ancient societies – the Hellenistic and Roman worlds – began the shift towards a “guilt culture”. I have thought about this on and off all my life. The very thought that the nature of one’s emotions can be dictated by the society in which one grows up is fascinating to me. Dodds argues (rightly, I think) that modern society is more person-centric and hence feelings such as guilt can be internalised; in Athens, one’s personal standing and engagement with society was more relevant (a symptom perhaps of living in a small and emergent city-state) and therefore a sense of shame before others was more powerful than any kind of internalised guilt.

As I listened to my friend who left me some WhatsApp voice messages (I love them – it’s like receiving a personalised podcast!) I found myself wondering whether the Stoics had it right. Sometimes emotions truly are false judgements. My friend has no reason to feel guilty about her actions and she should strive to release herself from the false state of mind in which this feeling distresses her. According to the Stoic ideal she has prevailed in her actions but has not yet achieved the ideal state of detachment. So how should she achieve this goal? Well, I guess it depends on your approach to these things. A Stoic would advocate for rigorous rational analysis and say that this will eventually lead to release from one’s feelings. This is not, in fact, a million miles away from cognitive behavioural therapy, the therapy model supported by psychiatrists and many psychologists, who would say that she needs to question why she feels guilty and challenge her reasons for doing so. A psychologist with leanings towards the psychodynamic model would argue that she needs to explore where her feelings might stem from – does the situation remind her of experiences in her past, during which she has been made to feel or to carry guilt that perhaps should not have been hers? (Pretty sure the Stoics wouldn’t have been up for that one).

Whatever the answer in this particular circumstance, personally I find myself returning to the Stoics time and again. They were a fascinating turning point in philosophical history and paved the way – I believe – towards modern psychiatry. After all, what is the difference between sanity and insanity if not the difference between the rational and the irrational, the true and the untrue, the controlled and the uncontrolled? I will leave you with the Stoic image of how the individual should relate to society – not because I advocate for it, necessarily, but because it’s a classic and a model I have never stopped thinking about since I first learned about it in the 1990s. The Stoics believed that individuals could not control fate but they also argued that individuals had free will. So an individual person is like a dog tied to the back of a wagon. Whatever the dog’s actions, the wagon will go on its way. So how does the dog have free will? Well, he can resist the wagon and be dragged along, impeding the wagon’s progress and damaging himself along the way. Alternatively, he can trot along like a good dog and help the wagon to proceed smoothly.

This incredible photo is by Jaseel T on Unsplash.
It was taken in the Museum of the Future in Dubai

Perchance to dream?

Last night I dreamt that Roald Dahl was in prison. Not exactly “I went to Manderley again” as an opening line, but it’s the truth.

Despite centuries of interest in the subject and recent studies with all the benefits of modern science, dreams are still not fully understood. They are generally acknowledged to be a by-product of evolution and quite possibly the brain’s way of processing and sorting information, but exactly how and why they occur is still debated. Some neuroscientists and psychologists argue that they help us to organise our memories, others suggest that they are part of the important process of forgetting or “dumping” unnecessary clutter from our minds. Some believe that they are a way of safely practising difficult scenarios, and some have even claimed that the frequency of dreams in which we are being chased – particularly in childhood – is evidence for their origins in our early evolutionary history. I’m not sure I buy that, not least because it falls into the trap of believing that everything that evolves does so for an obvious purpose. Dreams may simply be a by-product of our extraordinarily large and complex brain-structures: they may not necessarily be essential or advantageous in the battle of survival and reproduction. One thing’s for sure, it is frequently difficult to explain how a particular story ends up being told in one’s mind overnight; last night, my brain placed a long-dead children’s author behind bars.

Dreams mainly occur while we are in REM sleep, which for adult humans makes up only around two hours per night of our sleep time. Yet some research indicates that a human foetus in utero, by the time it reaches the third trimester, spends around 20 hours out of each 24-hour cycle in REM sleep. Is the foetus dreaming for all of that time? If so, what on earth is it dreaming about and how does that relate to the commonly-accepted idea that dreams are remnants of our thoughts?

When I was doing my PhD I spent an inordinate amount of time going down rabbit holes of research into this kind of thing. The ancient work I studied (which I have written about in a little more detail before) mentions in passing that messages from the gods come to us in the hazy state between sleeping and waking, a state now defined as “hypnagogic” and one into which there has been a considerable amount of research. I became fascinated by the idea of different brain-states and how people may experience phenomena such as auditory hallucinations and thus become convinced that they are receiving messages from a divine source. I read all sorts of stuff written by anthropologists, neurologists and psychologists and realised just how little I knew about the grey matter inside my own skull.

When it comes to studying, one of the things worth knowing about the brain is that “memory is the residue of thought”, meaning that “the more you think about something, the more likely it is that you’ll remember it later” (Daniel T. Willingham). This might seem obvious but you wouldn’t believe how little consideration is given to this fact in our education system. Students will only recall things that they are actively thinking about – reading and highlighting, for example, are both passive activities which are very unlikely to aid recall. If you need to absorb, understand and recall the information written on a page, you should put the book down and reproduce its contents in your own words in order to have any chance of being able to remember it. This process forces your brain to begin forming memories, which are in fact reconstructions: memory doesn’t work like a recording; rather, the brain constantly reconstructs its past experiences, which explains why eye-witness accounts are so unreliable and why each individual may remember the same situation very differently from other people.

All of this means – I’m afraid – that those fantasies people have about listening to recordings while they sleep and miraculously waking up knowing the information on the recording really are that – just fantasies. The brain is not a computer: you can’t do a reboot and download while it’s powered down. Much as one would like to wake up like Neo in The Matrix with a newfound perfect knowledge of and ability to perform Kung Fu, the reality is that learning new information or a new skill requires constant use, review and practice.

All of that said, it is undeniable that sleep (and – for reasons we have yet to understand – dreaming) is essential for good learning. This is not only because exhaustion is detrimental to study, it is also because that downtime really is important for the brain to be able to do its job properly, especially when we are making big demands of it. Further to this, “sleeping on a problem” can often make a huge difference, in ways that are once again not fully understood. My father, a brilliant engineer, often reported waking up with a solution to a problem he had been grappling with and failing to solve during his waking hours. Similarly, I have found that I can be completely stuck on a crossword clue but when I come back to it the next day and pick up the clue again, the solution seems blindingly obvious, even though I have given it no proactive thought in the last 24 hours. This kind of background problem-solving really is a fascinating quirk of brain-states and one I wonder whether neuroscientists will be able to explain in the future.

Many parents worry that their children are not getting enough sleep and there is certainly a lot of evidence that many young people, particularly teenagers, are sleep-deprived. The best advice remains to observe good digital hygiene: do not under any circumstances allow your child to take their devices to bed. Personally, I do have my phone beside my bed, but all notifications switch off after my bedtime (you can set emergency numbers from loved ones as exceptions to this rule, by the way), so it does not disturb me after I have gone to bed and I am not fascinated enough by it to have the urge to check it during the night. This is not true of most teenagers when it comes to their smartphones, and they need protecting from this temptation.

I have resolved to read more about dreaming and sleep-states, as I have no doubt that the research has moved on since I last dipped into this field. One of my favourite games to play is to try to trace where my dreams have come from. Why did I put Roald Dahl behind bars? Well, this week I’ve been watching a police drama with lots of scenes in cells, plus I have also read a fair bit about “cancel culture” over the last few weeks, which may have set off a chain of links in my mind to something I read about Dahl’s works being edited to remove language that is deemed not to resonate with the current zeitgeist. Is that where it all came from? Quite probably. Dreams are rarely, if ever, significant. I look forward to increasing my knowledge. Perhaps we now know whether androids dream of electric sheep.

Photo by Ihor Malytskyi on Unsplash

Post-mock post-mortem?

No matter how many years I spent at the chalkface, I remained unconvinced as to the value of dissecting children’s Mock papers in class. While there was always an urge to pore over mistakes and demonstrate to students exactly what they should have written, I never felt that the process added as much value as I would have liked. Now that I am separated from the classroom, it is perhaps easier to reflect on why that might be.

Even if students have already received their overall grades (my old school used to dish them out in envelopes to give them the “full experience” of receiving their results), the class in which students first gain sight of their papers is the one where they see how they performed in the separate papers of each exam. In most schools, they may also have just been told their overall grade by the teacher. This, to me, is the problem. When Black and Wiliam first published their seminal work on assessment for learning (a concept they now wish they had named “responsive teaching”), the authors observed that students take significantly less notice of feedback if there is a grade attached to it, rendering the process of feedback close to pointless. This should not surprise us greatly: it is a natural response to be fixated on how you performed overall rather than on the minutiae of why that result has come to pass, especially when the overall performance grade is high-stakes. It is very difficult for students to let go of their emotional response to their grade (whether it be good or bad) and concentrate on the feedback offered. This goes especially for students who are shocked and/or upset by their result, and thus calls into question the wisdom of the entire process.

It is difficult for classroom teachers to know what to do for the best. Every instinct drives any good teacher to provide detailed feedback to individual students and to the class, but to do this effectively can be close to impossible for a variety of reasons. Imagine a class in which some students have performed superbly while others have truly bombed. The inevitable emotional response from students to their performance will make the class in which feedback takes place highly charged and potentially difficult to manage. Moreover, the students who performed most poorly will probably benefit the least from the process, which leads me to conclude that there is little point in doing it at all. To not do so, on the other hand, can feel like letting those students down and failing to explain to them where they went wrong; it would take an immense amount of self-belief and confidence for a teacher to abandon the exercise altogether.

Yet let us consider the point of feedback. If students are not shown explicitly how they can improve their grade next time round, it is inherently pointless. This may well mean that the traditional “going through the paper” is close to irrelevant to those students who performed badly in it, since they will gain little to nothing from the process of being shown the correct answers. With my own tutees, I give them headline information about their performance by telling them the areas they need to focus on and/or the types of questions we need to practise. We will then practise other questions of the same type. This is much more effective than raking over the smouldering embers of their cataclysmic performance under pressure – a process which is simply too threatening and disheartening to be of value.

I am coming more and more to the conclusion that Mock exams should be there to inform the teacher what the students don’t know, affording them the opportunity to focus their teaching time on those particular areas in the remaining weeks of the academic year. Mocks are not something from which most students can successfully diagnose their own problems. The pressure on teachers to “go through” the Mocks at a granular level is huge, but really the process has limited – if any – value to students. We need to trust teachers to provide and guide the learning curve that students should go through, based on how they performed.

Photo by Joshua Hoehne on Unsplash

Self-talk

The importance of what we say to ourselves in our own heads has been highlighted to me in the last fortnight. A couple of weeks ago, I wrote about my reticence when it comes to travel, and found that – by the time I had finished my blog post – I had brought myself round to the idea of getting onto the plane. The very process of voicing my fears and then talking myself through the reasons that I was choosing to go abroad helped to turn things around for me, to reframe my perspective. This reminded me how powerful our own minds can be, what a difference we can make to ourselves when we take charge of our own self-talk.

Teenagers are particularly poor at self-talk, since their brains are still developing and they do not have the life-experience to have learned how to manage their feelings and their responses properly. Many young people who struggle with study can find themselves in a terrible negative loop of work-avoidance followed by beating themselves up for the work-avoidance, the result of which is such a negative experience that it only drives them to avoid the work even more. Many parents end up watching in horror from the sidelines as their child becomes more and more detached from their studies and less and less inclined towards motivation. I have written more than once on how tutoring can assist in breaking this awful cycle by demonstrating some easy wins to a child who has become convinced they can’t do something, thus sparking their motivation once they gain a small taste of success.

Yet negative self-talk is by no means confined to the young, indeed I am constantly reminded how prevalent it is in the adult population. Over the festive season I met with more than one friend who reminded me that many people say the most dreadfully negative things to themselves, and it worries me greatly. Believe me, I am not implying that we should all adopt some kind of ghastly instagram-meme-style positive self-talk: I have no truck with telling myself I am beautiful (demonstrably false) or brilliantly clever (I’m pretty average, like most of us). What I mean is that we would all benefit from checking the manner in which we speak to ourselves: the things that we say and the way in which we say them. A good general rule is this: if it’s something that you wouldn’t say out loud to a friend in distress, then why on earth are you saying it to yourself in your own head? Why should we expect ourselves to put up with insults and cruelties from our own internal voice that we would not tolerate from a friend or a partner? Would you tell an upset friend that she is being “stupid” or “pathetic” or that she needs to “get a grip”? No? Ok, then consider why you would not do such a thing. One reason, of course, is that it is not kind. But there’s more to it than that. Not only is it not kind, it is not helpful. We all realise that saying such things to a person in distress is the least likely path to resolution for them. Most people understand (either consciously or instinctively) that a person in distress needs space to talk and to express their feelings, affirmation and acknowledgement that those feelings are valid and understandable, followed then (and only then) by some support in getting those feelings into perspective. If you do this for your friends (as so many people do) but never think to apply those same principles to yourself inside your own head, then maybe it’s time for a re-think.

I have one friend who consistently calls herself “thick” when this is palpably untrue. She is a highly successful, well-qualified, interesting and capable woman. Yet whenever she can’t do something, is introduced to a new skill or finds something difficult, her default response is “it’s because I’m thick” or “I’m just thick, I don’t get it”. On one level, I don’t really think that she truly 100% believes her own words: when challenged, she will acknowledge that she is incontrovertibly capable in her chosen fields. Yet on another level, let’s just think about the fact that she is calling herself “thick” out loud to me and to others, and no doubt internally to herself, on a constant loop in her own mind. This simply cannot be healthy and nobody should do this to themselves. There are a million things I can’t do, have little to no natural affinity for or understanding of. I frankly never tell myself that I am “stupid” or “thick” as a result. Instead, I would say something like, “I’m not particularly good at that”. If it’s a skill I aim to acquire or at least something I wish to improve at to a basic level of competency, then I will say “I’m not particularly good at that yet – I’m working on it”. This kind of talk is advocated by those who have bought into the “growth mindset” model, something which (like most things) started quite sensibly from an evidence-based model and mutated into an epidemic of box-ticking as schools across the country attempted to apply it at an institutional level. But forget growth mindset – there is much more interesting research to support the use of appropriate self-talk.

If you haven’t read Staying Sane by Raj Persaud then I highly recommend it. The book takes a radical approach to mental health by exploring the ways in which we can all guard against the tendencies towards anxiety, depression and other common mental health conditions. Persaud explores how everyone can support themselves and build their resilience for the future. He has a whole chapter on self-talk and one on being your own shrink. He scripts how you should talk to yourself when you’re experiencing feelings of distress or overwhelm and the first time I tried it I could not quite believe the difference it made. It was genuinely extraordinary. But when you think about it, why should this be so surprising? It actually makes perfect sense. Imagine again the scenario in which a distressed friend is sobbing her heart out, saying she feels lonely and anxious. Then picture yourself telling her to shape up and stop whingeing, that her tears are embarrassing and pathetic. It’s genuinely unimaginable, isn’t it? Simply and utterly awful. Nobody would do this. Yet this is exactly what so many people say to themselves inside their own heads. In place of this kind of self-abuse (for this is what it is), Persaud advocates talking to yourself along these lines: “you’re feeling really upset, and that is perfectly understandable because X has happened and/or this situation has triggered memories of Y. Hang in there. This feeling will pass. You just need to ride out the storm.”

The first time you try it, it feels a little strange. However, I guarantee you that the impact will be so great that the strangeness will wear off immediately. Being your own friend is a far more sensible approach than giving yourself a kick up the butt every time you’re having a bad day. Since when did that particular approach work for anybody, ever? So, if you recognise yourself in any of this, maybe it’s time for a belated New Year’s resolution: stop talking unkindly to yourself. Stop insulting yourself. Stop saying things to yourself that you would not say to anybody else and start saying the things you would say to your friends when they need support. It’s the least you can do for yourself and a better path to sanity.

None of this has anything to do with smug self-satisfaction or any kind of conviction that you are anything more than an ordinary person doing your best. All Persaud advocates for is affording yourself the same kind of empathy and dignity that you would afford to others. “Do unto others as you would have others do unto you” is a mantra repeated several times in the Bible and can be found as a principle in most major world religions. It’s a great mantra. Yet quite often, especially for people who have habitually negative thought patterns, the saying really needs turning around. “Do unto yourself as you would have yourself do unto others”. Be kind to yourself. Be strong for yourself. Be understanding of yourself. Trust me, it makes life a whole lot easier.

Photo by Adi Goldstein on Unsplash

A time and a place

The appropriate use of humour has been on my mind this week, as I find myself back in the chilly UK. My week in the sunshine was definitely worth the journey, which was remarkably tolerable, certainly by comparison with other experiences I have had in the past. Nothing alarming happened on the flight, although my husband remarked that he would be keeping himself well strapped into his emergency exit seat, given recent events.

Our week in a hotel on the outskirts of Marrakesh was a new experience for me, as I have never before travelled to a country where the dominant religion is Islam. Hearing the early call to prayer was an amazing experience, as were the sights and sounds of the historic city and the souks. Most incredible of all, however, was the hot air balloon ride my husband talked me into.

I noticed the option on our hotel’s list of activities and remarked that I could certainly see the appeal but was not sure whether or not I felt able to go ahead with what seemed like such a risky activity. Standing in a basket, thousands of feet up in the air, dangling from a sack full of hot air has always seemed to me to be a somewhat insane proposition, but my husband gawped at me in disbelief. “But you’ve been up in a light aircraft with me!” he spluttered. (My husband gained his pilot’s licence many years before we met). Long story short, he enlightened me as to the fact that – statistically – light aircraft are far more dangerous than hot air balloons (a fact he didn’t pass on to me before I gave the light aircraft a go). My husband reads air accident reports as a hobby (everybody needs one), and explained that balloon accidents tend to amount to no more than a bumpy landing, leaving someone with a broken wrist or collar bone – they don’t tend to result in fatalities. So, armed with my husband’s superior knowledge of all things air crash-related, I agreed. We booked ourselves onto the flight.

The flight was at dawn, which meant we saw the sun rise over the Atlas mountains, a simply incredible sight. The flight itself was absolutely wonderful, with no sense of motion apparent – as you move with the wind, you can’t feel the wind as you move, making the process remarkably tranquil. The silence is also striking, when you’re used to the engine noise of any other means of flight. Not only did I enjoy the experience, I would do it again in a heartbeat. As it turned out, I was not in the least bit afraid once we got there, and the French pilot dispelled any last-minute nerves with a tension-breaking bit of humour. Once we were a few feet off the ground, he turned to us and said, “First time in a balloon?” We nodded vigorously. “Me too!” he said, as he gave the burners a blast.

This kind of humour is right up my street and is without question the best way to win me over in pretty much any situation. The last time I thought about this in any depth was when I first went to a local osteopath. I have always been nervous of osteopathy, as I have scoliosis of the spine and my vertebrae don’t really behave like everybody else’s. As a result, I have awful visions of someone trying to crack my spine in a way it just won’t work and somehow breaking it, leaving me paralysed or worse. I always arrive in any clinic with a list of don’ts and caveats as long as my arm, and most osteopaths nod sagely and do exactly as they’re told.

Ian, however, is different.

“Look,” I said to him, in our first appointment. “You need to understand that my spine is quite rigid in places and won’t bend in the way you might expect. I’m most anxious not to get injured so it’s really important that you don’t do anything beyond what I’m comfortable with.”

“No problem,” said Ian. “But what you need to understand is that if I break your neck …”

I started to babble. “Oh gosh, no, I totally realise that your career is in the balance and that as a professional you will take the ultimate care. I wasn’t suggesting that you would be anything other than hyper-cautious, I do realise that, it’s just I’m …”

“No no” he interrupted. “If I break your neck, then I’m left with a body to dispose of. And it’s not as easy as you might think. Especially if I’ve got a lot of appointments.”

I stared at him for a moment, then reacted in the only way appropriate. I laughed my head off. What an absolute legend. While this kind of humour might not be for everyone, it absolutely works for me in moments of tension. When I was 16, my orthodontist reflected on our 12-year journey of hideous braces and major surgery. My teeth were not perfectly straight, but they were roughly in line and infinitely better than when we started. What he wanted to do was to reflect on our excellent progress and a job well done. What he actually said was, “you can’t make a silk purse out of a sow’s ear”. Just as well for him that I found it hilarious.

Humour is my go-to response in most tense situations and has helped me to deal with innumerable challenges in my life. I am not alone in this. I know one couple who have visited North Korea as tourists (it is possible, believe it or not) and recall one of them saying that the main problem she had was not laughing in moments when ultimate seriousness was demanded – when, for example, witnessing the 24-hour wailing that goes on in the room where the bodies of deceased illustrious leaders lie in state. The performative grief was so ludicrous that she was completely gripped by the urge to laugh, especially since they had just done the tour of the government building which included a map of the world without the USA on it, plus an Apple MacBook Pro sat on the desk underneath it. I totally understand this urge towards inappropriate laughter. I am the sort of person who has to be careful not to laugh at funerals – that feeling of tense, wild hysteria often overtakes me at the most inappropriate of moments.

There’s a time and a place for everything, but some of us find release in the use of humour at what might seem like the most inappropriate of times. People in particularly stressful jobs probably best understand this kind of gallows humour and to some extent I think it’s cultural too. Wherever I am in the world, nothing makes me feel more at home than someone poking fun at what would otherwise be a tense or serious situation.

Photograph taken by my husband during our balloon flight