First, do no harm

primum non nocere: first, do no harm.

A central tenet of the Hippocratic oath

As Tom Bennett OBE wrote on the platform formerly known as Twitter this week, “Even qualified practitioners are bound to ‘do no harm’. But the desire to support children leads many schools to well-meant but potentially damaging mental health ‘interventions’.”

This week I have listened to a quite horrifying piece of investigative journalism by the Financial Times into Goenka mindfulness retreats, at which attendees are encouraged to practise an extreme kind of meditation known as Vipassana. People on the retreat are not allowed to speak and are strongly discouraged from leaving for 10 days. They are awakened at 4.00am, deprived of food and taught to meditate for multiple hours per day. Anyone who struggles with the process or becomes confused or distressed is encouraged to keep meditating. For those of you with even the most basic grasp of mental health and wellbeing, it will not come as a massive shock to discover that some people are affected very negatively by this process. I recommend you listen to the podcast but please be aware that it does not shy away from some very difficult material: there are people who have lost their loved ones to this process.

Human beings are social animals. We have evolved to live in groups and we know that extreme social isolation and withdrawal have a very negative effect on mental health and wellbeing in an extremely short time. The dangerous impact of solitary confinement is well-documented and has caused neuroscientists to campaign against its prolonged use in the penal system. Even good old-fashioned and ever-familiar loneliness has been proved to have a significant impact on a person’s health and longevity, never mind their psychological well-being. It should not surprise us in the least to discover that a process which demands people shut themselves off from each other and concentrate entirely and exclusively on what’s inside their own heads carries the risk of a psychotic break.

As part of my studies during my degree in Classics I did a course on the rise of Christianity in the Roman world. I recall reading an account of the life of St Antony by the Bishop Athanasius and being particularly struck by a passage that reports upon his demeanour when leaving a fortress in which he had shut himself for 20 years in order to commune with God and battle his demons. It reads as follows:

“Antony, as from a shrine, came forth initiated in the mysteries and filled with the spirit of God. Then for the first time he was seen outside the fort by those who came to see him. And they, when they saw him, wondered at the sight, for he had the same habit of body as before … but his soul was free from blemish, for it was neither contracted as if by grief, nor relaxed by pleasure, nor possessed by laughter or dejection. For he was not troubled when he beheld the crowd, nor overjoyed at being saluted by so many.”

While I do not wish to mock or offend anyone’s deeply-held beliefs, it seems pretty clear to me that this is a description of someone who has completely detached from other human beings and is suffering from the psychological effects of that process. While the religiously-minded among you may see this as an account of someone in touch with the holy spirit, I see it as an account of someone who is suffering from a psychotic break. Antony is described as being unmoved by and disconnected from the people around him, in possession of a strange kind of detachment. Given that he had spent 20 years in isolation while – in his mind – battling between good and evil, this is not greatly surprising.

During my final few years in mainstream education there was a big push on “mindfulness” for all students. This was what Tom Bennett was referring to in the Tweet I quoted at the start of this blog and I share his concerns about this growing trend. The mental health of young people is a painful and emotive issue and has been brought into sharp relief once again with calls from a grieving mother asking for mindfulness to be rolled out across all state schools (although it is already being promoted and practised in many). As Daniel Bundred wrote on the same platform as Tom a few months ago, “Schools probably shouldn’t do mindfulness, because most teachers are fundamentally unqualified to lead mindfulness, and entirely unequipped to deal with the potential outcomes of it.” As he puts it, “Mindfulness strikes me as being very similar to guided meditation in approach and potentially outcome; how many teachers could handle a student experiencing ego-death in their classroom? Ego-death is a potential outcome of successful meditation, it’s not desirable in tutor time.” Daniel here is referencing exactly the kind of experiences suffered by the young people who underwent a psychotic break at the Goenka retreats. This is of course the worst-case scenario and while not widespread it is crucially important to consider if we are to stick to the concept of “do no harm”; the advocates of the Goenka retreat point to the many people who say that meditation has helped them, as if the handful of attributable deaths are therefore irrelevant. It is essential to remember that teachers (like the volunteers at the Goenka retreats) are not mental health experts; fiddling about with something as potentially profound and intimate as mindfulness or meditation is profoundly dangerous and goes way beyond the remit of educators.

Beyond the enormous risk of potential harm to a student who may have experienced past trauma or may simply not be an appropriate candidate for mindfulness for a variety of reasons, there is an increasing amount of evidence indicating that mindfulness in schools does no good for anybody. A recent study revealed no tangible positive outcomes, which places the profound risk of harm to some in an even more alarming context. Why are we doing something with risks attached to it when there are no estimable benefits anyway? Beyond this, why are we demanding that teachers expend their time and energy on something unproven and valueless?

Tom Bennett is right. As he puts it: “The best way to support children’s mental health in a school environment? Provide a culture that is safe, calm and dignified. With purposeful activities.” In our desperation to support the most vulnerable of children, we must never forget the simple power of providing routine, stability and boundaries for those whose personal and emotional lives may well (for all we know) be dominated by chaos, trauma and distress. The more we acknowledge that some children face the most horrifying of circumstances, the more essential the security of our education system becomes. School and the reassurance that its stability provides is a lifeline for many of our children. This is what we should be providing for them.

Photo by Colton Sturgeon on Unsplash

False judgements

Emotions got a bad rap from ancient philosophers. Most agreed that the ideal state was a kind of calmness that the Hellenistic philosophers (most famously the Epicureans and the Stoics) called ataraxia. There was even talk of apatheia – a detachment from the chaos of feelings and overwhelm. This is perhaps unsurprising if you understand the birth of western philosophy; if you’re trying to formulate, define and distil the key to the perfect life and the perfect society (which is what the early founders of western philosophy were trying to do) then it probably doesn’t include your citizens experiencing a rollercoaster of emotions. Once you’ve admitted that emotions are a bit of a distraction and often cause issues both on a personal level and for society, it’s not much of an overreach to find yourself arguing for a state of detachment.

The term “stoic” these days is synonymous with having a “stiff upper lip” but this is based on a crucial misunderstanding of the Stoic position. The Stoics did not advocate for iron-clad self-control or suppressing your feelings. Rather, they believed that all emotions were what they called “false judgements”, which meant that they were based on a misunderstanding: if you’re feeling them, you’re still getting it wrong. In the ideal philosophical life that they strove for, a person would have such a great understanding of himself, the world and his place within it that he would not suffer the slings and arrows of outrageous fortune: he would simply nod and know the right thing to do. One example given is that a Stoic would run into a burning building in order to attempt to save a child because that is the right thing to do; they also argued, however, that a true Stoic would feel no distress when his mission failed. Weird, isn’t it? Interesting, though.

One of the frustrating things about this period of philosophy is that much of the writing that we have consists of general “sayings”, snippets or purported quotations which appear in the works of later authors, usually writing in Latin rather than in Greek, and reporting on what a particular thinker or school of thinkers believed. The reality of this of course is that they may be wrong. For example, there is a famous quotation attributed to Epicurus that states “the wise man is happy on the rack”. Quite how this works within a school of philosophy that was dedicated to the avoidance of pain is puzzling. If the quotation is correct, our best guess is that the Epicureans certainly spent a lot of their time considering the correct attitude towards unavoidable pain, for this was one of the biggest challenges to their philosophical position; presumably the “wise man” – someone at the pinnacle of philosophical endeavour – would know how to cope with pain in extremis.

Most people see Epicureanism and Stoicism as polar opposites and they were indeed rival schools of philosophy at the time. As so often, however, there was more that united them than divided them. Both schools were arguing and aiming for the perfect life and the state of detachment that philosophers before them had explored; both schools were concerned with how to manage our responses to pain and distress. Perhaps the biggest difference is that the Stoics believed in proactive, conscious and deliberate involvement in society and its structures, whereas the Epicureans were a bit more lethargic about the whole idea – getting involved with politics is painful and distressing, so is it really rational to bother?

One philosopher, writing before the Stoics and the Epicureans, was unusual in his take on emotions. Aristotle argued that emotions were appropriate and necessary: the trick was understanding when and how you should be feeling them and what to do with them. He spoke of “righteous anger” and argued that a good philosopher would indeed feel such a thing. It is difficult to explain how truly radical this position was, when the way the philosophical movement was drifting was towards ataraxia and apatheia. Aristotle also smashed through the Socratic idea that philosophical ideals such as “courage” and “justice” could be defined in one way and that if one could not do so then one lacked an understanding of them. Aristotle argued that there were multiple forms of “courage” and “justice” and that nobody could define them in one simple way nor apply their principles in individual cases without discussion, debate and compromise. What a genius he was.

Why the hell am I writing about this? Well, I spoke to a friend yesterday who has taken a decision about which she feels guilty. I cannot divulge the details of this decision as I do not want to betray her confidence. Suffice to say that it was a professional decision, the right decision and one which the people affected will hopefully benefit from in the long-run. There is no doubt – in my mind and even in hers – that the decision was right and good. Yet she still feels what she describes as “guilty” about it.

This reminded me yet again of The Greeks and the Irrational by ER Dodds, a book written in the 1950s, which I mentioned in another blog a few weeks ago. One of the chapters in the book argues that the Athenian world was a “shame culture” and that later ancient societies – the Hellenistic and Roman worlds – began the shift towards a “guilt culture”. I have thought about this on and off all of my life. The very thought that the nature of one’s emotions can be dictated by the society in which one grows up is fascinating to me. Dodds argues (rightly, I think) that modern society is more person-centric and hence feelings such as guilt can be internalised; in Athens, one’s personal standing and engagement with society was more relevant (a symptom perhaps of living in a small and emergent city-state) and therefore a sense of shame before others was more powerful than any kind of internalised guilt.

As I listened to my friend who left me some WhatsApp voice messages (I love them – it’s like receiving a personalised podcast!) I found myself wondering whether the Stoics had it right. Sometimes emotions truly are false judgements. My friend has no reason to feel guilty about her actions and she should strive to release herself from the false state of mind in which this feeling distresses her. According to the Stoic ideal she has prevailed in her actions but has not yet achieved the ideal state of detachment. So how should she achieve this goal? Well, I guess it depends on your approach to these things. A Stoic would advocate for rigorous rational analysis and say that this will eventually lead to release from one’s feelings. This is not, in fact, a million miles away from cognitive behavioural therapy, the therapy model supported by psychiatrists and many psychologists, who would say that she needs to question why she feels guilty and challenge her reasons for doing so. A psychologist with leanings towards the psychodynamic model would argue that she needs to explore where her feelings might stem from – does the situation remind her of experiences in her past, during which she has been made to feel or to carry guilt that perhaps should not have been hers? (Pretty sure the Stoics wouldn’t have been up for that one).

Whatever the answer in this particular circumstance, personally I find myself returning to the Stoics time and again. They were a fascinating turning point in philosophical history and paved the way – I believe – towards modern psychiatry. After all, what is the difference between sanity and insanity if not the difference between the rational and the irrational, the true and the untrue, the controlled and the uncontrolled? I will leave you with the Stoic image of how the individual should relate to society – not because I advocate for it, necessarily, but because it’s a classic and a model I have never stopped thinking about since I first learned about it in the 1990s. The Stoics believed that individuals could not control fate but they also argued that individuals had free will. So an individual person is like a dog tied to the back of a wagon. Whatever the dog’s actions, the wagon will go on its way. So how does the dog have free will? Well, he can resist the wagon and be dragged along, impeding the wagon’s progress and damaging himself along the way. Alternatively, he can trot along like a good dog and help the wagon to proceed smoothly.

This incredible photo is by Jaseel T on Unsplash.
It was taken in the Museum of the Future in Dubai

Perchance to dream?

Last night I dreamt that Roald Dahl was in prison. Not exactly “I went to Manderley again” as an opening line, but it’s the truth.

Despite centuries of interest in the subject and recent studies with all the benefits of modern science, dreams are still not fully understood. They are generally acknowledged to be a by-product of evolution and quite possibly the brain’s way of processing and sorting information, but exactly how and why they occur is still debated. Some neuroscientists and psychologists argue that they help us to organise our memories, others suggest that they are part of the important process of forgetting or “dumping” unnecessary clutter from our minds. Some believe that they are a way of safely practising difficult scenarios, and some have even claimed that the frequency of dreams in which we are being chased – particularly in childhood – is evidence for their origins in our early evolutionary history. I’m not sure I buy that, not least because it falls into the trap of believing that everything that evolves does so for an obvious purpose. Dreams may simply be a by-product of our extraordinarily large and complex brain-structures: they may not necessarily be essential or advantageous in the battle of survival and reproduction. One thing’s for sure, it is frequently difficult to explain how a particular story ends up being told in one’s mind overnight; last night, my brain placed a long-dead children’s author behind bars.

Dreams mainly occur while we are in REM sleep, which for adult humans makes up only around two hours per night of our sleep time. Yet some research indicates that a human foetus in utero, by the time it reaches the third trimester, spends around 20 hours out of each 24-hour cycle in REM sleep. Is the foetus dreaming for all of that time? If so, what on earth is it dreaming about and how does that relate to the commonly-accepted idea that dreams are remnants of our thoughts?

When I was doing my PhD I spent an inordinate amount of time going down rabbit holes of research into this kind of thing. The ancient work I studied (which I have written about in a little more detail before) mentions in passing that messages from the gods come to us in the hazy state between sleeping and waking, a state now defined as “hypnogogic” and one into which there has been a considerable amount of research. I became fascinated by the idea of different brain-states and how people may experience phenomena such as audible hallucinations and thus become convinced that they are receiving messages from a divine source. I read all sorts of stuff written by anthropologists, neurologists and psychologists and realised just how little I knew about the grey matter inside my own skull.

When it comes to studying, one of the things worth knowing about the brain is that “memory is the residue of thought” meaning that “the more you think about something, the more likely it is that you’ll remember it later.” (Daniel T. Willingham). This might seem obvious but you wouldn’t believe how little consideration is given to this fact in our education system. Students will only recall things that they are actively thinking about – reading and highlighting, for example, are both passive activities which are very unlikely to aid recall. If you need to absorb, understand and recall the information written on a page, you should put the book down and reproduce its contents in your own words in order to have any chance of being able to remember it. This process forces your brain to begin forming memories, which are in fact reconstructions: memory doesn’t work like a recording, it is rather the brain constantly reconstructing its past experiences, which explains why eye-witness accounts are so unreliable and why each individual may remember the same situation very differently from other people.

All of this means – I’m afraid – that those fantasies people have about listening to recordings while they sleep and miraculously waking up knowing the information on the recording really are that – just fantasies. The brain is not a computer: you can’t do a reboot and download while it’s powered down. Much as one would like to wake up like Neo in The Matrix with a newfound perfect knowledge of and ability to perform Kung Fu, the reality is that learning new information or a new skill requires constant use, review and practice.

All of that said, it is undeniable that sleep (and – for reasons we have yet to understand – dreaming) is essential for good learning. This is not only because exhaustion is detrimental to study, it is also because that downtime really is important for the brain to be able to do its job properly, especially when we are making big demands of it. Further to this, “sleeping on a problem” can often make a huge difference, in ways that are once again not fully understood. My father, a brilliant engineer, often reported waking up with a solution to a problem he had been grappling with and failing to solve during his waking hours. Similarly, I have found that I can be completely stuck on a crossword clue but when I come back to it the next day and pick up the clue again, the solution seems blindingly obvious, even though I have given it no proactive thought in the last 24 hours. This kind of background problem-solving really is a fascinating quirk of brain-states and one I wonder whether neuroscientists will be able to explain in the future.

Many parents worry that their children are not getting enough sleep and there is certainly a lot of evidence that many young people, particularly teenagers, are sleep-deprived. The best advice remains to observe good digital hygiene: do not under any circumstances allow your child to take their devices to bed. Personally, I do have my phone beside my bed but all notifications switch off after my bedtime (you can set emergency numbers from loved ones as exceptions to this rule, by the way) so it does not disturb me after I have gone to bed and I am not fascinated enough by it to have the urge to check it during the night. This is not true of most teenagers when it comes to their smartphones, and they need protecting from this temptation.

I have resolved to read more about dreaming and sleep-states, as I have no doubt that the research has moved on since I last dipped into this field. One of my favourite games to play is to try to trace where my dreams have come from. Why did I put Roald Dahl behind bars? Well, this week I’ve been watching a police drama with lots of scenes in cells, plus I have also read a fair bit about “cancel culture” over the last few weeks, which may have set off a chain of links in my mind to something I read about Dahl’s works being edited to remove language that is deemed not to resonate with the current zeitgeist. Is that where it all came from? Quite probably. Dreams are rarely, if ever, significant. I look forward to increasing my knowledge. Perhaps we now know whether androids dream of electric sheep.

Photo by Ihor Malytskyi on Unsplash

Post-mock post-mortem?

No matter how many years I spent at the chalkface, I remained unconvinced as to the value of dissecting children’s Mock papers in class. While there was always an urge to pore over mistakes and demonstrate to students exactly what they should have written, I never felt that the process added as much value as I would have liked. Now that I am separated from the classroom, it is perhaps easier to reflect on why that might be.

Even if students have already received their overall grades (my old school used to dish them out in envelopes to give them the “full experience” of receiving their results), the class in which students first gain sight of their papers is the one where they see how they performed in the separate papers of each exam. In most schools, they may also have just been told their overall grade by the teacher. This, to me, is the problem. When Black and Wiliam first published their seminal work on assessment for learning (a concept they now wish they had named “responsive teaching”), they observed that students take significantly less notice of feedback if there is a grade attached to it, rendering the process of feedback close to pointless. This should not surprise us greatly: it is a natural response to be fixated on how you performed overall rather than the minutiae of why that result has come to pass, especially when the overall performance grade is high-stakes. It is very difficult for students to let go of their emotional response to their grade (whether it be good or bad) and concentrate on the feedback offered. This goes especially for students who are shocked and/or upset by their result, and thus calls into question the wisdom of the entire process.

It is difficult for classroom teachers to know what to do for the best. Every instinct drives any good teacher to provide detailed feedback to individual students and to the class, but to do this effectively can be close to impossible for a variety of reasons. Imagine a class in which some students have performed superbly, while others have truly bombed. The inevitable emotional response from students to their performance will make the class in which feedback takes place highly charged and potentially difficult to manage. Moreover, the students who performed most poorly will probably benefit the least from the process, which leads me to conclude that there is little point in doing it at all. To not do so, on the other hand, can feel like letting those students down and failing to explain to them where they went wrong. It would take an immense amount of self-belief and confidence for a teacher to abandon the process altogether.

Yet let us consider the point of feedback. If students are not shown explicitly how they can improve their grade next time round, it is inherently pointless. This may well mean that the traditional “going through the paper” is close to irrelevant to those students who performed badly in it, since they will gain little to nothing from the process of being shown the correct answers. With my own tutees I give headline information about their performance by telling them the areas they need to focus on and/or the types of questions we need to practise. We will then practise other questions of the same type. This is much more effective than raking over the smouldering embers of their cataclysmic performance under pressure – a process which is simply too threatening and disheartening to be of value.

I am coming more and more to the conclusion that Mock exams should be there to inform the teacher what the students don’t know, affording them the opportunity to focus their teaching time on those particular areas in the remaining weeks of the academic year. Mocks are not something which most students can successfully analyse, nor can students diagnose their own problems from them. The pressure on teachers to “go through” the Mocks at a granular level is huge, but really the process has limited – if any – value to students. We need to trust teachers to provide and guide the learning curve that students should go through, based on how they performed.

Photo by Joshua Hoehne on Unsplash


The importance of what we say to ourselves in our own heads has been highlighted to me in the last fortnight. A couple of weeks ago, I wrote about my reticence when it comes to travel, and found that – by the time I had finished my blog post – I had brought myself round to the idea of getting onto the plane. The very process of voicing my fears and then talking myself through the reasons that I was choosing to go abroad helped to turn things around for me, to reframe my perspective. This reminded me how powerful our own minds can be, what a difference we can make to ourselves when we take charge of our own self-talk.

Teenagers are particularly poor at self-talk, since their brains are still developing and they do not have the life-experience to have learned how to manage their feelings and their responses properly. Many young people who struggle with study can find themselves in a terrible negative loop of work-avoidance followed by beating themselves up for the work-avoidance, the result of which is such a negative experience that it only drives them to avoid the work even more. Many parents end up watching in horror from the sidelines as their child becomes more and more detached from their studies and less and less inclined towards motivation. I have written more than once on how tutoring can assist in breaking this awful cycle by demonstrating some easy wins to a child who has become convinced they can’t do something, thus sparking their motivation once they gain a small taste of success.

Yet negative self-talk is by no means confined to the young, indeed I am constantly reminded how prevalent it is in the adult population. Over the festive season I met with more than one friend who reminded me that many people say the most dreadfully negative things to themselves, and it worries me greatly. Believe me, I am not implying that we should all adopt some kind of ghastly instagram-meme-style positive self-talk: I have no truck with telling myself I am beautiful (demonstrably false) or brilliantly clever (I’m pretty average, like most of us). What I mean is that we would all benefit from checking the manner in which we speak to ourselves: the things that we say and the way in which we say them. A good general rule is this: if it’s something that you wouldn’t say out loud to a friend in distress, then why on earth are you saying it to yourself in your own head? Why should we expect ourselves to put up with insults and cruelties from our own internal voice that we would not tolerate from a friend or a partner? Would you tell an upset friend that she is being “stupid” or “pathetic” or that she needs to “get a grip”? No? Ok, then consider why you would not do such a thing. One reason, of course, is that it is not kind. But there’s more to it than that. Not only is it not kind, it is not helpful. We all realise that saying such things to a person in distress is the least likely path to resolution for them. Most people understand (either consciously or instinctively) that a person in distress needs space to talk and to express their feelings, affirmation and acknowledgement that those feelings are valid and understandable, followed then (and only then) by some support in getting those feelings into perspective. If you do this for your friends (as so many people do) but never think to apply those same principles to yourself inside your own head, then maybe it’s time for a re-think.

I have one friend who consistently calls herself “thick” when this is palpably untrue. She is a highly successful, well-qualified, interesting and capable woman. Yet whenever she can’t do something, is introduced to a new skill or finds something difficult, her default response is “it’s because I’m thick” or “I’m just thick, I don’t get it”. On one level, I don’t really think that she truly 100% believes her own words: when challenged, she will acknowledge that she is incontrovertibly capable in her chosen fields. Yet on another level, let’s just think about the fact that she is calling herself “thick” out loud to me and to others and no doubt internally to herself, on a constant loop in her own mind. This simply cannot be healthy and nobody should do this to themselves. There are a million things I can’t do, have little to no natural affinity for or understanding of. I frankly never tell myself that I am “stupid” or “thick” as a result. Instead, I would say something like, “I’m not particularly good at that”. If it’s a skill I aim to acquire or at least something I wish to improve at to a basic level of competency, then I will say “I’m not particularly good at that yet – I’m working on it”. Some of this kind of talk is advocated by those who have bought into the “growth mindset” model, something which (like most things) started quite sensibly from an evidence-based model and mutated into an epidemic of box-ticking as schools across the country attempted to apply it at an institutional level. But forget growth mindset – there is much more interesting research to support the use of appropriate self-talk.

If you haven’t read Staying Sane by Raj Persaud then I highly recommend it. The book takes a radical approach to mental health by exploring the ways in which we can all guard against the tendencies towards anxiety, depression and other common mental health conditions. Persaud explores how everyone can support themselves and build their resilience for the future. He has a whole chapter on self-talk and one on being your own shrink. He scripts how you should talk to yourself when you’re experiencing feelings of distress or overwhelm and the first time I tried it I could not quite believe the difference it made. It was genuinely extraordinary. But when you think about it, why should this be so surprising? It actually makes perfect sense. Imagine again the scenario in which a distressed friend is sobbing her heart out, saying she feels lonely and anxious. Then picture yourself telling her to shape up and stop whingeing, that her tears are embarrassing and pathetic. It’s genuinely unimaginable, isn’t it? Simply and utterly awful. Nobody would do this. Yet this is what so many people actually do say to themselves in their own heads. In place of this kind of self-abuse (for this is what it is), Persaud advocates talking to yourself along these lines: “you’re feeling really upset, and that is perfectly understandable because X has happened and/or this situation has triggered memories of Y. Hang in there. This feeling will pass. You just need to ride out the storm.”

The first time you try it, it feels a little strange. However, I guarantee you that the impact will be so great that the strangeness will wear off immediately. Being your own friend is a far more sensible approach than giving yourself a kick up the butt every time you’re having a bad day. Since when did that particular approach work for anybody, ever? So, if you recognise yourself in any of this, maybe it’s time for a belated New Year’s resolution: stop talking unkindly to yourself. Stop insulting yourself. Stop saying things to yourself that you would not say to anybody else and start saying the things you would say to your friends when they need support. It’s the least you can do for yourself and a better path to sanity.

None of this has anything to do with smug self-satisfaction or any kind of conviction that you are anything more than an ordinary person doing your best. All Persaud advocates for is affording yourself the same kind of empathy and dignity that you would afford to others. “Do unto others as you would have others do unto you” is a mantra repeated several times in the Bible and can be found as a principle in most major world religions. It’s a great mantra. Yet quite often, especially for people who have habitually negative thought patterns, the saying really needs turning around. “Do unto yourself as you would have yourself do unto others”. Be kind to yourself. Be strong for yourself. Be understanding of yourself. Trust me, it makes life a whole lot easier.

Photo by Adi Goldstein on Unsplash

A time and a place

The appropriate use of humour has been on my mind this week, as I find myself back in the chilly UK. My week in the sunshine was definitely worth the journey, which was remarkably tolerable, certainly by comparison with other experiences I have had in the past. Nothing alarming happened on the flight, although my husband remarked that he would be keeping himself well strapped into his emergency exit seat, given recent events.

Our week in a hotel on the outskirts of Marrakesh was a new experience for me, as I have never before travelled to a country where the dominant religion is Islam. Hearing the early call to prayer was an amazing experience, as were the sights and sounds of the historic city and the souks. Most incredible of all, however, was the hot air balloon ride my husband talked me into.

I noticed the option on our hotel’s list of activities and remarked that I could certainly see the appeal but was not sure whether or not I felt able to go ahead with what seemed like such a risky activity. Standing in a basket, thousands of feet up in the air, dangling from a sack full of hot air has always seemed to me to be a somewhat insane proposition, but my husband gawped at me in disbelief. “But you’ve been up in a light aircraft with me!” he spluttered. (My husband gained his pilot’s licence many years before we met.) Long story short, he enlightened me as to the fact that – statistically – light aircraft are far more dangerous than hot air balloons (a fact he didn’t pass on to me before I gave the light aircraft a go). My husband reads air accident reports as a hobby (everybody needs one), and explained that balloon accidents tend to amount to no more than a bumpy landing, leaving someone with a broken wrist or collar bone – they don’t tend to result in fatalities. So, armed with my husband’s superior knowledge of all things air crash-related, I agreed. We booked ourselves onto the flight.

The flight was at dawn, which meant we saw the sun rise over the Atlas mountains, a simply incredible sight. The flight itself was absolutely wonderful, with no sense of motion apparent – as you move with the wind, you can’t feel the wind as you move, making the process remarkably tranquil. The silence is also striking, when you’re used to the engine noise of any other means of flight. Not only did I enjoy the experience, I would do it again in a heartbeat. As it turned out, I was not in the least bit afraid once we got there, and the French pilot dispelled any last-minute nerves with a tension-breaking bit of humour. Once we were a few feet off the ground, he turned to us and said, “First time in a balloon?” We nodded vigorously. “Me too!” he said, as he gave the burners a blast.

This kind of humour is right up my street and is without question the best way to win me over in pretty much any situation. The last time I thought about this in any depth was when I first went to a local osteopath. I have always been nervous of osteopathy, as I have scoliosis of the spine and my vertebrae don’t really behave like everybody else’s. As a result, I have awful visions of someone trying to crack my spine in a way it just won’t work and somehow breaking it, leaving me paralysed or worse. I always arrive in any clinic with a list of don’ts and caveats as long as my arm, and most osteopaths nod sagely and do exactly as they’re told.

Ian, however, is different.

“Look,” I said to him, in our first appointment. “You need to understand that my spine is quite rigid in places and won’t bend in the way you might expect. I’m most anxious not to get injured so it’s really important that you don’t do anything beyond what I’m comfortable with.”

“No problem,” said Ian. “But what you need to understand is that if I break your neck …”

I started to babble. “Oh gosh, no, I totally realise that your career is in the balance and that as a professional you will take the ultimate care. I wasn’t suggesting that you would be anything other than hyper-cautious, I do realise that, it’s just I’m …”

“No, no,” he interrupted. “If I break your neck, then I’m left with a body to dispose of. And it’s not as easy as you might think. Especially if I’ve got a lot of appointments.”

I stared at him for a moment, then reacted in the only way appropriate. I laughed my head off. What an absolute legend. While this kind of humour might not be for everyone, it absolutely works for me in moments of tension. When I was 16, my orthodontist reflected on our 12-year journey of hideous braces and major surgery. My teeth were not perfectly straight, but they were roughly in line and infinitely better than when we started. What he wanted to do was to reflect on our excellent progress and a job well done. What he actually said was, “you can’t make a silk purse out of a sow’s ear”. Just as well for him that I found it hilarious.

Humour is my go-to response in most tense situations and has helped me to deal with innumerable challenges in my life. I am not alone in this. I know one couple who have visited North Korea as tourists (it is possible, believe it or not) and recall one of them saying that the main problem she had was not laughing in moments when ultimate seriousness was demanded – when, for example, witnessing the 24-hour wailing that goes on in the room where the bodies of deceased illustrious leaders lie in state. The performative grief was so ludicrous that she was completely gripped by the urge to laugh, especially since they had just done the tour of the government building which included a map of the world without the USA on it, plus an Apple MacBook Pro sat on the desk underneath it. I totally understand this urge towards inappropriate laughter. I am the sort of person who has to be careful not to laugh at funerals – that feeling of tense, wild hysteria often overtakes me at the most inappropriate of moments.

There’s a time and a place for everything, but some of us find release in the use of humour at what might seem like the most inappropriate of times. People in particularly stressful jobs probably best understand this kind of gallows humour and to some extent I think it’s cultural too. Wherever I am in the world, nothing makes me feel more at home than someone poking fun at what would otherwise be a tense or serious situation.

Photograph taken by my husband during our balloon flight

Are we there yet?

caelum non animum mutant qui trans mare currunt.

Those who race across the sea change their horizon, not their mind.


On the day this post is published I shall be in Morocco, hopefully in the sunshine. As I write, here in the UK, the sky is dark and rain is hammering at the windows, the miserable weather a perfect encapsulation of the reasons why my husband and I are choosing to travel abroad at this time of year. Yet, as the day of our departure approaches, I find a small portion of myself feeling like I don’t want to go.

This always happens to me. I am not a great traveller; indeed, my feelings around the process of travel would be classed by many as a phobia or – at the very least – a strong, visceral aversion. Were I not married to someone who wishes to travel abroad then I suspect that I would have found an excuse never to do so by now. The enormous pressure of running school trips abroad is something I have written about before, and it made up a small but significant part of what contributed to my decision to draw my teaching career to a close. Covid hasn’t helped me either, as I must confess I rather enjoyed having all pressure to travel removed from my shoulders, and it’s been quite a personal challenge to get myself back into the swing of things now that restrictions have been lifted. I won’t bore you with the details as it would mean far too much over-sharing, but suffice to say I find travelling very challenging and will find every excuse under the sun to do less of it. I don’t like leaving the house, my friends, my family, the cats. You name it, I’ll use it as a reason not to go.

Believe me, I am deeply aware that these are First World Problems of the most unsympathetic kind and demand no commiserations whatsoever. I am not moaning. I have no reason to. Nobody forces me to travel and there is a significant part of me that wishes to do so. Doing things outside one’s comfort zone is not only good for the soul, it is one of the many compromises that marriage demands of us – when you have a partner, you cannot simply do exactly what you want to do every minute of every day; you have to consider beloved’s needs and desires also. A bit of travel is part of the deal.

I mentioned my reticence about travelling to a friend the other day and she remarked that she would probably not travel abroad on a regular basis were it not for her partner’s desire to visit exotic places. She works in the business world and a good deal of travelling to multiple continents has been expected of her as a part of her career; this took much of the glamour out of the notion of travel, and has left her feeling somewhat unenamoured with its attractions. In our conversation, she pondered how many of us there might be who also feel this way, people who holiday abroad more because they think they should rather than because they truly want to. I have actually met an extraordinary number of people in my life who will guiltily admit to feeling somewhat ambivalent about travel, probably more than I have met who love it (although I’ve met plenty of those people also). Many people understand the anxieties that travel can cause and will admit that deep down they sometimes wonder whether the whole business is really worth it. So why do we do it?

I have never been convinced of the idea that travel broadens the mind, hence the line from Horace quoted at the top of this piece has always been a favourite for me. In my lifetime I have met some extraordinarily ignorant people who were well-travelled. I shall never forget an older man saying to me “I’ve smelt Calcutta” as an argument-clincher, proving without question his unshakeable belief that the English have done nothing but good for India over the years. Quite extraordinary. Likewise, my husband’s parents did far more travelling in their lives than I ever plan to do, yet my mother-in-law parroted the line “there’s no poverty in China” when telling me about their holiday there. To her credit, she did manage to grasp my point that maybe, just maybe, she had seen what the government-selected guide had wanted her to see and nothing more.

So it seems that visiting other countries does not necessarily educate or broaden the mind – we respond to travel as ourselves, see the world through our own tinted glasses, whether they be rose-coloured or otherwise. I like to think of myself as a reasonably broad-minded and liberal person and I don’t believe that any of this stems from the fact that I have travelled abroad on multiple occasions. My maternal grandmother was a pretty open-minded woman for any generation, never mind for someone who was born at the very beginning of the 20th century, and to my recollection she’d managed one trip to Malta in her lifetime – not exactly a challenging experience, culturally.

But let us not forget how lucky we are, how amazing the modern world is. Should we choose to make it so, the world is our oyster and this can be nothing but good. We take it for granted that we can find ourselves in another continent, another climate and another time zone in less than the time it would take us to drive from London to Glasgow. Travel abroad has become more and more affordable over the last few decades and is an expectation shared by far more people than our grandparents’ generation could have conceived of. When I was a very young student I lodged with a couple who had met during the 1960s, working as cabin crew for BOAC. They used to talk about how the fact that they were visiting different countries all over the world became a barrier between them and their families, who were not wealthy and had never experienced such things. It seems extraordinary now, but for their generation the explosion in exotic travel for all was only just beginning.

Now get this. Thanks to Stanford University, it is possible to find out how long your journey would have taken you in Roman times. Their interactive map of the Roman empire, through which you can find out the best and fastest methods via which you could have reached your intended destination as an intrepid Roman, is enormous fun. My trip to Mauretania, as the Romans called it, would have taken around 30 days, which puts my reluctance to endure a three-hour flight somewhat in perspective! Travel in the ancient world was difficult, expensive and phenomenally dangerous. You certainly didn’t attempt it in the winter, so making the trip at this time of year would have been considered absolute madness. I have genuinely found it helpful to remind myself of this; it has pushed any last-minute nerves and internal whingeing to the side as my brain adjusts its understanding to the realisation of how incredibly, wondrously lucky we all are to have the opportunities that we do.

So, as you read this, think of me now, the anxieties of the challenging journey over, enjoying just one of the innumerable privileges afforded to me as a result of being born in the developed world in the late 20th century. Just writing this has helped me to put things in perspective and I honestly find myself more ready for this trip than I otherwise might have been. The pen (or the laptop) is mightier than the sword when it comes to winning hearts and minds, and it looks like that goes for one’s own heart and mind also. So let me open the suitcases and dust off my travel pass. I’m ready for boarding.

Photo by Javier Allegue Barros on Unsplash

Terms of Endearment

There was an interesting discussion on Threads last week, which is not something I thought I’d write in a hurry. While the platform formerly known as Twitter is always a raging hotbed of edu-controversies, Threads has remained to date extremely civilised, largely because nobody is saying anything on there most of the time. Last week, however, an Assistant Principal whom I follow on both platforms made the following remark: “Talking to a friend about this the other day and didn’t realise there were such polarised views about this. Are pet names ok in school? As in, is it ok to saying ‘what’s happened, my lovely/darlin/poppet?’ to a pupil?”

The responses were diverse and sometimes extreme, with one teacher even suggesting that pet names “made their skin crawl” and claiming “it’s inappropriate and creepy. I’d be horrified if someone in a position of power used such a term to me so kids deserve the same respect.” Hmmmm, I thought. Are pet names really such a problem?

A more nuanced view followed: “I find it grates a bit for me when I hear it so I’m not keen but that doesn’t mean I think it’s a major issue. I do think it’s one of those things where the appropriateness probably depends on the member of staff/the pupil/ the context and those things aren’t always easy to judge.”

Always up for a debate, I waded in and pointed out (alongside others) that regional variations are without a doubt something to be considered before we form the view of “definitely unacceptable”. Pet names – and indeed, particular examples of pet names – are used far more in certain regions of the UK than in others. Personally, I cling to the idea that teachers, while they should always be professional, should also be themselves. If terms of endearment are part of a teacher’s vernacular then I would think it only natural for them to use them in certain contexts, wherever they live now. Students need to learn about such things after all; regional variations in vocabulary, accent and phraseology are a part of our diversity.

One of the many elephants in the room, best addressed head-on, is that what I say to a child in my position as a middle-aged woman is perhaps not what I would choose to say were I a man or perhaps even a younger woman. Once you’re in the same bracket as “mum” or (hideous to admit but increasingly undeniable) “nan” for the majority of students, most of your words are automatically assigned a kind of maternal, non-threatening tone. Something I have thought about considerably in recent years is that if I am going to use endearments then these should be shared out equally to the boys as well as the girls. It was pointed out to me a few years ago, to my considerable shock, how differently adults tend to speak to boys compared to girls, and it is something I have worked on ever since. Both boys and girls seem to me to actually rather like terms of endearment, when used in the right context and in the right way.

Context is everything. Terms of endearment can of course be used to patronise and silence individuals, particularly women, and I am certainly not going to make a case for them being appropriate in all fields. It would not, for example, be appropriate for a male Member of Parliament to tell a female member to “calm down, love”, although the tone of certain cabinet ministers has indeed got dangerously close to this threshold a number of times. In teaching, however, I do not believe that assuming a parental tone with children is inappropriate. In addition, my desire to remain sensitive to regional variations is more important to me than preaching any kind of universal language. Despite being a passionate feminist, I have never thought it appropriate or indeed desirable to kick off at every London cabbie who calls me “love” or every Geordie who calls me “pet” as – to be frank – I would argue that doing so would demonstrate more ignorance on my part than the use of such terms is claimed to indicate on theirs. We live in a rich and diverse society, where language means different things to different people, and we should all be thoughtful and grown up enough to deal with this without getting an attack of the vapours every time we venture outside our own close-knit social milieu.

As many people pointed out in the discussion, tone is crucially important. A term of endearment is, in my opinion, a nice thing. If endearments are a part of one particular teacher’s vernacular then I think that’s fine, so long as those endearments are used consistently with lots of different students and are not used to patronise, denigrate or control others. In my 21 years of teaching, I have never heard this to be the case. Teenagers, it seems to me, often stop being spoken to in such a way as they age, and it is actually something of a shame; adults tend to assume they don’t like affectionate terms (probably because so many teenagers do spend a lot of their time bristling and shrugging them off) but actually they crave our attention and our affection more than we know.

My view would be that if endearments come naturally to someone, I would not discourage them actively from using them in schools, so long as they are used fairly and genuinely. While professionalism and boundaries are crucially important, we should not be losing our individuality or indeed our humanity in the name of this.

Image generated by AI