I have always thought of myself as someone who is fundamentally miswired when it comes to getting myself about. My internal compass is completely absent. Not just slightly out of kilter, you understand, but missing altogether — enough to turn a ten-minute walk into a humiliating loop past the same shop three times. It is as if I was born without whatever quiet internal certainty tells other people, “this way makes sense.” I’ve always envied those who have that innate sense of the right way: my father navigates the world like an indigenous man on walkabout. Sadly, I did not inherit a single iota of his instincts, to the extent that I’m surprised I know the difference between left and right.
For most of my life, my appalling sense of direction has been a significant handicap and I suspect it has had an impact on my confidence in other spheres. If I couldn’t reliably find my way across a town, or even within the building in which I worked, how on earth was I supposed to navigate anything larger: major decisions, ambitions, the invisible map of a life? Other people seem to stride forward with invisible coordinates guiding them, while I would hover at crossroads, second-guessing, recalculating, almost always choosing a path with the distinct feeling that I might soon regret it.
Then, almost overnight, the world changed. Or rather, the tools in my pocket did. What used to require a well-thumbed but mysterious (to me, at any rate) A to Z, guesswork and a regular prostration of my dignity before the mercy of strangers has now been superseded by the reassuring warm glow of technology. Even more wondrously, it doesn’t sigh or say “I told you so” when I end up getting lost. It simply recalculates, endlessly patient, as if wrong turns are not failures but part of the process.
At first, I used it defensively. I would check directions obsessively, zooming in on routes, memorising landmarks as if preparing for an exam. But slowly, something unexpected happened. I stopped treating navigation as a test I had to pass and started treating it as a conversation. I could walk, drift even, knowing that if I wandered too far off course, I wouldn’t be lost — I would just be somewhere new, with a way back always available thanks to the super-duper magic pocket-wizard.
When you no longer fear being lost, the world opens up in subtle ways. Streets become less like corridors you must follow correctly and more like possibilities you can explore. A wrong turn isn’t a mistake: it’s a detour with an exit strategy. The pressure to always “get it right” dissolves, replaced by a quiet confidence that you can recover, adjust and continue. Even more pleasingly, I no longer find myself late for an appointment, lost and crying. (Yes, humiliatingly, that has happened. At one low point in 1993, I spent an hour and a half trying to find the location of a lecture I was supposed to attend in London for my degree; in despair, I sat down on some steps and blubbed, only to realise after a couple of minutes that I was sitting on the steps of the very building in which I was meant to be attending the lecture.)
Life in the 1990s felt like one long navigation problem for me. Choosing the wrong path early on meant ending up miles away from where I was meant to be, with no simple way back that didn’t involve an expensive taxi ride. Now, I wonder if direction isn’t about always knowing where you’re going, but about trusting that you can keep moving, even when you don’t. That’s the freedom that technology has afforded me. The tiny blue dot on my iPhone’s digital map — steady, present and always updating — feels like a metaphor for something I never realised I needed: reassurance that my current position has a way out and a way forward.
I still have a terrible sense of direction and I am at peace with that. If you took away my iPhone, I would probably end up circling that same corner shop, wondering how I got there again. But I no longer see that as a personal flaw so much as a different way of moving through the world. I am someone who wanders, who doubles back, who explores by accident. And now, with my talisman in my pocket, I am reminded that no step is irreversible; wandering feels less like failure and more like a way of discovering paths I never would have chosen on purpose. Maybe that’s what direction really is: not a straight line, not a fixed bearing, but the ability to keep going, to adapt: to trust that, even if you don’t know exactly where you are, you are never lost forever.
As someone who works almost exclusively with young people between the ages of 14 and 16, I am well-versed in the problem of teenagers and sleep. Any teacher who has had a challenging class scheduled for a Monday morning will understand the issue, but since working online I regularly have the pleasure of being presented with a youngster who has quite obviously been peeled unceremoniously from their bedsheets less than a minute prior to our session. I have often advised parents to book a later session, on the grounds that their child is simply not in a fit state to absorb anything before mid-morning. There are some who can cope, but for others it becomes apparent that their parents will be wasting the money that they pay me, so groggy and unengaged is their child when the session begins. But just why do teenagers find mornings so palpably difficult?
At the centre of everything is our circadian rhythm, the internal clock which governs when we feel sleepy and when we feel alert. In younger children, as every parent knows, the clock tends to run on the early shift; but somewhere around puberty it shifts — enough to have a pretty big impact. In adolescents, melatonin, the hormone that drives our urge to sleep, is released later in the evening than it is in children and in older adults. This shift in their natural sleep phase means that teenagers genuinely feel more alert late into the evening and do not feel ready to sleep until much later than they used to. I remember this with visceral clarity. Staying up to watch Moonlighting, followed by Indelible Evidence, felt effortless, while waking the next morning felt simply agonising. Teenage bodies, still growing and developing rapidly, require a substantial amount of rest — typically around eight to ten hours. But if a teenager is wired to fall asleep at midnight, yet still needs to wake up at 7.00 a.m. for school, then we’re faced with a problem.
I always like to ponder why these facts about our biological nature and development might have evolved. It is undeniable that most teenagers experience a change in their body clock as they develop, and it is also undeniable that adults vary in terms of their own body clocks: some are natural night owls, some are larks. (You probably have a good idea which one you are, but it’s quite fun to do the test: some people are very strongly one or the other, some are flexible. I am as much of a lark as it is possible to be.) So, why might we vary in this way? As humans evolved, it would have been useful for a tribe to contain a variety of members, so that someone was always capable of being hyper-vigilant. When life was lived on a knife-edge, an endless battle for survival, it was crucial for the safety of everyone that at least some members of the tribe were alert at any one time. These subtle differences in how awake we feel at different times of day may therefore have been an essential advantage, perpetuated through natural selection.
Of course, biology is only part of the story. If teenagers were tucked away in candlelit rooms with nothing but a paperback novel and their thoughts, they might still stay up slightly later than the adults in the household, but probably not quite as late as they do now. Modern life has introduced a dazzling array of sleep-delaying tactics, most notably in the glowing rectangle of the smartphone. Social media, messaging apps, streaming platforms — all of these operate on the principle that there is always one more thing to see, one more conversation to have, one more video that might be even funnier than the last. If you can establish one rule in the home, it should be that these devices do not take the journey to bed with you. Teenagers are particularly sensitive to reward and novelty, meaning that the little bursts of satisfaction provided by notifications, likes and new content are especially compelling. The result can be a perfect storm: a brain wired to seek stimulation, a body that doesn’t feel sleepy yet, plus a device that delivers endless entertainment on demand. Bedtime, under these conditions, becomes achingly oppressive.
Waking up early for school is difficult not because teenagers are being dramatic (although, to be fair, some drama may be involved in some cases), but because the teenaged internal clock is still firmly set to “night mode.” When an alarm goes off at 6.30 a.m., it is essentially interrupting the biological equivalent of midnight. Imagine being forced to wake up at 2.00 a.m. and then expected to perform algebra, write essays and engage in meaningful discussion. That is not far off what most teenagers are experiencing every day.
The misalignment between biological rhythms and social expectations is sometimes referred to as “social jet lag”. It’s the same groggy, disoriented feeling one might have after flying across time zones, except instead of being a temporary inconvenience, it is a daily occurrence. The result is chronic sleep deprivation, which has a range of effects that extend far beyond simply feeling tired. In the classroom, this can manifest as difficulty concentrating, slower cognitive processing, and a general sense of mental fog. Teachers may notice students staring into space, struggling to retain information, or reacting with the enthusiasm of someone who has been asked to solve a puzzle while underwater. It’s not (always) that teenagers don’t care about their education; it’s that their brains are not operating at full capacity during the hours when learning is expected to happen.
Sleep deprivation is also closely linked to irritability, emotional volatility and increased stress. Sound familiar? Parents, who may already be operating under the assumption that their teenager simply needs to go to bed earlier, are often met with morning grumpiness that can escalate into full-blown conflict. The state that a young person is in can reinforce the adults’ belief that the child should retire to bed earlier. The teenager, meanwhile, feels misunderstood and unfairly judged, leading to a cycle of frustration on all sides. What makes this situation especially tricky is that both perspectives contain elements of truth. Teenagers do, in many cases, make choices that exacerbate the problem — staying up later than necessary, using devices late into the night and underestimating the importance of sleep. I did this myself on an endless loop and, looking back, it seems ridiculous. Yet at the same time, I remember vividly how alert I felt in the late evening and how utterly unattractive it seemed to take myself off to bed. The fact remains that the underlying biology of teenagers genuinely does make early sleep and early waking more difficult for them. It is not a simple matter of willpower, nor is it entirely in their control.
Some schools have tried to take the peculiar biology of teenagers into account by experimenting with later start times. Research suggests that even a modest delay in the beginning of the school day can lead to improvements in attendance, academic performance and overall well-being. Teenagers who are allowed to wake up in closer alignment with their natural rhythms tend to be more alert and more engaged. I have always wondered, however, what these schools are like for the adults. Speaking as someone whose energy is now heavily weighted towards the morning (I spring awake, starving hungry, at around 5.30am most days), I would hate to work in a place where the day was shifted later. This is the problem: the teenagers are not the only ones with skin in the education game.
All schools still start significantly earlier than most teenagers would like, but there are nevertheless small changes that can help. Exposure to natural light in the morning can nudge the circadian rhythm slightly earlier, making it easier to wake up over time. Limiting screen use in the hour before bed can reduce the effects of both blue light and stimulating content, giving melatonin a better chance to do its job. Consistent sleep schedules, even on weekends, can also make a difference, although this is perhaps the most challenging suggestion of all, given the powerful allure of a Saturday lie-in. It all seems rather easier said than done, and my parents certainly gave up even trying. Ultimately, understanding is key. If families can recognise that their teenager’s sleep patterns are not entirely a matter of choice, and if teenagers can be persuaded to acknowledge that their habits can influence their well-being, the conversation can at least be had.
It is, in the end, a delicate dance between what our bodies want and what our schedules demand. Teenagers, caught in the middle of this dance, are not failing at mornings so much as they are beginning to negotiate with them. If they’re lucky, they will become a lark like I did, and the world will become an infinitely easier place to navigate (unless they want to work in the nightclub industry, I suppose). So, if your teen occasionally hits the snooze button one too many times, it might be worth remembering that they are not resisting the day — they are just trying to catch up with a night that ended a little too soon for them.
Last week, there was something of a debate between my now quite elderly parents and me. I remarked that I genuinely struggle to understand why so many people are so reluctant to change their minds. What on earth was so frightening about it? My father, a trained scientist, seemed to get where I was coming from. My mother, a trained counsellor, was less impressed. She sees everything through the matrix of people’s emotional responses and finds it easy to comprehend the ways in which people’s fears and hang-ups are their most powerful driving forces. Yet I remain baffled by how fixed people can be in their ways of thinking.
While I’ve never considered myself to be much of a scientist (a glance at my GCSE grades will confirm this for anyone in doubt), I do like to think that I am a rationalist and that I base my responses to most things on the evidence in front of me. I am also, I think on balance, quite emotionally robust. Given these two character traits, I will confess that I find it hard to comprehend why changing one’s mind about something is considered such a terrifying prospect; but the older I get, the more I am forced to acknowledge that for many it seems to be so.
In the same week, I met a friend who filled me in on some local gossip and remarked, in passing, that living in our village had been good for her, since she had been exposed to a range of people with different political views and discovered (in a manner that she reported with some surprise) that Conservative voters did not all possess the horns of Beelzebub. She reflected on the limitations of being brought up in a home in which one political viewpoint was presented (a household that she summed up as “Guardian-reading”). I reflected on the fact that I felt there had been a variety of political standpoints within my close family and that these had been openly (and sometimes quite heatedly!) debated, perhaps leaving me open to the notion that there can be well thought-out (and indeed extremely badly thought-out) views on all sides. She said that she envied this experience. It was genuinely fascinating and gave me further pause for thought. Might this exposure to conflicting politics within one family be another reason why I am interested in rather than threatened by alternative viewpoints?
Also this week, whilst listening to a podcast, I heard a reference to an analogy used in human psychology that I had not come across before, and it chimed with all the thoughts I had been having about tribal thinking versus the ability to change one’s mind. I looked up the reference and was fascinated to discover someone called Julia Galef, an author and co-founder of the Center for Applied Rationality. Galef argues that some people act like “soldiers”, while others act like “scouts”. “Soldiers” in her analogy tend to approach a discussion from the sole position of defending their beliefs, attempting to discredit or dismiss conflicting information and seeing alternative viewpoints as the enemy to be shot down. “Scouts”, by contrast, are motivated more by the desire to find the truth, regardless of their starting point.
But before we “scouts” get too smug about our stoic capacity for reason, according to Galef, our tendencies towards being either a “soldier” or a “scout” are both rooted in our emotional responses and learned behaviour. The “soldier” mindset tends to be held by someone who is motivated by connection and community (which can lead to tribalism), whereas someone with a “scout” mindset is more likely to enjoy the process of discovering new things (which can lead to innovative or creative thinking, but carries with it the threat of isolation). For a “soldier”, the process of changing your mind feels like a weakness or even a defeat. For a “scout”, it is something positive and exciting. This is exactly how I feel. To me, the process of changing my mind isn’t simply non-threatening: it is genuinely thrilling and wonderful. I love discovering that I have been wrong about something, or that my understanding of a topic has been flawed. I find it genuinely mind-boggling that people can hold the same views that they have always held, and borderline distressing to imagine that they find the process of change a net negative.
In a quest for further knowledge, I will delve into Galef’s book, The Scout Mindset, and find out if her analogy resonates once I’ve read it in full. For now, I feel genuinely happy to be a “scout” and can highly recommend the mindset. It might not make you the most popular person at the party, but it does make you the one who will point out that the emperor is stark, staring naked.
To what extent should it be evidentially apparent that we practise what we preach? I have been pondering this dilemma since observing a distinctly less than svelte gentleman, who regularly oversees the training regimes of customers in the gym, encouraging his clients to build muscle and burn fat whilst resting his arms on his own notably substantial belly. I will admit that my mind went somewhere slightly uncharitable, a thought process that can be summed up in the line, “maybe get your own house in order first,” but then I found myself wondering: to what extent does a fitness instructor have to be fit?
The question became more complicated the more I thought about it. Given the well-documented fact that many doctors smoke and drink alcohol and that many of them are also overweight, does that make those same doctors any less qualified to tell the rest of us what to do when it comes to looking after our bodies? Their ability to diagnose and treat doesn’t disappear if they don’t always follow ideal habits. Shift work is notorious for making it difficult to sustain a healthy diet and lifestyle, as is stress: so should we think less of them for falling prey to the same barriers as the rest of us? And why should this not apply also to fitness instructors?
Physical trainers and their clientele at the gym are something of a source of fascination for me. It is (for me, at any rate) simply impossible not to eavesdrop on their sessions, and I’m fascinated by the barely concealed indifference with which so many of the PTs conduct them. It is, to be fair, a pretty repetitive job, watching a variety of newbies straining to improve their fitness, but I am startled by how dismally uninterested some of the instructors seem to be in the process of fitness. Why apply yourself to a career without that basic love for what you do? Personally, I still get a kick out of explaining how participles work, after nearly thirty years of doing so, and I still take joy and pride in watching the lights go on when I help a student to understand something that they have not managed to grasp before.
Perhaps the most enthusiastic personal trainer at the gym is a man whom I have nicknamed (in my head, you understand) The Dangling Frog. This man is genuinely interested in pumping iron, although I would question the degree to which he is an advocate of what I would deem true fitness, given the quantity of steroids I suspect he has consumed. So, why Dangling Frog, you ask? Well, he has done so much upper-body work that he resembles one. Imagine holding up a frog by its body and letting its bowed, skinny legs dangle beneath. That’s what this guy looks like. I have witnessed him working on his own muscles and the level of strain he applies to his shoulders and biceps is impressive: the poor old legs don’t get a look-in, so they remain a mere shadow of his upper-body musculature. To be frank, I’m surprised that his legs can sustain the downward force of his upper body. One day, I fear, the unarguable laws of physics will complete their inevitable demonstration of force and gravity and his legs will give way.
Still, disproportionate physique aside, this guy is certainly more enthusiastic than most. His favourite regular client is older than him, so he gets to show off a bit under the guise of training and encouragement, but I saw him recently with a younger companion and I couldn’t quite work out whether this was a trainer-client situation or simply a meeting of mutual appreciation. Whatever the circumstances, it looked like a beautiful bromance was developing, and their conversation went something like this:
Bro 1: yeah I think I’m trying too hard.
Bro 2: yeah you need to reduce to 36K and focus on letting your body do what it needs to do.
Bro 1: yeah exactly.
Bro 2: yeah.
Bro 1: Maybe I’ll try 30K.
Bro 2: I tried with like 42K last week.
Bro 1: no way wow.
Bro 2 (preening slightly): yeah that was like way too much but you know.
Bro 1: yeah man.
Bro 2: yeah.
It was like listening to poetry. Dangling Frog at least practises a significant amount of what he preaches, which is (I suspect) lots of upper-body pumping, the constant imbibing of protein-based sludge and (I also suspect) a regular date with some anabolic steroids. Is he a better role model than the instructor with the standard paunch, I found myself wondering? I’m honestly not sure. So, where are the genuine fitness enthusiasts? Is it really that hard?
For fitness coaches, credibility is surely tied to example. If I were a client, I’ll be honest: I would expect them to look fit, and I would expect them to practise what they teach. If a coach promotes healthy eating and an active lifestyle but doesn’t look as if they follow their own advice, my trust would go out the window. Much of their job involves demonstration and motivation, so one would have thought that leading by example would be their most powerful weapon.
Yet what about counsellors and therapists? They give advice on emotions, coping strategies and behaviour, but they are not expected to have perfect mental health themselves. Indeed, many counsellors have faced their own struggles, and a therapist who has sailed through life with no challenges would be an inadequate one indeed. What matters is that they understand techniques and can guide others effectively. While practising what they preach is helpful, self-awareness and professional skill matter more than personal perfection. Perhaps this should also apply to fitness instructors?
I am still pondering this and am left asking myself why I would expect and demand that a fitness coach exemplify what they are employed to teach. Perhaps it is because I would assume that the process of fitness is a source of personal interest for them. Perhaps I also assume that once one is truly knowledgeable about fitness, one surely would not be able to resist the urge to apply it to one’s own lifestyle. For example, once you truly understand the damage that a sedentary lifestyle can cause, surely you cannot help but be more active? Yet, if this were true, then nobody would smoke, nobody would drink, nobody would be inactive. Is it really the case that ignorance is the problem, given the overwhelming amount of information with which we are all surrounded?
When someone claims authority — whether it be a teacher, a coach or a religious leader — humans instinctively look for alignment between their words and actions as proof that their advice is genuine and achievable. If a teacher follows their own guidance, it signals integrity, builds trust and makes their teachings feel real rather than abstract. On the other hand, hypocrisy can weaken respect and create doubt, because it suggests either a lack of belief or a gap between theory and real life. Ultimately, we value authenticity, and seeing someone embody their own principles makes those principles more convincing.
What should one do if one’s own Member of Parliament were to misrepresent reality and twist the truth in order to raise their profile on social media and blow their own trumpet in the House of Commons? One asks this purely hypothetically, of course, for it would be truly beyond the pale were such a thing to happen for real.
Imagine this entirely theoretical scenario. One’s local Member of Parliament visits a comprehensive school within his (or her) constituency. Let’s say, for argument’s sake, that this is a school in which one has worked for many happy years and is therefore a school in which one is emotionally invested and indeed a school of which one has an intimate knowledge. It is possible, for example, that one might well have taught a lesson in pretty much every single classroom in such a school. One might, indeed, have a thorough knowledge of the school and the fabric of its buildings, as a result of working there full time for well over a decade. Since leaving this school (again, entirely hypothetically), one might have maintained contact with it on and off, and one might have visited the school on occasion. All possible, in this entirely imaginary tale, I hope you can agree. What is less plausible in one’s story is the behaviour of the Member of Parliament.
To continue this whimsical flight of fancy, imagine that the Member of Parliament saw fit to poke about in an area of the school that has been condemned for several years and is completely shut off to both students and staff. Imagine that this has been the case for so long that — as someone who has worked there for over a decade — one is not even entirely sure where that part of the school actually is, since in one’s imagined scenario the condemned area has been quite rightly out of bounds. Perhaps nobody goes there and perhaps nobody has access to it except for the site team, in this altogether imaginary school in this altogether made-up story. Despite these facts (I say “facts”, but of course, do remember that this is an entirely fictional tale), the Member of Parliament in one’s story takes some pictures of this disused area (imagine that! Ludicrous!) then shares those pictures on his (or her) social media pages and makes claims about the entire school being in “an appalling state”. I mean … this is simply ridiculous, isn’t it? Any publisher would reject such a story as thoroughly unconvincing. Rip up the story and start again, you foolish author, for our fine upstanding Members of Parliament do not behave in such a way. They are busy, important people; they have no truck with such shenanigans.
Forgive me, but we must stretch this truly bizarre tale even further, to the point where you will of course recognise it for the blatant fiction that it must be. For this entirely imaginary Member of Parliament actually speaks in the House about his (or her) visit to this wholly make-believe comprehensive, in a manner that could, in extremis (and, of course, this is what makes this account so obviously hypothetical), be considered misleading to the House. He (or she) describes classrooms (note the plural! Classrooms!) which are “held together by gaffer tape”, leaving one, in one’s imagined scenario, scratching one’s head and struggling to picture the classrooms that he (or she) is talking about. He (or she) then goes on to describe the disused area of the school of which he (or she) took these photographs, claiming that he (or she) “almost fell through the floor” and, what is more, was assailed by the stench of mould, all the while insinuating that this is actually representative of the state of the school. The classrooms, after all, you may recall from the imaginary description, are “held together by gaffer tape.” So, in this crazy hallucination, one is still left trying desperately to recall whether one saw a single piece of gaffer tape in one’s entirely imagined 13 years at this entirely fictional comprehensive school. One’s imagination may seem to have completely run away with one, for in one’s head the very same school is (in an alternate universe that one might call reality) so well-maintained that it is positively the envy of other schools in the area. So free from decay is the site (a source of pride for its site team) that visitors comment on the fact. This is all in one’s vivid imagination, you understand.
In a final flourish to one’s extraordinarily far-fetched tale (I really must write it up some time), one’s fabricated Member of Parliament doubles down when challenged on social media, deletes all of one’s posts in which one points out that he (or she) has misled the House, and even claims to be doing what he (or she) is doing at the behest of the school. This is despite the fact that he (or she) is not in government and not in a position to secure them any funding and perhaps despite the fact (and here things get really wild) that the Leadership team at the school might, one could possibly fantasise in one’s wildest moments, have asked him (or her) to desist. One can just about conceivably imagine that it has been made clear to this MP gone rogue that his (or her) interference and naming of the school on social media and at Westminster is distinctly unhelpful and unwelcome. Could one not?
What an extraordinary tale, I am sure you’ll agree! So, what would one do, in this entirely imagined situation? (I realise that it is so ludicrous that you might not even consider it worth fleshing out a plan for such an unlikely situation, but do humour me for a few more lines.) One would, I suspect, have to write to this Member of Parliament directly, as one of his (or her) constituents, in order to express in no uncertain terms one’s disquiet with his (or her) behaviour. One might even have to explore what avenues there are for making a formal complaint about such a state of affairs, given one’s sincere belief that a distinctly less than truthful statement has been made within the hallowed walls of Parliament. Furthermore, if one were to find oneself in this highly unlikely position, one would certainly (one imagines) feel thoroughly disillusioned with the honesty and integrity of one’s democratically elected representative and one would (one suspects) feel glad, not for the first time, that one did not personally vote for such a person.
It is a truth universally acknowledged that the one thing we love more than a hero is to see a hero fall. I’m not sure whether this is an entirely modern phenomenon, but it is perhaps a tendency that has burgeoned in recent decades. More than this, something which I do think is peculiar to our age is the expectation that historic figures should be judged according to 21st-century western values. This, especially when it is pitched against some of the figures who had a significant hand in carving those same values, leaves me distinctly uneasy.
Last week, the BBC reported that Hinchingbrooke School in Huntingdon was swapping the name of one of their pastoral houses from Pepys to Lady Olivia. The process was enacted via a democratic ballot, which turned out to be a classic example of western democracy in action, given that the much-celebrated result was voted for by less than 50% of the electorate. Nevertheless, Lady Olivia, wealthy landowner, school sponsor and evangelical Christian, now finds herself named as the chosen figurehead for modern students in the school that Pepys attended, along with Oliver Cromwell. One can only hope for her that there are no skeletons in her cupboard, to be discovered down the line. There’s always a tweet.
Samuel Pepys seems to have got away with being a prolific sex offender without much modern public disapproval until 2025, when historian and translator De la Bédoyère went back to Pepys’s original manuscripts and translated all of his coded entries, which he wrote in a kind of Franglais, pidgin Latin and a smattering of Spanglish. De la Bédoyère re-published Pepys’s diaries in all their glory, and the result is the extraordinarily detailed snapshot of 17th century life that one might expect; unfortunately, that life is that of a man for whom preying upon vulnerable women was something of a daily occurrence. It was certainly an education for me, reading what this serial predator got up to on an average day, and it very much does not chime with 21st century western values. Historians are keen to point out that Pepys’s behaviour didn’t even chime particularly well with 17th century western values, as he seems to have had something of a reputation in his day. I only wish I could believe the world has changed, but let’s not pretend that it has. Men with such reputations are still running several countries.
I have been pondering the school’s decision to demote Pepys from his position as a House name and I have no wish to criticise it. The school has already made it clear that there are parts of the school named after Samuel Pepys and that those tributes to him will not change. I have no doubt whatsoever that the school was placed under enormous pressure by a vociferous minority and I don’t even have a particular issue with that in some ways: perhaps those individuals are right. If I had a daughter in the school, perhaps I might have agreed with them that there are better figureheads for her to look up to. Whatever my individual thoughts on the matter, it is inescapable that these days it only takes one parent with a bee in their bonnet and an active WhatsApp group to dictate school policy and this — for better or for worse — is the reality of where schools find themselves today. Headteachers have to pick their battles, and going out to bat for Samuel Pepys was perhaps not a hill the Headteacher felt was worth dying on.
What I think is more interesting is to ponder whether we have lost something when society cannot tolerate undeniably serious flaws in its heroes. Is this a quirk of the kind of modern puritanism that we find ourselves facing today? If we turn to the ancient texts for our model, their authors understood only too well the value of a rounded hero; indeed, the very definition of a hero required the inclusion of multiple flaws. The notion of a “fatal flaw”, popularised by Renaissance readings of Aristotle’s Poetics, influenced Shakespeare and other writers. There is broad agreement from ancient times to modern that the most interesting heroes are the ones with inherent weaknesses: a perfect hero would be a thoroughly tedious creation.
When Virgil introduces Aeneas at the beginning of his epic work, he does something quite remarkable: we first meet our hero at his lowest ebb. Battle-fatigued, travel-worn and a refugee, Aeneas is at breaking point. He screams and cries and implores the gods to take him: why did I not die in Troy? he asks. What was the point of it all? The visceral shock of introducing us to a hero who appears to have abandoned all hope and wishes he were dead is one of the most exciting decisions the author could have made, and it thrills me every time I revisit the text (which has been hundreds of times over the last two years, for that section of the text is on the specification for OCR GCSE). The point, I think, is for us to reflect upon how much more impressive it is when Virgil later describes Aeneas suppressing his emotions and resuming command and leadership of his men: someone we have witnessed utterly broken does the right thing for the good of the majority and for the men in his care. Now, that’s a hero.
Not only does Virgil start his epic work with a radical take on heroism, he ends it controversially, by demonstrating that Aeneas is very much less than perfect. At the end of the epic battle that ensures the supremacy of the Trojans in their new homeland, thus securing the future of what will become the Roman empire, Aeneas is faced with his arch-enemy, who begs for mercy. The tradition in ancient texts was that good heroes are extraordinary warriors but do not give in to blood-lust; whenever a warrior is taken over by this kind of crazed, emotionally-charged violence, disaster tends to ensue and the warrior is punished for his misdemeanours. Good warriors show mercy when the time is right. Yet Virgil does not finish his work in this way. As Aeneas looks down upon his enemy, he is overwhelmed by rage, bitterness and grief: he slays him, quickly and ingloriously, and the epic finishes with our hero’s enemy groaning his last, his tortured soul shrinking away to the underworld. It is a radically depressing way to close an epic work of propaganda and reflects a true genius at his peak. The reader (or more likely the listener) is left with an uneasy sense of disappointment in our hero, and must carry the burdensome knowledge that founding an empire is not without its price and that war makes even good men do terrible things.
Perhaps we have indeed lost something to the present-day puritanism that judges historic figures according to our modern western values and — inevitably — finds them wanting. Personally, I don’t have a problem with recognising the contribution that Pepys made through his unflinching account of 17th century life alongside the fact that the life he describes is one to which I would viscerally object. It’s what history is all about. What I hope for the future is that we can have these discussions in a more mature and nuanced way. There is nothing more irksome than the modern tendency towards cancellation and extremism, the “no debate” lobby, who consistently fail to understand that the very pluralistic society that they believe in so fervently and lobby so hard for requires endless compromise and true tolerance, the kind of forbearance that makes you feel uncomfortable and sometimes forces you to question your own values. I occasionally wonder whether the louder the cancellation crew shout, the more they’re trying to drown out the voices of doubt in their own heads.
Suddenly, everyone is talking about Lord of the Flies. It is one of my favourite novels, one which I taught for GCSE English literature for around a decade. I’m afraid that I have no urge to see what the BBC have done with it. I have also been somewhat irritated to see multiple hot takes on social media, criticising the story’s doom-laden attitude towards childhood and children’s psychology.
First of all, Golding was emphatically not being doom-laden about the nature of children; he was being doom-laden about the nature of humanity as a whole: let us not underestimate the extent of his doom-mongering, please. Secondly, Lord of the Flies is no more a novel about children and childhood than Animal Farm is a novel about livestock and animal husbandry. Like Animal Farm, Lord of the Flies is an extended allegory, and its message is a profoundly depressing one. So, buckle up.
Golding’s work of genius (one which he, incidentally, dismissed in later life as “boring and crude”) is a thoroughly disturbing exploration of what happens when the structures of civilisation fall away. It is emphatically not a novel about children. While the novel contains the trappings of childhood (children’s games, their fears, their rivalries and their capacity for cruelty), it becomes clear as the narrative unfolds that Golding’s central concern extends way beyond childhood psychology. The island on which the children find themselves stranded is a microcosm of the world that the boys have left behind, a specimen society in which rival authorities, social hierarchies, violence and superstitious ideology rapidly emerge. Golding uses children to examine society stripped to its essentials, suggesting that what we call “civilisation” is a fundamentally fragile construct laid over a persistent human capacity for savagery. The novel is less an anthropological study of childhood than a parable about the nature of society itself.
From the outset of the novel, in which the boys find themselves stranded in the wilderness, the protagonists attempt to recreate the structures of the adult world from which they have come. They call assemblies, establish rules and elect a leader. Ralph’s authority rests on apparent legitimacy: he is chosen through a vote, and a conch shell is used as a tangible sign of democratic order. The conch regulates speech, embodies fairness and stands as a shared agreement among the boys to abide by rules. These early chapters might seem to suggest that humans, left to their own devices, instinctively lean towards mature governance; yet Golding makes it clear that the boys’ desire for adherence to a set of rules depends not on moral conviction but on a fear of consequences and an individual lust for dominance, for the boys speak immediately of the punishments that will face anyone who transgresses the rules they plan to lay down for themselves. Furthermore, as the hope of rescue fades, the rules lose all of their potency. As Ralph puts it, “things are breaking up. I don’t understand why.” The deterioration is not portrayed as uniquely childish; rather, it reflects how flimsy and insubstantial social contracts are when the institutions that sustain them collapse.
Jack’s transformation from choir leader to autocratic demagogue underscores this shift. His authority on the island grows not through reasoned persuasion but through his manipulation of fear and the promise of hunting and meat. He paints his face, embraces ritual and forms a tribe built on spectacle and intimidation. In doing so, he does not regress into childhood so much as adopt the tactics of a charismatic despot.
It is hinted from the outset that the boys have arrived from a society already engaged in a global conflict. The island society quickly begins to resemble the violent regimes and wartime mentalities of the adult world and the children’s play-acting of war quickly becomes indistinguishable from the very worst forms of human brutality. The murder of Simon is not an impulsive scuffle between children; it is a collective frenzy, a ritualised killing fuelled by hysteria and conformity. In that pivotal moment, Golding depicts the terrifying ease with which ordinary individuals can participate in atrocities when swept up by mass hysteria and mindless ideology. This is emphatically not a comment on the nature of children: it is a study in group dynamics and the power of suggestion.
Simon’s role in the novel, prior to his death, further supports the interpretation that Golding is examining society and group dynamics. His encounter with the pig’s head, the eponymous “Lord of the Flies,” reveals the central moral insight of the book: “the beast” that the boys fear is not an external creature but something within themselves. The pig’s head, swarming with flies, seems to speak to Simon, telling him that it (the beast) is part of them, is inside them: it is not an external force, rather it is innate to humanity. Golding aims to convince his readers that the impulse toward violence and domination is an inherent aspect of human nature, one that civilised society attempts, imperfectly, to restrain. Simon’s death, at the hands of boys who mistake him for “the beast” crawling out of the forest, symbolises the destruction of moral truth by collective fear and aggression. The tragedy lies not in the fact that the children are capable of evil, but in the implication that, in the wrong circumstances, all humans are.
Piggy represents rationality, scientific thought and the values of ordered civilisation. His glasses, which enable the boys to make fire, symbolise the power of technology and reason. Yet reason alone cannot withstand the tide of savagery once the social consensus collapses. Piggy is marginalised, mocked and finally killed when Roger deliberately dislodges the boulder that crushes him. This final act by Roger is particularly significant: earlier in the novel, he is depicted throwing stones at the younger boys but deliberately missing; the implication is that he is an inherently violent boy who is restrained in his urges by what Golding calls “the taboo of the old life.” As those restrictions erode with the breakdown of society, so too does his individual restraint. By the time he kills Piggy, Roger acts with deliberate intent. Golding’s emphasis on the gradual disappearance of internalised moderation points to his theme of the importance of societal structures in shaping and curbing antisocial behaviour. When those structures weaken, he believes, our latent cruelty surfaces.
Golding’s novel is emphatically not about childhood. The boys bring with them the hierarchies, prejudices and fears of their culture. The choirboys, accustomed to discipline and exclusion, quickly form an elite group under Jack. The “littluns” (as the youngest members of the group are collectively referred to) are marginalised and terrorised by the older boys and even Ralph, ostensibly the champion of order, participates in the violence against Simon. No character is exempt from moral compromise and this universality suggests that Golding is less interested in developmental psychology than in the broader human condition: his view of us is emphatically not a happy one.
The sudden arrival of the naval officer at the end of the novel crystallises the sense that the island society is a mirror that Golding is holding up to the adult world. The officer is initially amused by the boys’ appearance, viewing their behaviour as a childish game. Yet he represents a world engaged in destructive warfare: his warship waits offshore, a reminder that organised violence is not confined to the island but is institutionalised in the adult society that lies beyond it. The boys’ painted faces and sharpened sticks are grotesque reflections of his uniform and the weapons he brings. The officer’s presence does not negate the horror that has occurred; rather, it frames it within a wider context. The island is not an aberration but a microcosm: Golding implies that the same forces driving the boys to chaos are operating on a global scale.
Published in 1954, in the aftermath of the Second World War and at the dawn of the nuclear age, Lord of the Flies reflects a period of unprecedented human destruction. The belief in steady moral and social progress had been shattered by the exposure of the Holocaust and the growing fear of atomic warfare. Golding, who had served in the Royal Navy, stated that he had witnessed firsthand man’s capacity for organised brutality, and that illustrating this was his purpose in writing the novel. His choice of schoolboys as protagonists was an artistic decision: by stripping away adult institutions and placing children in isolation, Golding constructs a controlled experiment in which the island mirrors the essential dynamics of society in concentrated form. The boys’ age, if anything, underscores the horrifying argument that the seeds of societal violence lie not in complex political systems alone but in the fundamental aspects of human nature. While the “beast” that the children fear can be seen as a childish nightmare, Golding does not treat their fears as trivial. “The beast” evolves into a powerful symbol of how societies create external enemies to embody internal anxieties and explain the darkness within them. The boys’ belief in the beast apparently justifies Jack’s desire for authoritarian rule and explains the abandonment of rational deliberation. In this way, childish superstition becomes analogous to the propaganda and scapegoating we find in adult societies.
It is undeniable that the novel challenged the mid-twentieth-century literary tradition, which portrayed children as naturally innocent and if anything morally superior to adults. In traditional adventure stories, still popular at the time, stranded boys tend to maintain British civility and cooperation. Golding deliberately inverts this literary convention. His boys do not build a utopia; they descend into barbarism. This inversion, however, is not a comment on children but a critique of the complacent belief that civilisation is secure and that moral behaviour is natural and instinctive. By showing that even well-educated English schoolboys can commit atrocities, Golding aimed to dismantle the myth of inherent cultural or moral superiority. Ralph’s uncontrolled grief at the end of the novel is portrayed as a source of embarrassment to the naval officer. He weeps “for the end of innocence” and “the darkness of man’s heart,” a final summation of Golding’s bleak vision.
To read Lord of the Flies as a novel about the nature of children is to overlook its broader philosophical ambitions. Golding did not believe or aim to suggest that children are uniquely savage or that society alone corrupts them. Instead, he proposes that society is both a product of and a defence against the darker aspects of human nature. Civilisation provides structures — laws, social norms and institutions — that channel natural instincts such as aggression and desire into appropriate avenues. When those structures disintegrate, as they do on the island, the underlying impulses are revealed. The boys are not aberrations; they are average human beings.
Golding’s frankly brilliant work interrogates the very foundations upon which social order rests, yet it achieves this by focusing on children, whose assumed innocence sharpens the shock of moral collapse. Golding invites readers to question their comforting assumptions about progress, about culture and the nature of morality. The savagery on the island is not confined to childhood; it is an ever-present possibility within human communities. By the time the naval officer arrives, the reader understands that rescue from the island does not equate to rescue from the darkness within. Golding’s enduring message is that society’s stability depends upon our constant vigilance against forces that originate in the human heart. How’s that for a bedtime story?
“Turn a blind eye” is one of those expressions that slips easily into everyday speech, a shorthand way of describing the act of deliberately ignoring something. We might say a teacher turned a blind eye to students whispering in class (never a good idea, by the way), or that a government turned a blind eye to corruption (even worse). Many people use the phrase without a second thought about its origins, but like many idioms, it comes with a story. In recent years, some people have questioned the phrase, arguing that it may be offensive or insensitive. Well, speaking as someone who actually is blind in one eye, I am here to defend it: so, brace yourselves.
The most commonly cited origin story for “turn a blind eye” dates back to the Napoleonic Wars and everyone’s favourite British naval hero, Admiral Horatio Nelson. Nelson had lost the sight in one eye earlier in his naval career, when a shot struck a sandbag and the flying debris hit his face, causing severe damage to his retina. He is often portrayed as wearing an eye patch, but there appears to be no evidence that he did so: historical accounts seem to indicate that his eye remained intact; he simply couldn’t see out of it any more.
During the Battle of Copenhagen in 1801, Vice-Admiral Sir Hyde Parker, the commander-in-chief of the British fleet, ordered the signal for Nelson to cease fighting and withdraw. Signals were transmitted from ship to ship via the medium of flags, so the order was necessarily a visual one. Nelson was alerted to the signal to disengage, but was eager to press ahead with the attack. According to the story, he raised his telescope to his blind eye and claimed to see no signal. Having feigned ignorance of the order, he continued the battle and secured a crucial tactical victory. The rest, as they say, is history, and presumably explains why Nelson still has his statue on the top of a column in central London and Hyde Parker doesn’t.
The anecdote of Nelson’s act of defiance was popularised in later retellings and became associated with the idea of deliberately ignoring unwelcome information or instructions. Nelson’s choice to quite literally turn his blind eye to an order he did not want to follow captured perfectly the notion of wilful ignorance or selective attention. Over time, the phrase entered the broader English language as an idiom, detached from its naval origins. Speakers used it to describe actions or policies where someone in authority chose not to recognise or address a problem.
Historians, always here to spoil the fun, are not 100% certain that the phrase originated with the story of Nelson. Some debate the precise accuracy of the apocryphal tale; there is evidence that similar expressions already existed before the Battle of Copenhagen, and the phrase may have been popularised through literary or journalistic embellishments of naval history rather than by Nelson’s own words and actions. Whatever the truth, the phrase stuck, and for generations it has been taught in history classes and quoted in newspapers, novels and speeches around the English-speaking world. Hurrah for insurrection.
As with many idioms rooted in physical descriptions of the body, “turn a blind eye” uses a physical metaphor to express the complexities of the human psyche; indeed, sight and blindness have long served as powerful symbols of human understanding and perception. To “see” something often stands for awareness or understanding, while to be “blind” to something suggests ignorance, either accidental or wilful. The metaphor is played out to the full in the story of Oedipus Rex, who is metaphorically blind to the truth of his own story, and blinds himself in reality when he discovers it. Teiresias the prophet is physically blind but is the only one who can see the truth as the story unfolds. Shakespeare likewise exploited the theme to equally horrifying effect in King Lear, in which blindness resonates throughout, at times in quite toe-curling ways.
Now, to the modern world. Despite the phrase’s deep history, widespread use and highly effective meaning, it has not been free from criticism in recent years. Some people today argue that “turn a blind eye” may be offensive or insensitive because it invokes blindness — a physical disability — in a potentially negative way. The concern, so far as I can gather, is that by equating blindness with wilful ignorance, the phrase serves to reinforce negative stereotypes about people who are visually impaired. This criticism is, of course, part of a broader trend in which people are told to pay closer attention to the ways language can unintentionally marginalise or demean particular groups of people.
As someone who actually is blind in one eye, I am going out to bat for the phrase (although, being blind in one eye, it is true that my batting can be somewhat haphazard). My blindness on one side (the right, as it happens) has cost me a lot, and I’m not about to let it cost me my language as well. It was a significant factor in my deciding not to drive and has affected my life in numerous ways. I now struggle significantly with eye strain and have to be careful with artificial light and screen time in order to avoid migraines, as my one good eye (not actually that good, truth be told!) is doing all the work. I am terrible at judging depth and distance, so professional tennis playing was out as a potential career; you also don’t want me to pour you a glass of red wine at an angle, trust me on that one.
I chose to tell my classes in school about it, as it was important to make clear to students that if they were waving their hand in the air on my right side I simply wouldn’t see them: I would much rather own up to a physical disability than have children believe that I was ignoring them. Despite this, I know that my reputation as somewhat standoffish also stems from my disability: colleagues, acquaintances and even close friends have often believed that I am deliberately ignoring them because they do not appreciate the limits of my vision. It is the problem with having what the right-on brigade call an “invisible disability” — it is not obvious that I am blind on one side, nor is it apparent that my sight in general is pretty terrible, so as a result nobody makes any allowances for me when it comes to that. The received narrative is that Emma is rude and standoffish. Oh well. Sometimes it’s a useful reputation to have, to be honest.
Anyway, back to the phrase. The controversy around it reflects how social attitudes and awareness change over time. Idioms such as “turn a blind eye” become ingrained in everyday speech, then one day somebody decides to unpick the meaning of the phrase and take offence. But the metaphorical connection between blindness and ignorance has been used for millennia, and is not a comment on those of us who are visually impaired. (Remember Teiresias? He was a blind man credited with insight beyond that of all others, perhaps reflecting the fact that even in the ancient world, people understood that those who are completely blind develop excellent perception beyond physical sight.)
I have been lectured by keyboard warriors on the internet for using the phrase “turn a blind eye” and I shall confess that I have taken great pleasure in telling them that I am — as it happens — blind in one eye. To date, every single one of them has climbed down off their high horse and started self-flagellating, telling me that they are “still learning” and begging for my forgiveness. Dear Lord, how did we get here? I am honestly not sure when we reached the point at which people feel they have to police every word they say. If I had to guess, I’d say that the tipping point was about 1999.
I suspect that those who claim to find the phrase problematic have absolutely zero experience of what it is like to be blind in any sense. Were they in touch with the experience, they would understand why the metaphor works so well. Believe me, if you’re trying to get my attention beyond a certain angle to the right, you can forget it: it’s not going to happen. Even more crucially, were these people properly aware of the purported origins of the phrase, then surely they would also have to acknowledge that it is clearly associated with wilful ignorance and avoidance, not merely physical disability. According to the story, apocryphal or otherwise, Nelson didn’t accidentally hold up the telescope to his blind eye in a state of haplessness or vulnerability: he deliberately used the telescope in this way, in order to disobey an order. That is the point! It is a story about disobedience and coolness under pressure, not about impairment. Somewhat less gloriously, I sometimes lie on my left side to take advantage of my blindness and blot out the world: disabilities have their advantages, you know!
As society ties itself up in knots over what it believes is diversity and inclusion, people have begun to question whether expressions such as “turn a blind eye” carry unexamined assumptions that might be exclusionary or hurtful. I am here to tell you, people: for heaven’s sake, stop panicking and get on with your life. I don’t feel in the least bit excluded by the phrase: it is by a country mile the best, most expressive and most useful way to describe what you’re trying to say. (Are we allowed to say “country mile” any more? Does that imply that people in the country don’t understand measures and distances? I’ll have to check.)
This debate around “turn a blind eye” is just one part of a broader conversation about how language intersects with identity, power and social values. Similar discussions have arisen around other idioms and expressions that draw on physical traits or historical stereotypes. For example, phrases like “lame” to describe something unimpressive or “crazy” to describe something irrational have been questioned for their potential to offend or marginalise groups of people. In each case, speakers and writers are encouraged to consider whether there are better, more inclusive ways to express themselves. Personally, I am beginning to find it all more than a little bit exhausting. Sanitising language to the point where communication becomes awkward or laden with fear of making mistakes is crippling us all (there I go again — sorry). Learning about the historical origins of a phrase can enrich our appreciation of language rather than diminish it, and personally I’d rather enjoy the full richness of English expression than have my language policed by the terminally well-meaning.