Flawed heroes

It is a truth universally acknowledged that the one thing we love more than a hero is to see a hero fall. I’m not sure whether this is an entirely modern phenomenon, but it is perhaps a tendency that has burgeoned in recent decades. More than this, something which I do think is peculiar to our age is the expectation that historic figures should be judged according to 21st-century western values. This, especially when it is pitched against some of the figures who had a significant hand in the process of carving those same values, leaves me distinctly uneasy.

Last week, the BBC reported that Hinchingbrooke School in Huntingdon was swapping the name of one of their pastoral houses from Pepys to Lady Olivia. The process was enacted via a democratic ballot, which turned out to be a classic example of western democracy in action, given that the much-celebrated result was voted for by less than 50% of the electorate. Nevertheless, Lady Olivia, wealthy landowner, school sponsor and evangelical Christian, now finds herself named as the chosen figurehead for modern students in the school that Pepys attended, as did Oliver Cromwell. One can only hope for her that there are no skeletons in her cupboard, to be discovered down the line. There’s always a tweet.

Samuel Pepys seems to have got away with being a prolific sex offender without much modern public disapproval until 2025, when historian and translator De la Bédoyère went back to Pepys’s original manuscripts and translated all of his coded entries, which he wrote in a mixture of Franglish, pidgin Latin and a smattering of Spanglish. De la Bédoyère re-published Pepys’s diaries in all their glory, and the result is the extraordinarily detailed snapshot of 17th-century life that one might expect; unfortunately, that life is one of a man for whom preying upon vulnerable women was something of a daily occurrence. It was certainly an education for me, reading what this serial predator got up to on an average day, and it very much does not chime with 21st-century western values. Historians are keen to point out that Pepys’s behaviour didn’t even chime particularly well with 17th-century western values, as he seems to have had something of a reputation in his day. I only wish I could believe the world has changed, but let’s not pretend that it has. Men with such reputations are still running several countries.

I have been pondering the school’s decision to demote Pepys from his position as a House name and I have no wish to criticise it. The school has already made it clear that there are parts of the school named after Samuel Pepys and that those tributes to him will not change. I have no doubt whatsoever that the school was placed under enormous pressure by a vociferous minority and I don’t even have a particular issue with that in some ways: perhaps those individuals are right. If I had a daughter in the school, perhaps I might have agreed with them that there are better figureheads for her to look up to. Whatever my individual thoughts on the matter, it is inescapable that these days it only takes one parent with a bee in their bonnet and an active WhatsApp group to dictate school policy and this — for better or for worse — is the reality of where schools find themselves today. Headteachers have to pick their battles, and going out to bat for Samuel Pepys was perhaps not a hill the Headteacher felt was worth dying on.

What I think is more interesting is to ponder whether we have lost something when society cannot tolerate undeniably serious flaws in its heroes. Is this a quirk of the kind of modern puritanism that we find ourselves facing today? If we turn to the ancient texts for our model, we find that their authors understood only too well the value of a rounded hero; indeed, the very definition of a hero required the inclusion of multiple flaws. The notion of a “fatal flaw”, popularised by Renaissance readings of Aristotle’s Poetics, influenced Shakespeare and other writers. There is agreement from ancient times to modern that the most interesting heroes are the ones with inherent weaknesses: a perfect hero would be a thoroughly tedious creation.

When Virgil introduces Aeneas as the hero at the beginning of his epic work, he does something quite remarkable. When we first meet Aeneas, he is at his lowest ebb. A battle-fatigued, travel-worn refugee, Aeneas is at breaking point. He screams and cries and implores the gods to take him: why did I not die in Troy? he asks. What was the point of it all? The visceral shock of introducing us to a hero who appears to have abandoned all hope and is wishing he was dead is one of the most exciting decisions that the author could have made, and it thrills me every time I revisit the text (which has been hundreds of times over the last two years, for that section of the text is on the specification for OCR GCSE). The point, I think, is for us to reflect upon how much more impressive it is when Virgil later describes Aeneas suppressing his emotions and resuming command and leadership over his men: someone we have witnessed at breaking point does the right thing for the good of the majority and for the men in his care. Now, that’s a hero.

Not only does Virgil start his epic work with a radical take on heroism, he ends it controversially, by demonstrating that Aeneas is very much less than perfect. At the end of the epic battle that ensures the supremacy of the Trojans in their new homeland, thus securing the future of what will become the Roman empire, Aeneas is faced with his arch enemy, who begs for mercy. The tradition in ancient texts was that good heroes are extraordinary warriors but they do not give in to blood-lust; whenever a warrior is taken over by this kind of crazed, emotionally-charged violence, disaster tends to ensue and the warrior is punished for his misdemeanours. Good warriors show mercy when the time is right. Yet Virgil does not finish his work in this way. As Aeneas looks down upon his enemy, he is overwhelmed by rage, bitterness and grief: he slays him, quickly and ingloriously, and the epic finishes with our hero’s enemy groaning his last, his tortured soul shrinking away to the underworld. It is a radically depressing way to close an epic work of propaganda and reflects a true genius at his peak. The reader (or more likely the listener) is left with an uneasy sense of disappointment in our hero, left to carry the burdensome knowledge that founding an empire is not without its price and that war makes even good men do terrible things.

Perhaps indeed we have lost something to the present-day puritanism that judges historic figures according to our modern western values and — inevitably — finds them wanting. Personally, I don’t have a problem with recognising the contribution that Pepys made through his unflinching account of 17th-century life alongside the fact that the life he describes is one to which I would viscerally object. It’s what history is all about. What I hope for the future is that we can have these discussions in a more mature and nuanced way. There is nothing more irksome than the modern tendency towards cancellation and extremism, the “no debate” lobby, who consistently fail to understand that the very pluralistic society that they believe in so fervently and lobby so hard for requires endless compromise and true tolerance, the kind of forbearance that makes you feel uncomfortable and sometimes forces you to question your own values. I occasionally wonder whether the louder the cancellation crew shout, the more they’re trying to drown out the voices of doubt in their own heads.

Photo by Esteban López on Unsplash

Effective study

Examinations are looming on the horizon. This year’s GCSE candidates will no doubt be receiving revision advice, yet I fear that much of it will be inadequate. While there are some schools that are doing a great job on this, others are still behind the curve when it comes to their knowledge base: teaching is sadly a profession that has been historically prone to fads and unevidenced practice, something I witnessed during my training and throughout my career. In recent years, many individual teachers have gone out of their way to inform themselves about what cognitive science has to say about effective study, and this increasing knowledge and understanding about memory and learning is finally beginning to impact upon the advice that is given to students. This can be seen in the sheer number of teachers who choose to attend ResearchED conferences on a Saturday during their own time, to inform their understanding of good learning techniques. Despite this quiet, grassroots revolution, there is still a remarkable amount of misinformation out there, and I still occasionally reel in mortification at the sorts of things that are said to my tutees when it comes to revision advice.

Much of the problem stems from the very language that is used by teachers, students and parents when it comes to revision. It is hard to know where that language comes from, but much of it seems to be ingrained and on an infinite loop, like a scratched record. Students still frequently say to me that they need to “go over” something, which by its very nature implies revisiting the content to refresh their memory. In practical terms, the advice that a student needs to “go over” something encourages them to reread their notes. A student who is attempting to be proactive about their studies may highlight key information while they read. Yet cognitive science teaches us that reading and highlighting in this way are entirely ineffective practices, for they provide the learner with a feeling of familiarity without genuinely increasing or securing their knowledge-base. Reading and highlighting can feel genuinely productive, to the extent that the student believes that they are actively engaging with an effective learning process; in reality, they are giving themselves false reassurance and not practising the process of retrieval, which is essential both for learning outcomes and for examination practice.

Kate Jones, a teacher and an expert in sharing good practice for effective, evidence-based learning, has this week published a short blog on the Evidence Based Education website, highlighting the importance of what she calls responsive revision. In the blog she does what she does so well, which is to summarise and consolidate what we know from cognitive science into a practical and effective format that is easy for both classroom teachers and students to apply. Responsive revision, according to Kate Jones’s blog, is “a deliberate, structured method of independent study in which students use retrieval to generate evidence about what they know, what they can recall, and where gaps remain. They then respond to that evidence by directing their time and effort towards strengthening those gaps. It shifts revision from passive review to informed action. It also ensures students don’t keep going over their favourite or familiar topics but instead identify and tackle gaps in knowledge and understanding.”

One of the most important things for students to understand is the difference between what feels familiar (the process of recognition) and what is genuine recall (the process of retrieval). When a student rereads their notes or sits and listens to a concept being explained to them again, the material will feel familiar. This gives them the illusion that they can remember something when in fact, under pressure, they will not be able to recall it. The illusion can be so convincing that learners fool themselves in the process: for example, research shows that many students use flashcards wrongly by turning over the card too soon, recognising the answer and then convincing themselves that they knew it all along. The trap is surprisingly easy to fall into. One simple way to guard against it is to work with someone else and to put them in charge of flipping the cards over. Because recognising information is so much easier and more comforting than the process of forcing yourself to recall it independently, students often cling to methods that allow them to experience the process of recognition, like a comfort blanket. They may even insist that the method is working for them, because it feels safe and encouraging and gives them the illusion that their knowledge base is strengthening. In reality, they are doing nothing to aid their recall under pressure.

In her blog, Kate Jones argues that revision should generate evidence, and by that she means evidence of absence as well as evidence of knowledge. Students need to test themselves in order to evidence the knowledge that they possess and to reveal the gaps in that knowledge, keeping themselves in a constant information loop of what they can retrieve successfully and confidently, what they can partially remember, and what they cannot yet call to mind. Armed with that information, the student can then take effective action, a process which she explores in her blog.

If I could convince any learner of one thing that seems counter-intuitive, it would be that they should be testing themselves at every stage of their learning, including at the beginning. Students tend to resist this, for the process is challenging and uncomfortable (especially if they are not used to it in school) and the notion that they should be testing themselves on an area where they are aware that their knowledge base is inadequate can feel rather daunting: perfectionists find it especially difficult to tolerate. Yet testing is essential to learning. When a student attempts to recall a piece of information from memory, they create the evidence base for what they do and do not know. Even more than this, not only does the process of retrieval make their knowledge (or lack of it) visible, it is also part of the learning process. Every time a student attempts to recall something, and every time they manage to do so, they are practising the very skill that they will need to rely on in the examination; they are also strengthening the foundations of that knowledge base.

I cannot recommend Kate Jones’s blog highly enough as a simple, evidence-based explanation of how to go about the process of revision. Her ability to distil complex, research-informed ideas into a practical, workable guide is quite remarkable, and as a result she is a brilliant go-to source of advice for teachers. Her books on retrieval practice should be the benchmark for any classroom teacher. For advice directed at learners, regular readers of my blog will know that I am a huge fan of the psychologist Paul Penn’s advice on how to learn, which can be found both in his book on effective studying and on his YouTube channel.

Photo by Unseen Studio on Unsplash

Lord of the Flies: a very adult novel

Suddenly, everyone is talking about Lord of the Flies. It is one of my favourite novels, one which I taught for GCSE English literature for around a decade. I’m afraid that I have no urge to see what the BBC have done with it. I have also been somewhat irritated to see multiple hot takes on social media, criticising the story’s doom-laden attitude towards childhood and children’s psychology.

First of all, Golding was emphatically not being doom-laden about the nature of children; he was being doom-laden about the nature of humanity as a whole: let us not underestimate the extent of his doom-mongering, please. Secondly, Lord of the Flies is no more a novel about children and childhood than Animal Farm is a novel about livestock and animal husbandry. Like Animal Farm, Lord of the Flies is an extended allegory, and its message is a profoundly depressing one. So, buckle up.

Golding’s work of genius (one which he, incidentally, dismissed in later life as “boring and crude”) is a thoroughly disturbing exploration of what happens when the structures of civilisation fall away. It is emphatically not a novel about children. While the novel contains the trappings of childhood (children’s games, their fears, their rivalries and their capacity for cruelty), it becomes clear as the narrative unfolds that Golding’s central concern extends way beyond childhood psychology. The island on which the children find themselves stranded is a microcosm of the world that the boys have left behind, a specimen society in which rival authorities, social hierarchies, violence and superstitious ideology rapidly emerge. Golding uses children in order to examine society stripped to its essentials, suggesting that what we call “civilisation” is a fundamentally fragile construct laid over a persistent human capacity for savagery. The novel is less an anthropological study of childhood than a parable about the nature of society itself.

From the outset of the novel, in which the boys find themselves stranded in the wilderness, the protagonists attempt to recreate the structures of the adult world from which they have come. They call assemblies, establish rules and elect a leader. Ralph’s authority rests on apparent legitimacy: he is chosen through a vote, and a conch shell is used as a tangible sign of democratic order. The conch regulates speech, embodies fairness and stands as a shared agreement among the boys to abide by rules. These early chapters might seem to suggest that humans, left to their own devices, instinctively lean towards mature governance; yet Golding makes it clear that the boys’ desire for adherence to a set of rules depends not on moral conviction but on a fear of consequences and an individual lust for dominance, for the boys speak immediately of the punishments that will face anyone who transgresses the rules they plan to lay down for themselves. Furthermore, as the hope of rescue fades, the rules lose all of their potency. As Ralph puts it, “things are breaking up. I don’t understand why.” The deterioration is not portrayed as uniquely childish; rather, it reflects how flimsy and insubstantial social contracts are when the institutions that sustain them collapse.

Jack’s transformation from choir leader to autocratic demagogue underscores this shift. His authority on the island grows not through reasoned persuasion but through his manipulation of fear and the promise of hunting and meat. He paints his face, embraces ritual and forms a tribe built on spectacle and intimidation. In doing so, he does not regress into childhood so much as adopt the tactics of a charismatic despot.

It is hinted from the outset that the boys have arrived from a society already engaged in a global conflict. The island society quickly begins to resemble the violent regimes and wartime mentalities of the adult world and the children’s play-acting of war quickly becomes indistinguishable from the very worst forms of human brutality. The murder of Simon is not an impulsive scuffle between children; it is a collective frenzy, a ritualised killing fuelled by hysteria and conformity. In that pivotal moment, Golding depicts the terrifying ease with which ordinary individuals can participate in atrocities when swept up by mass hysteria and mindless ideology. This is emphatically not a comment on the nature of children: it is a study in group dynamics and the power of suggestion.

Simon’s role in the novel, prior to his death, further supports the interpretation that Golding is examining society and group dynamics. His encounter with the pig’s head, the eponymous “Lord of the Flies,” reveals the central moral insight of the book: “the beast” that the boys fear is not an external creature but something within themselves. The pig’s head, swarming with flies, seems to speak to Simon, telling him that it (the beast) is part of them, is inside them: it is not an external force, rather it is innate to humanity. Golding aims to convince his readers that the impulse toward violence and domination is an inherent aspect of human nature, one that civilised society attempts, imperfectly, to restrain. Simon’s death, at the hands of boys who mistake him for “the beast” crawling out of the forest, symbolises the destruction of moral truth by collective fear and aggression. The tragedy lies not in the fact that the children are capable of evil, but in the implication that all humans are, given the wrong circumstances.

Piggy represents rationality, scientific thought and the values of ordered civilisation. His glasses, which enable the boys to make fire, symbolise the power of technology and reason. Yet reason alone cannot withstand the tide of savagery once the social consensus collapses. Piggy is marginalised, mocked and finally killed when Roger deliberately dislodges the boulder that crushes him. This final act by Roger is particularly significant: earlier in the novel, he is depicted as throwing stones at the younger boys but he deliberately misses; the implication is that he is an inherently violent boy who is restrained in his urges by what Golding calls “the taboo of the old life.” As those restrictions erode with the breakdown of society, so too does his individual restraint. By the time he kills Piggy, Roger acts with deliberate intent. Golding’s emphasis on the gradual disappearance of internalised moderation points to his theme of the importance of societal structures in shaping and curbing antisocial behaviour. When those structures weaken, he believes, our latent cruelty surfaces.

Golding’s novel is emphatically not about childhood. The boys bring with them the hierarchies, prejudices and fears of their culture. The choirboys, accustomed to discipline and exclusion, quickly form an elite group under Jack. The “littluns” (as the youngest members of the group are collectively referred to) are marginalised and terrorised by the older boys and even Ralph, ostensibly the champion of order, participates in the violence against Simon. No character is exempt from moral compromise and this universality suggests that Golding is less interested in developmental psychology than in the broader human condition: his view of us is emphatically not a happy one.

The sudden arrival of the naval officer at the end of the novel crystallises the evidence that the island society is a mirror that Golding is holding up to the adult world. The officer is initially amused by the boys’ appearance, viewing their behaviour as a childish game. Yet he represents a world engaged in destructive warfare: his warship waits offshore, a reminder that organised violence is not confined to the island but is institutionalised in the adult society that lies beyond it. The boys’ painted faces and sharpened sticks are grotesque reflections of his uniform and the weapons he brings. The officer’s presence does not negate the horror that has occurred; rather, it frames it within a wider context. The island is not an aberration but a microcosm: Golding implies that the same forces driving the boys to chaos are operating on a global scale.

Published in 1954, in the aftermath of the Second World War and at the dawn of the nuclear age, Lord of the Flies reflects a period of unprecedented recent human destruction. The belief in steady moral and social progress had been shattered by the exposure of the Holocaust and the growing fear of atomic warfare. Golding, who had served in the Royal Navy, stated that he had witnessed firsthand man’s capacity for organised brutality, and that illustrating this was his purpose in writing the novel. His choice to use schoolboys as protagonists was an artistic decision: by stripping away adult institutions and placing children in isolation, Golding constructs a controlled experiment in which the island mirrors the essential dynamics of society in a concentrated form. The boys’ age, if anything, underscores the horrifying argument that the seeds of societal violence lie not in complex political systems alone but in the fundamental aspects of human nature. While the “beast” that the children fear can be seen as a childish nightmare, Golding does not treat their fears as trivial. “The beast” evolves into a powerful symbol of how societies create external enemies to embody internal anxieties and explain the darkness within them. The boys’ belief in the beast apparently justifies Jack’s desire for authoritarian rule and explains the abandonment of rational deliberation. In this way, childish superstition becomes analogous to the propaganda and scapegoating we find in adult societies.

It is undeniable that the novel challenged the mid-twentieth-century literary tradition, which portrayed children as naturally innocent and if anything morally superior to adults. In traditional adventure stories, still popular at the time, stranded boys tend to maintain British civility and cooperation. Golding deliberately inverts this literary convention. His boys do not build a utopia; they descend into barbarism. This inversion, however, is not a comment on children but a critique of the complacent belief that civilisation is secure and that moral behaviour is natural and instinctive. By showing that even well-educated English schoolboys can commit atrocities, Golding aimed to dismantle the myth of inherent cultural or moral superiority. Ralph’s uncontrolled grief at the end of the novel is portrayed as a source of embarrassment to the naval officer. He weeps “for the end of innocence” and “the darkness of man’s heart,” a final summation of Golding’s bleak vision.

To read Lord of the Flies as a novel about the nature of children is to overlook its broader philosophical ambitions. Golding did not believe or aim to suggest that children are uniquely savage or that society alone corrupts them. Instead, he proposes that society is both a product of and a defence against the darker aspects of human nature. Civilisation provides structures — laws, social norms and institutions — that channel natural instincts such as aggression and desire into appropriate avenues. When those structures disintegrate, as they do on the island, the underlying impulses are revealed. The boys are not aberrations; they are average human beings.

Golding’s frankly brilliant work interrogates the very foundations upon which social order rests, yet it achieves this by focusing on children, whose assumed innocence sharpens the shock of moral collapse. Golding invites readers to question their comforting assumptions about progress, about culture and the nature of morality. The savagery on the island is not confined to childhood; it is an ever-present possibility within human communities. By the time the naval officer arrives, the reader understands that rescue from the island does not equate to rescue from the darkness within. Golding’s enduring message is that society’s stability depends upon our constant vigilance against forces that originate in the human heart. How’s that for a bedtime story?

Photo by Joris Voeten on Unsplash

Surprise, surprise?

No matter how long I have been working with young people, they never fail to surprise me. By the same token, no matter how long I have been teaching, I am still learning and adjusting my methods and assumptions. This is one of the many things that makes the process so rewarding and exciting.

There are a couple of students who have been working with me for a considerable period of time. Perhaps unsurprisingly, they have both made outstanding progress. This is not to blow my own trumpet; it is simply to highlight the power of one-to-one tutoring and the genuinely spectacular impact that it has when utilised over the long term. There is a lot that one can do with a short-term emergency intervention, and I have indeed worked with students to boost their grade shortly before the examinations, but in such a situation there is only so far you can go. When a parent employs you well in advance of the examinations, it undoubtedly gives their child the best possible chance not only of a better grade but also of an improved understanding of the subject they are struggling in. This is the kind of work that is the most rewarding.

The two students I have in mind were both finding the subject very difficult, but both were highly ambitious and high achievers in other subjects. They are now both working at a Grade 9 level and I have high hopes for their performance in the final examinations, all being well. Yet both of them still have their moments that surprise me: for example, they will both make significant blunders in a very simple grammar question, revealing what seems to be a fissure in knowledge that I thought was solid. These sorts of students are genuinely fascinating and benefit from tutoring the most, for in a one-to-one session you can pivot and adjust what you’re doing to isolate that unexpected sign of trouble and work on it.

Likewise, there are students who appear to have no solid knowledge base and approach the grammar as if it were an optional extra. These students can also surprise you, for sometimes they will smash a translation out of the park, leaving you open-mouthed and wondering what the hell just happened. The issue with such students is, of course, that you never know what’s going to happen on the day of the exam: this is true for all students, but it is especially true for them. They will oscillate between sheer brilliance and unmitigated disaster and you never quite know which version of events you will be presented with.

As for my own learning, I am still discovering what students do and don’t know and the sands are ever-shifting. Part of teaching and particularly tutoring is endless challenge to your own theory of mind: endless reminders that other people’s human brains, especially ones that have not been on this earth so long as your own brain, are not filled with the same knowledge, thoughts and ideas as yours. Teaching in secondary schools is particularly challenging from this point of view, as you rotate between classes of various ages: one hour you can be teaching a room full of 11-year-olds, the next you will be faced with a small group of near-adults. The frequent adjustments that secondary school teachers have to make during the day in terms of knowledge, expectations and vocabulary usage can be quite dizzying.

I have been pondering in particular this week the question of how much each of my individual students understands about sailing. This might seem bizarre, but the section of the Aeneid that most of them have been set for study this year involves a storm that wrecks the ships carrying the hopeful Trojan refugees from their war-torn city to a new homeland. One of my students spends half the year at the family’s second home in Cornwall, sailing with her twin brother. As a result, she knows infinitely more about sailing than I ever will and thus, when Virgil describes “the groaning of the rigging” (stridor rudentum) and uses phrases like “the prow swings off” (prora avertit), she knows exactly what is going on. For most of my students, this has to be explained: they don’t know that “rigging” refers to the system of ropes employed to support a ship’s mast and to control the sails, nor do they know that the prow is the front of the ship. My sailing student has a good grasp of Virgil’s more poetic descriptions of the power of the sea, for she has experience of it: she knows what it means when Virgil describes how the winds seem to lift the waves to the stars (fluctus ad sidera tollit) and how the sea momentarily appears to be like a sheer mountain in front of the sailors (praeruptus aquae mons). Hopefully she has never been in a ship when this was happening, but she will understand the concept well enough, and will have watched the sea and understood when is and is not a good time to sail. Most of my students have none of this knowledge.

Given the obvious fact that most of my students are not sailors, this week it occurred to me that I needed to unpack what was going on in the Virgil text in much more detail for them, in case they were struggling to comprehend what was happening. Most kids (and indeed most adults) have never experienced what sailing is like, so will have limited capacity to imagine the extent of the damage and disaster that is being described. I suddenly realised that it was important to remind them that just moments before, the Trojans had been described as joyfully turning their sails for the open sea (in altum vela dabant laeti) and heading for the mainland, their new home of Italy within their sights. Crucially, they were in full sail when Aeolus, god of the winds, released the squalls and tempests across the ocean. None of my students had considered this fact until I pointed it out to them. They were then able to comprehend, even with the most rudimentary grasp of forces, that being in full sail when a storm strikes is game over for a ship and its crew. This is why the storm is such a disaster for the men on board.

One of the things that every teacher and every tutor has to remind themselves of is to constantly test knowledge and understanding, and this goes for every assumption that you might be making about vocabulary. It is crucial to consider the fact that the student(s) in front of you may not know the meaning of the words that you are using or they are reading. The word “rigging” was a good example for me — not one of my students, with the exception of the girl who sails, knew what the word meant. I had a similar reminder with the other verse text selections for 2026, in which one of the Catullus poems refers to his “purse”. I was brought up short by the fact that several of my students did not know what a purse was: in this modern day of digital money, and given that we are so flooded with Americanisms that many people now refer to a “purse” as a “wallet”, it is in fact not surprising at all that they did not know the word.

Vocabulary is an important foundation for learning and unfamiliar terminology can quickly become a barrier to understanding key concepts. When students hear and repeat terms without a solid grasp of their meaning, they may appear confident whilst holding misconceptions that affect their progress. Only by explicitly teaching vocabulary, checking for understanding and exploring students’ understanding of words without making assumptions can we ensure that the learners in front of us can access the curriculum and build deeper, more secure knowledge.

Photo by Sebastian Bill on Unsplash

Another brick in the wall

This week, I upset a few people. That’s nothing new, for it is undeniable that I am the sort of person who sometimes opens her mouth merely to change feet. Often, this has landed me in trouble, especially when working for managers who like their staff nice and compliant; sometimes, it has earned me some respect, when I was fortunate enough to work for robust managers, those who are confident enough to respond well to challenge, even when that challenge could — in all honesty — have been better or more politely worded. When I think about some of the things I’ve said to and about management over the years, I consider myself jolly lucky to have been in a unionised workplace. Yet, in the school where I spent the second half of my teaching career, I am also grateful to have worked with managers who would listen, take note and respond thoughtfully when I said my piece, however clumsily: it demonstrates a confidence and an emotional resilience that is not to be underestimated.

These days, of course, I work for myself, so I have to go to social media to find people to upset. I can’t recall whether or not I have mentioned this on my blog before, but I have recently removed myself entirely from the platform formerly known as Twitter. It’s been something of a wrench, having been on there more or less since its inception, but needs must and it is true to say that the platform is not what it used to be. As a consequence, I have begun to spend a little bit more time on LinkedIn, which also seems to have changed, in my opinion for the better: it no longer seems to be solely dominated by corporate types humble-bragging about their mid-range sports car.

I’ve never been one for leaving platforms solely because of who owns them. Let’s face it, compared to my little world, every tech giant billionaire is probably, in relative terms, a pretty awful person. But when the owner of a platform has already proved their amorality in how they treat their staff and their customers, then goes on to double down in defending people’s “right” to manipulate, share and disseminate exploitative images of women and children, claiming that it is a “free speech” issue (something I care about passionately and do not appreciate being used as a smokescreen for abuse and exploitation), then that’s way over the line for me. So, farewell Elon, you moral cipher of a man: you won’t be getting my eyes on the advertisements that fund you any more. And hello, LinkedIn: let’s see what you have to offer. I have been pleased to find that there is an increasing amount of educational discussion on LinkedIn, and many of the brilliant go-to teacher-voices that I originally found on Twitter in its heyday are now actively posting on there. Furthermore, there is also plenty of talk about other relevant issues that interest me, some of them much more challenging than anything one would have found on there a few years ago, when LinkedIn was dedicated solely to corporate bragging and self-promotion.

The reality of being more active on such a platform seems inevitably (for me at least) to result in some low-level beef. Given that it is ultimately a business platform and thus a place where people showcase themselves and what they are bringing to market, it is inevitable that LinkedIn will include multiple voices who are crafting their image as someone who offers something to the education space that is not traditional classroom teaching (for which, given the well-documented recruitment and retention crisis, one generally does not have to advertise oneself). Such people include me these days and indeed I think and write a lot about what one-to-one tutoring enables me to do that was not possible in the mainstream classroom. The way I work now is truly liberating and I am grateful for it. What puzzles and concerns me, however, is the fact that so many people who are outside of the traditional classroom space seem remarkably keen on bashing the traditional system, and it was my objection to this that got me into trouble. I was assured that it is the system they are bashing, not the classroom teachers within it, and some people seemed to find it very insulting that I should think otherwise. But what they don’t seem to understand is that it can be pretty difficult to tell the difference. In bashing the system, they are actively contributing to the increasingly dismal situation in which classroom teachers find themselves. It is truly wretched to be a part of a system that is being relentlessly criticised on all sides, and this fact is undoubtedly contributing to the mass exodus of teachers from the profession. Harry Hudson has written very eloquently about this in his book, Must Do Better: how to improve the image of teaching.

For the avoidance of doubt, and in case anyone needs to hear this, it’s really tough out there in the modern classroom. I think more of us need to be saying this out loud. I am probably guilty of not being frank enough about it, so here is me saying that after 21 years at the chalkface, I’d had enough of being treated with contempt. In my final year, when I confessed to my husband that I wanted to resign from my job, I tried to explain to him what working in a modern school can feel like: I said, “you know that feeling when you’re walking down a towpath and you see a bunch of scary-looking lads hanging about that you have to walk through and your brain goes into high alert, wondering whether they’re going to shout something or surround you or just generally make you feel uncomfortable?” He nodded. Everyone knows that feeling. “Well,” I said, “it’s like that but all the time. Plus, those lads are your responsibility, and how you handle the situation on the towpath is at worst going to be called into question by your boss, at best will massively add to your already-horrendous workload if you decide to follow it up.”

There are very few jobs in which one can feel personally belittled and intimidated on a daily basis: teaching is one of them. Add to that the fact that in teaching, you are frequently asked what you could have done better or more empathetically in order to have avoided creating the situation in which you felt belittled and intimidated: I am genuinely not sure that this happens in many other spheres. Most places I go to, I see a sign up telling me that rude or threatening behaviour will not be tolerated. There’s one in our local vet’s, one at the GP’s surgery and I saw one in A&E when I had a surprisingly zestful response to some antibiotics a few weeks ago. Fantastic. I’m all for the signs and for the message that they convey. But schools don’t have those signs. Teachers just have to suck it up, apparently. Rude and contemptuous behaviour towards teaching staff has increasingly become par for the course in modern schools, and our teachers and TAs are expected to let it bounce off them like water off the proverbial duck’s back. We’re the adults in the room, we’re told: that may be so, but a notable number of the students didn’t get the memo.

One of the reasons I decided to move on from classroom teaching was not simply the unpleasant situations in which I increasingly found myself: it was the fact that I could feel my attitude towards young people starting to shift, and I didn’t want that to happen. I am glad to say that I hugely enjoy the time that I spend with the young people I now work with, but before I left the classroom I feared that my whole perspective on teenagers would be damaged forever, were I to spend much more time within a system that nobody is willing to support any more and everybody seems to think is part of the problem. See, this is the issue: many people — an alarming number of whom are calling themselves “educators” — seem somehow to have talked themselves into believing that the traditional education system is a net negative, that schools fail to prepare young people for “the modern world” (whatever that is: people have been talking about it since at least 1975), that the imparting of skills and knowledge in the conventional manner is deeply inadequate and should be condemned to history. We don’t need no education, we don’t need no thought control.

When belittlement is your daily reality, it can be pretty galling to scroll through social media and find yourself on a loop telling you how our Gradgrindian school system is failing young people, how every child exhibiting low-level defiance is simply dysregulated and misunderstood, how every uniform rule is an imposition on their individuality and an insult to their personal liberty, how every teacher who attempts to lay down some basic ground-rules is just another brick in the wall imprisoning them and preventing them from blossoming.

If we are to provide an education that is free to all at the point of delivery (and I cling to the belief that this principle is non-negotiable), then traditional classroom teaching is here to stay. The alternative providers don’t want to hear it, but that’s the bottom line. And until we start believing that most of the youngsters in our care are able to rise to our expectations, that the overwhelming majority of those young people are in fact infinitely capable of being both polite and attentive, if only such basic expectations were asked of them, then I fear we are set upon a path that will not end happily for any of those young people. To be clear, letting a student off is letting them down. When empathy with a student who is struggling to behave leads us down the path of least resistance, that is not kindness: far from it. It is sending them the message that we don’t care, that we don’t believe that they are capable of meeting the most basic of standards that we set for ourselves and for the rest of humanity. When we excuse challenging behaviour because of an individual’s difficult circumstances, we have to ask ourselves what we’re really communicating to that student about their potential. Just think about it: because once you see it that way, you can’t unsee it. I don’t know who coined the phrase, but it couldn’t be summed up more perfectly than this: the soft bigotry of low expectations. By adjusting our most basic standards, we make it clear to a certain kind of student that we’re writing them off as incapable of basic manners. Nothing — truly nothing — could be more inequitable or more damning for that child and their future.

This wonderful photo was taken by Maria Teneva on Unsplash

Cambridge hangovers

The Cambridge Latin Course: love it or hate it, you can’t ignore it. Long-term readers of my blog and listeners of my podcast will be aware that I have been quite critical of the CLC in the past, despite the fact that it formed the backdrop to my classroom teaching for most of my career. While I continued to use the stories (albeit adjusted) and the characters from the course, I moved further and further away from its approach to grammar during my time at the chalkface and rejected its underlying principles (show, don’t tell) pretty early on. Towards the end I had completely re-written the curriculum and had stopped using the textbooks altogether.

Now, as a full-time tutor, I am increasingly aware of the legacy that the CLC has left Latin teaching and I am genuinely curious to know how long this legacy will last. Whilst many schools have ostensibly stopped using the CLC, its influence on teachers’ approach remains apparent in ways that many of them are perhaps not even aware of. In this blog post I hope to reveal some of the habitual oversights that classroom teachers of Latin are making as a result of what I believe is a hangover from the CLC curriculum.

One key blind spot for classroom teachers aiming to prepare their students for the OCR examination is a failure to teach the verb malo at the same time as they teach volo and nolo. I cannot explain this other than as a legacy of the fact that malo is not taught in the CLC at the point when volo and nolo are taught. Taylor & Cullen introduce malo at the same time (in chapter 7 of their textbook), but the overwhelming majority of students that I teach are reasonably well-drilled on volo and nolo but have never been taught the verb malo. Students following the WJEC/Eduqas syllabus do not need to know malo, but those aiming at the OCR examination need to know it, so to miss this tricky verb out of one’s teaching is a major oversight. I believe that this is purely and simply because schools are following curricula that were originally built around the CLC, which makes a big deal out of volo and nolo in Book 2, but never mentions malo.

Another legacy from the CLC which I have written about before is the decision to teach the purpose clause before the indirect command. It was many years ago now that it suddenly hit me what a massive mistake this was. I asked myself why students were so wedded to the habit of translating ut as “in order to” whenever they see it, and realised that it is because this is how they first meet it; after that, they can’t let it go. I have yet to meet a single student who has been taught the indirect command prior to the purpose clause unless they have been taught by me, and this is genuinely fascinating. Every single Latin teacher seems to assume that it is a good idea to teach the purpose clause first, and I believe that the all-pervasive influence of the Cambridge Latin Course is partly to blame. Even Taylor & Cullen do this in Latin to GCSE: despite mixing up the approach taken by the CLC (they teach ut clauses first, leaving cum clauses and the indirect question until later), they still take the decision to teach purpose clauses first. In my experience, this is a massive error, and leaves students convinced that ut always means “in order to” when in fact it only means this when it is used in a purpose clause.

My final grammar-based concern when it comes to school curricula being based around the legacy of the CLC is that teachers are still teaching the perfect active participle as if it were a broad grammar feature. This is done in the CLC, which for some extraordinary reason introduces PAPs towards the beginning of Book 3, long before deponent verbs are even mentioned in Book 4. Students really struggle as a result, since they form the understandable belief that the perfect active participle is a grammar feature that is common to all verbs. They thus struggle with the concept that most verbs have a perfect passive participle, because they have not been taught that perfect active participles only exist because of deponent verbs. I have to spend a great deal of time unpicking students’ misapprehensions and misconceptions about this, teaching them in detail about deponent verbs and their features and then mapping this onto their participles. It takes so much time to dispel these misunderstandings, which would never be there in the first place were schools to adjust the curriculum to introduce the perfect active participle solely as a feature of deponent verbs.

It is genuinely fascinating to observe the fallout from textbook use and to be able to identify where students’ misconceptions are coming from as a direct result of the curriculum that many schools are adhering to. I do find it worrying that so few schools are asking themselves why they are using textbooks that are not built around the examination that their students are aiming at, not least because the vocabulary in those textbooks is quite often a monumental waste of time. While the 5th edition of the CLC goes some way towards addressing this, it doesn’t solve the problem entirely, and too much of its old structure and principles remains for the problem to be solved in its entirety.

Photo by Ivan Aleksic on Unsplash

Turn a blind eye

“Turn a blind eye” is one of those expressions that slips easily into everyday speech, a shorthand way of describing the act of deliberately ignoring something. We might say a teacher turned a blind eye to students whispering in class (never a good idea, by the way), or that a government turned a blind eye to corruption (even worse). Many people use the phrase without a second thought about its origins, but like many idioms, it comes with a story. In recent years, some people have questioned the phrase, arguing that it may be offensive or insensitive. Well, speaking as someone who actually is blind in one eye, I am here to defend it: so, brace yourselves.

The most commonly cited origin story for “turn a blind eye” dates back to the Napoleonic Wars and everyone’s favourite British naval hero, Admiral Horatio Nelson. Nelson had lost the sight in one eye earlier in his naval career, when a shot struck a sandbag and the flying debris hit his face, causing severe damage to his retina. He is often portrayed as wearing an eye patch, but there appears to be no evidence that he did so: historical accounts seem to indicate that his eye remained intact; he simply couldn’t see out of it any more.

During the Battle of Copenhagen in 1801, Vice-Admiral Sir Hyde Parker, the commander-in-chief of the British fleet, ordered the signal for Nelson to cease fighting and withdraw. Signals were transmitted from ship to ship via the medium of flags, so the order was necessarily a visual one. Nelson was alerted to the signal to disengage, but was eager to press ahead with the attack. According to the story, he raised his telescope to his blind eye and claimed to see no signal. Having feigned ignorance of the order, he continued the battle and secured a crucial tactical victory. The rest, as they say, is history, and presumably explains why Nelson still has his statue on the top of a column in central London and Hyde Parker doesn’t.

The anecdote of Nelson’s act of defiance was popularised in later retellings and became associated with the idea of deliberately ignoring unwelcome information or instructions. Nelson’s choice to quite literally turn his blind eye to an order he did not want to follow captured perfectly the notion of wilful ignorance or selective attention. Over time, the phrase entered the broader English language as an idiom, detached from its naval origins. Speakers used it to describe actions or policies where someone in authority chose not to recognise or address a problem.

Historians, always here to spoil the fun, are not 100% certain that the phrase originated with the story of Nelson: some debate the precise accuracy of the apocryphal story and there is evidence that similar expressions already existed before the Battle of Copenhagen and that the phrase may have been popularised through literary or journalistic embellishments of naval history rather than by Nelson’s own words and actions. Whatever the truth, the phrase stuck, and for generations it has been taught in history classes and quoted in newspapers, novels and speeches around the English-speaking world. Hurrah for insurrection.

As with many idioms rooted in physical descriptions of the body, “turn a blind eye” uses a physical metaphor to express the complexities of the human psyche; indeed, sight and blindness have long served as powerful symbols of human understanding and perception. To “see” something often stands for awareness or understanding, while to be “blind” to something suggests ignorance, either accidental or wilful. The metaphor is played out to the full in the story of Oedipus Rex, who is metaphorically blind to the truth of his own story, and blinds himself in reality when he discovers it. Teiresias the prophet is physically blind but is the only one who can see the truth as the story unfolds. Shakespeare likewise exploited the theme to equal horror in King Lear, where blindness resonates throughout the play, at times to quite toe-curling effect.

Now, to the modern world. Despite the phrase’s deep history, widespread use and highly effective meaning, it has not been free from criticism in recent years. Some people today argue that “turn a blind eye” may be offensive or insensitive because it invokes blindness — a physical disability — in a potentially negative way. The concern, so far as I can gather, is that by equating blindness with wilful ignorance, the phrase serves to reinforce negative stereotypes about people who are visually impaired. This criticism is, of course, part of a broader trend in which people are told to pay closer attention to the ways language can unintentionally marginalise or demean particular groups of people.

As someone who actually is blind in one eye, I am going out to bat for the phrase (although, being blind in one eye, it is true that my batting can be somewhat haphazard). My blindness on one side (the right, as it happens) has cost me a lot, and I’m not about to let it cost me my language as well. It was a significant factor in my deciding not to drive and has affected my life in numerous ways. I now struggle significantly with eye strain and have to be careful with artificial light and screen time in order to avoid migraines, as my one good eye (not actually that good, as it happens!) is doing all the work. I am terrible at judging depth and distance, so professional tennis playing was out as a potential career; you also don’t want me to pour you a glass of red wine at an angle, trust me on that one.

I chose to tell my classes in school about it, as it was important to make clear to students that if they were waving their hand in the air on my right side I simply wouldn’t see them: I would much rather own up to a physical disability than have children believe that I was ignoring them. Despite this, I know that my reputation as somewhat standoffish also stems from my disability: colleagues, acquaintances and even close friends have often believed that I am deliberately ignoring them because they do not appreciate the limits of my vision. It is the problem with having what the right-on brigade call an “invisible disability” — it is not obvious that I am blind on one side, nor is it apparent that my sight in general is pretty terrible, so as a result nobody makes any allowances for me when it comes to that. The received narrative is that Emma is rude and standoffish. Oh well. Sometimes it’s a useful reputation to have, to be honest.

Anyway, back to the phrase. The controversy around it reflects how social attitudes and awareness change over time. Idioms such as “turn a blind eye” become ingrained in everyday speech, then one day somebody decides to unpick the meaning of the phrase and take offence. But the metaphorical connection between blindness and ignorance has been used for millennia, and is not a comment on those of us who are visually impaired. (Remember Teiresias? He was a blind man credited with insight beyond that of all others, perhaps reflecting the fact that even in the ancient world, people understood that those who are completely blind develop excellent perception beyond physical sight).

I have been lectured by keyboard warriors on the internet for using the phrase “turn a blind eye” and I shall confess that I have taken great pleasure in telling them that I am — as it happens — blind in one eye. To date, every single one of them has climbed down off their high horse and started self-flagellating, telling me that they are “still learning” and begging for my forgiveness. Dear Lord, how did we get here? I am honestly not sure when the tipping point was, when people started to feel that they have to police every word they say. If I had to guess, I’d say that the turning point was about 1999.

I suspect that those who claim to find the phrase problematic have absolutely zero experience of what it is like to be blind in any sense. Were they in touch with the experience, they would understand why the metaphor works so well. Believe me, if you’re trying to get my attention beyond a certain angle to the right, you can forget it: it’s not going to happen. Even more crucially, were these people properly aware of the purported origins of the phrase, then surely they would also have to acknowledge that it is clearly associated with wilful ignorance and avoidance, not merely physical disability. According to the story, apocryphal or otherwise, Nelson didn’t accidentally hold up the telescope to his blind eye in a state of haplessness or vulnerability: he deliberately used the telescope in this way, in order to disobey an order. That is the point! It is a story about disobedience and coolness under pressure, not about impairment. Somewhat less gloriously, I sometimes lie on my left side to take advantage of my blindness and blot out the world: disabilities have their advantages, you know!

As society ties itself up in knots over what it believes is diversity and inclusion, people have begun to question whether expressions such as “turn a blind eye” carry unexamined assumptions that might be exclusionary or hurtful. I am here to tell you, people: for heaven’s sake, stop panicking and get on with your life. I don’t feel in the least bit excluded by the phrase; it is by a country mile the best, most expressive and most useful way to describe what you’re trying to say. (Are we allowed to say country mile any more? Does that imply that people in the country don’t understand measures and distances? I’ll have to check.)

This debate around “turn a blind eye” is just one part of a broader conversation about how language intersects with identity, power and social values. Similar discussions have arisen around other idioms and expressions that draw on physical traits or historical stereotypes. For example, phrases like “lame” to describe something unimpressive or “crazy” to describe something irrational have been questioned for their potential to offend or marginalise groups of people. In each case, speakers and writers are encouraged to consider whether there are better, more inclusive ways to express themselves. Personally, I am beginning to find it all more than a little bit exhausting. Sanitising language to the point where communication becomes awkward or laden with fear of making mistakes is crippling us all (there I go again — sorry). Learning about the historical origins of a phrase can enrich our appreciation of language rather than diminish it, and personally I’d rather enjoy the full richness of English expression than have my language policed by the terminally well-meaning.

Photo by Martti Salmi on Unsplash

Life in plastic, it’s fantastic

Last week, toy giant Mattel launched its first “autistic Barbie”. Coming hot on the heels of its first-ever doll with type 1 diabetes, sporting her own insulin pump and glucose monitor, this latest addition to Barbie’s range marks another milestone in Mattel’s purported goal to ensure that more children “see themselves in Barbie.”

While many have celebrated a neurodivergent Barbie as an important step toward inclusion and visibility for children with autism, others have raised concerns about representation and stereotypes. Supporters argue that the doll’s design — including features like noise-cancelling headphones, a tablet with communication apps and sensory-friendly clothing — will ensure that autistic children see themselves reflected in a mainstream toy. Such representations, they suggest, could normalise the support tools that many children use in daily life, which can be empowering and affirming for them. As one poster on LinkedIn put it last week: “We have all had an opinion on the new autism Barbie. Today, I chose to leave that to the person who actually matters. I bought the Barbie for my daughter. Her reaction was immediate and joyful. ‘Awesome.’ She picked it up and said, ‘Look Mum, it has the talking board you got at the parks and what my brother used.’ Then, ‘The ear defenders are like mine. We can wear them together.’ … Representation does not need to be flawless to be powerful. It just needs to be seen, felt and recognised by the people it is for.”

I have also read that autistic Barbie has been blessed with articulated joints “to allow for stimming gestures”. Now, if we’re going to talk about representing humans, autistic or otherwise, I would have thought that all versions of Barbie would benefit from articulated joints. As I recall, Barbie’s extraordinary lack of flexibility was my main issue with her back in the 1980s, when I was playing with dolls. Barbie’s fixed limbs meant that she effectively couldn’t ride her horse, only balance above it like a plastic A-frame, giving the impression that she was wing-walking rather than riding her steed. In my 10-year-old world, in which I lived and breathed all things horse-related, this was a massive let-down.

Critics of the all-new neurodivergent Barbie have pointed out that autism is an invisible, highly diverse spectrum that cannot be captured by one set of external traits or accessories. While this is arguably an issue for all representation, some people worry that relying on visible markers to represent women with ASD will reinforce simplistic or stereotypical ideas about what autism “looks like.” The debate about the new Barbie doll is, of course, part of a wider conversation about corporate “diversity” initiatives and the commercialisation of identity, with some seeing the doll as meaningful representation and others questioning whether it reduces a complex human experience to design features.

This is not a new debate, merely the current iteration of a discussion that has been evolving since Barbie’s inception over 60 years ago. When I was a child, more than forty years ago, feminists were raging about Barbie. My mother, a reasonably committed feminist herself, was nevertheless comfortable with me having a Barbie. Indeed, I had the Barbie horse (which actually did have articulated limbs, unlike its owner, but was a ridiculously stylised fantasy creature) and I also had the Barbie car, which was frankly hideous. Personally, I found the Sindy products more appealing: the horses were more realistic (of paramount importance) and her car was a sensible beach buggy, which seemed infinitely more usable when compared to Barbie’s insane mega-pink sportsmobile.

So, when and where did the Barbie doll originate, you may wonder? Well, Barbie burst onto the scene in New York in 1959, and at the time she was virtually unique. She was created by Ruth Handler, co-founder of Mattel, who had noticed her daughter playing with paper dolls, imagining them as grown women with jobs, romances and social lives. At the time, the dolls marketed to girls were baby dolls, designed to encourage domestic play that mimicked nurturing and motherhood. Handler realised there was space for something radically different: a doll that allowed girls to imagine themselves not as mothers but as independent adults, with working lives and hobbies. As an aspirational starting point for a girl’s toy, it was actually quite progressive.

What the world ended up with was arguably anything but that. Mattel designed Barbie to look like a teenaged fashion model, and there is no escaping the fact that she was overtly sexualised and designed around an unobtainable body ideal. Despite (or perhaps because of?) this, Barbie sold spectacularly well, becoming a cultural phenomenon almost overnight, but she also drew criticism from parents and feminist commentators, who pointed out that her figure was unrealistic and inappropriate. Her tiny waist, elongated legs and prominent bust sparked debates that would dog her image for decades. To be honest, when I was 10 I’m not sure that I saw her as a representative human, since nobody I knew looked like that. I think I saw her as an imaginary creature that was a bit like humans but not actually human: an entity designed purely for fantasy. My mother’s only comment on Barbie’s physique was on her rigid arms, fixed permanently with elbows at a 90-degree angle: “probably years of carrying a tray,” she said.

As Barbie expanded through the 1960s and 1970s, Mattel worked hard to position her as a girlboss. Barbie acquired careers, first as a fashion model (sigh), then as a nurse, then a flight attendant, then eventually as an astronaut. These additions expanded her image for sure. Arguably, Barbie could be seen as wholly progressive, presenting girls with visions of independence and professional ambition, summarised in the slogan still linked to the doll: “you can be anything”. On the other hand, Barbie remained bound during this period to narrow beauty standards, with the same unobtainable body type, youthful face and a consumerist lifestyle to boot. Feminist responses to Barbie during the second wave in the 1970s were largely critical. Many women argued that Barbie taught girls to value appearance above all else and promoted a passive, male-oriented ideal of femininity: the introduction of Ken as Barbie’s boyfriend further fuelled this narrative. But a counter-narrative argued that Barbie represented autonomy and independence: she remained unmarried, child-free, financially solvent and capable of holding almost any job. What a woman! Except she still couldn’t ride a horse.

Inevitably, Barbie’s commercial success prompted other manufacturers to get in on the act. In the UK, the Sindy doll was introduced in 1963 and quickly became known as “the girl next door” in contrast to Barbie’s glamorous American swagger. Personally, as a sensible shoe-wearer from childhood to the present day, I always felt Sindy was the girl for me. She had a softer face, a smaller bust and, broadly speaking, more realistic proportions. She was deliberately marketed as more relatable and was certainly less overtly sexualised. Sindy’s lifestyle emphasised hobbies and everyday fashion rather than aspiration and luxury. Many parents understandably viewed Sindy as a more wholesome option, and some feminist commentators later pointed to her as an example of how dolls could and should reflect a broader, less idealised version of womanhood. Much more importantly for 10-year-old me, she had articulated limbs and could ride a horse properly.

There were other dolls, of course. The Pippa doll, launched in the UK in 1966, occupied a different cultural space again. I had one Pippa doll and from memory I wasn’t keen. Smaller and thus cheaper than Barbie, Pippa was marketed primarily as another teenaged fashion doll, closely tied to the aesthetics of London in the swinging ’60s. No wonder I wasn’t interested: she was far too trendy for me. Pippa reflected contemporary youth culture rather than adulthood or career ambition, but like Barbie and Sindy, she drew attention to how dolls function as cultural reflections, encoding our ideas about age, class and identity.

Representation became an increasingly central issue as Barbie’s reach grew globally. The first Black Barbie appeared all the way back in 1980, followed by dolls representing various ethnicities and cultures. While these moves were broadly welcomed, they were rightly criticised as superficial: the early, supposedly “diverse” Barbies shared the same facial features and body moulds as the original, differing solely in skin tone and costume, which rendered them a frankly grotesque parody of the women they purported to represent. Ken, too, was “diversified” over time, although he rarely attracted the same level of scrutiny, a fact which reflects the inescapable truth that society’s response to representations of the female body is always more highly charged.

Disability representation, body diversity and realistic ageing were largely absent for much of Barbie’s history. By the 1990s and 2000s, long after my own toys had been banished to the loft, Barbie’s cultural dominance had begun to wane and criticism of her image grew louder. Discussions linked the dolls to unrealistic beauty ideals, and society became more and more concerned with the unnatural and hugely limiting image she presented. In response to falling sales, Mattel undertook a series of reinventions. In 2016, the company introduced a new line of Barbies with explicitly named body types — tall, petite and curvy (I kid you not) — alongside the original stretched form that represented nobody who has actually walked this planet. These new dolls had different proportions, altered clothing fits and a range of silhouettes that disrupted the long-standing elongated form of Barbie. Mattel also expanded its facial representation, introducing varied nose shapes, jawlines and eye placements; it also significantly broadened hair textures to include natural curls, afros and braids. Later additions included dolls with prosthetic limbs, wheelchairs, hearing aids, vitiligo and the visible medical devices we find today. These changes were accompanied by marketing that explicitly framed Barbie as a reflection of “real women” and “diverse lived experiences”. Critics remain sceptical, and many question whether such brand rehabilitation can ever meaningfully counter decades of cultural messaging to the contrary.

Throughout her history, Barbie has functioned both as a mirror and as a mould for cultural ideas about gender and adulthood. Feminist responses to Barbie and her contemporaries continue to be mixed, reflecting broader tensions within modern intersectional feminism about choice, agency, beauty and capitalism. Whether she is seen as a symbol of oppression or of progress, Barbie reveals how deeply children’s toys can be entangled with social values. More than six decades after her launch, the debate surrounding Barbie and her rivals endures because it is ultimately a debate about how society sees women and the futures that young girls are encouraged to imagine for themselves.

Photo by Sean Bernstein on Unsplash