Another brick in the wall

This week, I upset a few people. That’s nothing new, for it is undeniable that I am the sort of person who sometimes opens her mouth merely to change feet. Often, this has landed me in trouble, especially when working for managers who like their staff nice and compliant; sometimes, it has earned me some respect, when I was fortunate enough to work for robust managers, those confident enough to respond well to challenge, even when that challenge could — in all honesty — have been better or more politely worded. When I think about some of the things I’ve said to and about management over the years, I consider myself jolly lucky to have been in a unionised workplace. Yet, in the school where I spent the second half of my teaching career, I am also grateful to have worked with managers who would listen, take note and respond thoughtfully when I said my piece, however clumsily: it demonstrates a confidence and an emotional resilience that are not to be underestimated.

These days, of course, I work for myself, so I have to go to social media to find people to upset. I can’t recall whether or not I have mentioned this on my blog before, but I have recently removed myself entirely from the platform formerly known as Twitter. It’s been something of a wrench, having been on there more or less since its inception, but needs must and it is true to say that the platform is not what it used to be. As a consequence, I have begun to spend a little bit more time on LinkedIn, which also seems to have changed, in my opinion for the better: it no longer seems to be solely dominated by corporate types humble-bragging about their mid-range sports car.

I’ve never been one for leaving platforms solely because of who owns them. Let’s face it, compared to my little world, every tech giant billionaire is probably, in relative terms, a pretty awful person. But when the owner of a platform has already proved their amorality in how they treat their staff and their customers, then goes on to double down in defending people’s “right” to manipulate, share and disseminate exploitative images of women and children, claiming that it is a “free speech” issue (something I care about passionately and do not appreciate being used as a smokescreen for abuse and exploitation), then that’s way over the line for me. So, farewell Elon, you moral cipher of a man: you won’t be getting my eyes on the advertisements that fund you any more. And hello, LinkedIn: let’s see what you have to offer. I have been pleased to find that there is an increasing amount of educational discussion on LinkedIn, and many of the brilliant go-to teacher-voices that I originally found on Twitter in its heyday are now actively posting on there. There is also plenty of talk about other relevant issues that interest me, some of them much more challenging than anything one would have found on there a few years ago, when LinkedIn was dedicated solely to corporate bragging and self-promotion.

The reality of being more active on such a platform seems inevitably (for me at least) to result in some low-level beef. Given that it is ultimately a business platform and thus a place where people showcase themselves and what they are bringing to market, LinkedIn naturally includes multiple voices crafting their image as someone who offers something to the education space other than traditional classroom teaching (for which, given the well-documented recruitment and retention crisis, one generally does not have to advertise oneself). These days, such people include me, and indeed I think and write a lot about what one-to-one tutoring enables me to do that was not possible in the mainstream classroom. The way I work now is truly liberating and I am grateful for it. What puzzles and concerns me, however, is the fact that so many people who are outside of the traditional classroom space seem remarkably keen on bashing the traditional system, and it was my objection to this that got me into trouble. I was assured that it is the system they are bashing, not the classroom teachers within it, and some people seemed to find it very insulting that I should think otherwise. But what they don’t seem to understand is that it can be pretty difficult to tell the difference. In bashing the system, they are actively contributing to the increasingly dismal situation in which classroom teachers find themselves. It is truly wretched to be a part of a system that is being relentlessly criticised on all sides, and this fact is undoubtedly contributing to the mass exodus of teachers from the profession. Harry Hudson has written very eloquently about this in his book, Must Do Better: how to improve the image of teaching.

For the avoidance of doubt, and in case anyone needs to hear this, it’s really tough out there in the modern classroom. I think more of us need to be saying this out loud. I am probably guilty of not being frank enough about it, so here is me saying that after 21 years at the chalkface, I’d had enough of being treated with contempt. In my final year, when I confessed to my husband that I wanted to resign from my job, I tried to explain to him what working in a modern school can feel like: I said, “you know that feeling when you’re walking down a towpath and you see a bunch of scary-looking lads hanging about that you have to walk through and your brain goes into high alert, wondering whether they’re going to shout something or surround you or just generally make you feel uncomfortable?” He nodded. Everyone knows that feeling. “Well,” I said, “it’s like that but all the time. Plus, those lads are your responsibility, and how you handle the situation on the towpath is at worst going to be called into question by your boss, at best will massively add to your already-horrendous workload if you decide to follow it up.”

There are very few jobs in which one can feel personally belittled and intimidated on a daily basis: teaching is one of them. Add to that the fact that in teaching, you are frequently asked what you could have done better or more empathetically to avoid creating the situation in which you felt belittled and intimidated: I am genuinely not sure that this happens in many other spheres. Most places I go to, I see a sign up telling me that rude or threatening behaviour will not be tolerated. There’s one in our local vet’s, one at the GP’s surgery and I saw one in A&E when I had a surprisingly zestful response to some antibiotics a few weeks ago. Fantastic. I’m all for the signs and for the message that they convey. But schools don’t have those signs. Teachers just have to suck it up, apparently. Rude and contemptuous behaviour towards teaching staff has increasingly become par for the course in modern schools, and our teachers and TAs are expected to let it bounce off them like water off the proverbial duck’s back. We’re the adults in the room, we’re told: that may be so, but a notable number of the students didn’t get the memo.

One of the reasons I decided to move on from classroom teaching was not simply the unpleasant situations in which I increasingly found myself: it was the fact that I could feel my attitude towards young people starting to shift, and I didn’t want that to happen. I am glad to say that I hugely enjoy the time that I spend with the young people I now work with, but before I left the classroom I feared that my whole perspective on teenagers would be damaged forever, were I to spend much more time within a system that nobody is willing to support any more and everybody seems to think is part of the problem. See, this is the issue: many people — an alarming number of whom are calling themselves “educators” — seem somehow to have talked themselves into believing that the traditional education system is a net negative, that schools fail to prepare young people for “the modern world” (whatever that is: people have been talking about it since at least 1975), that the imparting of skills and knowledge in the conventional manner is deeply inadequate and should be condemned to history. We don’t need no education, we don’t need no thought control.

When belittlement is your daily reality, it can be pretty galling to scroll through social media and find yourself on a loop telling you how our Gradgrindian school system is failing young people, how every child exhibiting low-level defiance is simply dysregulated and misunderstood, how every uniform rule is an imposition on their individuality and an insult to their personal liberty, how every teacher who attempts to lay down some basic ground-rules is just another brick in the wall imprisoning them and preventing them from blossoming.

If we are to provide an education that is free to all at the point of contact (and I cling to the belief that this principle is non-negotiable), then traditional classroom teaching is here to stay. The alternative providers don’t want to hear it, but that’s the bottom line. And until we start believing that most of the youngsters in our care are able to rise to it, that the overwhelming majority of those young people are in fact infinitely capable of being both polite and attentive, if only such basics were expected of them, I fear we are set upon a path that will not end happily for any of those young people. To be clear, letting a student off is letting them down. When empathy with a student who is struggling to behave leads us down the path of least resistance, that is not kindness: far from it. It is sending them the message that we don’t care, that we don’t believe that they are capable of meeting the most basic of standards that we set for ourselves and for the rest of humanity. When we excuse challenging behaviour because of an individual’s difficult circumstances, we have to ask ourselves what we’re really communicating to that student about their potential. Just think about it: because once you see it that way, you can’t unsee it. I don’t know who coined the phrase, but it couldn’t be summed up more perfectly than this: the soft bigotry of low expectations. By lowering our most basic standards, we make it clear to a certain kind of student that we’re writing them off as incapable of basic manners. Nothing — truly nothing — could be more inequitable or more damning for that child and their future.

This wonderful photo was taken by Maria Teneva on Unsplash

Cambridge hangovers

The Cambridge Latin Course: love it or hate it, you can’t ignore it. Long-term readers of my blog and listeners of my podcast will be aware that I have been quite critical of the CLC in the past, despite the fact that it did form the backdrop to my classroom teaching for most of my career. While I continued to use the stories (albeit adjusted) and the characters from the course, I moved further and further away from its approach to grammar during my time at the chalkface and rejected its underlying principles (show, don’t tell) pretty early on. Towards the end I had completely re-written the curriculum and had stopped using the textbooks altogether.

Now, as a full-time tutor, I am increasingly aware of the legacy that the CLC has left Latin teaching and I am genuinely curious to know how long this legacy will last. Whilst many schools have ostensibly stopped using the CLC, its influence on teachers’ approach remains apparent in ways that many of them are perhaps not even aware of. In this blog post I hope to reveal some of the habitual oversights that classroom teachers of Latin are making as a result of what I believe is a hangover from the CLC curriculum.

One key blind spot for classroom teachers aiming to prepare their students for the OCR examination is a failure to teach the verb malo at the same time as they teach volo and nolo. I cannot explain this other than as a legacy of the fact that the CLC does not teach malo at the point when volo and nolo are taught. Taylor & Cullen introduce malo at the same time (in chapter 7 of their textbook), but the overwhelming majority of students that I teach are reasonably well-drilled on volo and nolo but have never been taught the verb malo. Students following the WJEC/Eduqas syllabus do not need to know malo, but those aiming at the OCR examination need to know it, so to miss this tricky verb out of one’s teaching is a major oversight. I believe that this is purely and simply because schools are following curricula that were originally built around the CLC, which makes a big deal out of volo and nolo in Book 2, but never mentions malo.

Another legacy from the CLC which I have written about before is the decision to teach the purpose clause before the indirect command. It was many years ago now that it suddenly hit me what a massive mistake this was. I asked myself why students were so wedded to the habit of translating ut as “in order to” whenever they saw it, and realised that it is because this is how they first meet it; after that, they can’t let it go. I have yet to meet a single student who has been taught the indirect command prior to the purpose clause unless they have been taught by me, and this is genuinely fascinating. Every single Latin teacher seems to assume that it is a good idea to teach the purpose clause first, and I believe that the all-pervasive influence of the Cambridge Latin Course is partly to blame. Even Taylor & Cullen do this in Latin to GCSE: despite mixing up the approach taken by the CLC (they teach ut clauses first, leaving cum clauses and the indirect question until later), they still take the decision to teach purpose clauses first. In my experience, this is a massive error, and leaves students convinced that ut always means “in order to” when in fact it only means this when it’s used in a purpose clause.

My final grammar-based concern when it comes to school curricula being based around the legacy of the CLC is that teachers are still teaching the perfect active participle as if it were a broad grammar feature. This is done in the CLC, which for some extraordinary reason introduces PAPs towards the beginning of Book 3, long before deponent verbs are even mentioned in Book 4. Students really struggle as a result, since they form the understandable belief that the perfect active participle is a grammar feature that is common to all verbs. They thus struggle with the concept that most verbs have a perfect passive participle, because they have not been taught that perfect active participles only exist because of deponent verbs. I have to spend a great deal of time unpicking students’ misapprehensions and misconceptions about this, teaching them in detail about deponent verbs and their features and then mapping this onto their participles. It takes so much time to dispel these misunderstandings, which would never be there in the first place were schools to adjust the curriculum to introduce the perfect active participle solely as a feature of deponent verbs.

It is genuinely fascinating to observe the fallout from textbook use and to be able to identify where students’ misconceptions are coming from as a direct result of the curriculum that many schools are adhering to. I do find it worrying that so few schools are asking themselves why they are using textbooks that are not built around the examination that their students are aiming at, not least because the vocabulary in those textbooks is quite often a monumental waste of time. While the 5th edition of the CLC goes some way towards addressing this, it doesn’t solve the problem entirely: too much of its old structure and too many of its old principles remain.

Photo by Ivan Aleksic on Unsplash

Turn a blind eye

“Turn a blind eye” is one of those expressions that slips easily into everyday speech, a shorthand way of describing the act of deliberately ignoring something. We might say a teacher turned a blind eye to students whispering in class (never a good idea, by the way), or that a government turned a blind eye to corruption (even worse). Many people use the phrase without a second thought about its origins, but like many idioms, it comes with a story. In recent years, some people have questioned the phrase, arguing that it may be offensive or insensitive. Well, speaking as someone who actually is blind in one eye, I am here to defend it: so, brace yourselves.

The most commonly cited origin story for “turn a blind eye” dates back to the Napoleonic Wars and everyone’s favourite British naval hero, Admiral Horatio Nelson. Nelson had lost the sight in one eye earlier in his naval career, when a shot struck a sandbag and sent debris flying into his face, causing severe damage to his retina. He is often portrayed as wearing an eye patch, but there appears to be no evidence that he did so: historical accounts seem to indicate that his eye remained intact; he simply couldn’t see out of it any more.

During the Battle of Copenhagen in 1801, Vice-Admiral Sir Hyde Parker, the commander-in-chief of the British fleet, ordered the signal for Nelson to cease fighting and withdraw. Signals were transmitted from ship to ship via the medium of flags, so the order was necessarily a visual one. Nelson was alerted to the signal to disengage, but was eager to press ahead with the attack. According to the story, he raised his telescope to his blind eye and claimed to see no signal. Having feigned ignorance of the order, he continued the battle and secured a crucial tactical victory. The rest, as they say, is history, and presumably explains why Nelson still has his statue on the top of a column in central London and Hyde Parker doesn’t.

The anecdote of Nelson’s act of defiance was popularised in later retellings and became associated with the idea of deliberately ignoring unwelcome information or instructions. Nelson’s choice to quite literally turn his blind eye to an order he did not want to follow captured perfectly the notion of wilful ignorance or selective attention. Over time, the phrase entered the broader English language as an idiom, detached from its naval origins. Speakers used it to describe actions or policies where someone in authority chose not to recognise or address a problem.

Historians, always here to spoil the fun, are not 100% certain that the phrase originated with the story of Nelson: some debate the precise accuracy of the apocryphal tale, there is evidence that similar expressions already existed before the Battle of Copenhagen, and the phrase may have been popularised through literary or journalistic embellishments of naval history rather than by Nelson’s own words and actions. Whatever the truth, the phrase stuck, and for generations it has been taught in history classes and quoted in newspapers, novels and speeches around the English-speaking world. Hurrah for insurrection.

As with many idioms rooted in physical descriptions of the body, “turn a blind eye” uses a physical metaphor to express the complexities of the human psyche; indeed, sight and blindness have long served as powerful symbols of human understanding and perception. To “see” something often stands for awareness or understanding, while to be “blind” to something suggests ignorance, either accidental or wilful. The metaphor is played out to the full in the story of Oedipus, who is metaphorically blind to the truth of his own story, and blinds himself in reality when he discovers it. Teiresias the prophet is physically blind but is the only one who can see the truth as the story unfolds. Shakespeare exploited the same idea to equal horror in King Lear, in which blindness resonates throughout the play, at times to quite toe-curling effect.

Now, to the modern world. Despite the phrase’s deep history, widespread use and highly effective meaning, it has not been free from criticism in recent years. Some people today argue that “turn a blind eye” may be offensive or insensitive because it invokes blindness — a physical disability — in a potentially negative way. The concern, so far as I can gather, is that by equating blindness with wilful ignorance, the phrase serves to reinforce negative stereotypes about people who are visually impaired. This criticism is, of course, part of a broader trend in which people are told to pay closer attention to the ways language can unintentionally marginalise or demean particular groups of people.

As someone who actually is blind in one eye, I am going out to bat for the phrase (although, being blind in one eye, it is true that my batting can be somewhat haphazard). My blindness on one side (the right, as it happens) has cost me a lot, and I’m not about to let it cost me my language as well. It was a significant factor in my deciding not to drive and has affected my life in numerous ways. I now struggle significantly with eye strain and have to be careful with artificial light and screen time in order to avoid migraines, as my one good eye (not actually that good, as it happens!) is doing all the work. I am terrible at judging depth and distance, so professional tennis playing was out as a potential career; you also don’t want me to pour you a glass of red wine at an angle, trust me on that one.

I chose to tell my classes in school about it, as it was important to make clear to students that if they were waving their hand in the air on my right side I simply wouldn’t see them: I would much rather own up to a physical disability than have children believe that I was ignoring them. Despite this, I know that my reputation as somewhat standoffish also stems from my disability: colleagues, acquaintances and even close friends have often believed that I am deliberately ignoring them because they do not appreciate the limits of my vision. It is the problem with having what the right-on brigade call an “invisible disability” — it is not obvious that I am blind on one side, nor is it apparent that my sight in general is pretty terrible, so as a result nobody makes any allowances for me when it comes to that. The received narrative is that Emma is rude and standoffish. Oh well. Sometimes it’s a useful reputation to have, to be honest.

Anyway, back to the phrase. The controversy around it reflects how social attitudes and awareness change over time. Idioms such as “turn a blind eye” become ingrained in everyday speech, then one day somebody decides to unpick the meaning of the phrase and take offence. But the metaphorical connection between blindness and ignorance has been used for millennia, and is not a comment on those of us who are visually impaired. (Remember Teiresias? He was a blind man credited with insight beyond that of all others, perhaps reflecting the fact that even in the ancient world, people understood that those who are completely blind develop excellent perception beyond physical sight.)

I have been lectured by keyboard warriors on the internet for using the phrase “turn a blind eye” and I shall confess that I have taken great pleasure in telling them that I am — as it happens — blind in one eye. To date, every single one of them has climbed down off their high horse and started self-flagellating, telling me that they are “still learning” and begging for my forgiveness. Dear Lord, how did we get here? I am honestly not sure when the tipping point was, when people began to feel that they have to police every word they say. If I had to guess, I’d say that the turning point was about 1999.

I suspect that those who claim to find the phrase problematic have absolutely zero experience of what it is like to be blind in any sense. Were they in touch with the experience, they would understand why the metaphor works so well. Believe me, if you’re trying to get my attention beyond a certain angle to the right, you can forget it: it’s not going to happen. Even more crucially, were these people properly aware of the purported origins of the phrase, then surely they would also have to acknowledge that the phrase is clearly associated with wilful ignorance and avoidance, not merely physical disability. According to the story, apocryphal or otherwise, Nelson didn’t accidentally hold up the telescope to his blind eye in a state of haplessness or vulnerability: he deliberately used the telescope in this way, in order to disobey an order. That is the point! It is a story about disobedience and coolness under pressure, not about impairment. Somewhat less gloriously, I sometimes lie on my left to take advantage of my blindness and blot out the world: disabilities have their advantages, you know!

As society ties itself up in knots over what it believes is diversity and inclusion, people have begun to question whether expressions such as “turn a blind eye” carry unexamined assumptions that might be exclusionary or hurtful. I am here to tell you, people: for heaven’s sake, stop panicking and get on with your life. I don’t feel in the least bit excluded by the phrase, it is by a country mile the best, most expressive and most useful manner in which to describe what you’re trying to say. (Are we allowed to say country mile any more? Does that imply that people in the country don’t understand measures and distances? I’ll have to check).

This debate around “turn a blind eye” is just one part of a broader conversation about how language intersects with identity, power and social values. Similar discussions have arisen around other idioms and expressions that draw on physical traits or historical stereotypes. For example, phrases like “lame” to describe something unimpressive or “crazy” to describe something irrational have been questioned for their potential to offend or marginalise groups of people. In each case, speakers and writers are encouraged to consider whether there are better, more inclusive ways to express themselves. Personally, I am beginning to find it all more than a little bit exhausting. Sanitising language to the point where communication becomes awkward or laden with fear of making mistakes is crippling us all (there I go again — sorry). Learning about the historical origins of a phrase can enrich our appreciation of language rather than diminish it, and personally I’d rather enjoy the full richness of English expression than have my language policed by the terminally well-meaning.

Photo by Martti Salmi on Unsplash

Life in plastic, it’s fantastic

Last week, toy giant Mattel launched its first “autistic Barbie”. Coming hot on the heels of its first-ever doll with type 1 diabetes, sporting her own insulin pump and glucose monitor, this latest addition to Barbie’s range marks another milestone in Mattel’s purported goal to ensure that more children “see themselves in Barbie.”

While many have celebrated a neurodivergent Barbie as an important step toward inclusion and visibility for children with autism, others have raised concerns about representation and stereotypes. Supporters argue that the doll’s design — including features like noise-cancelling headphones, a tablet with communication apps and sensory-friendly clothing — will ensure that autistic children see themselves reflected in a mainstream toy. They argue that such representations could normalise the support tools that many children use in daily life, which can be empowering and affirming for them. As one poster on LinkedIn put it last week, “We have all had an opinion on the new autism Barbie. Today, I chose to leave that to the person who actually matters. I bought the Barbie for my daughter. Her reaction was immediate and joyful. “Awesome.” She picked it up and said, “Look Mum, it has the talking board you got at the parks and what my brother used.” Then, “The ear defenders are like mine. We can wear them together.” … Representation does not need to be flawless to be powerful. It just needs to be seen, felt and recognised by the people it is for.”

I have also read that autistic Barbie has been blessed with articulated joints “to allow for stimming gestures”. Now, if we’re going to talk about representing humans, autistic or otherwise, I would have thought that all versions of Barbie would benefit from articulated joints. As I recall, Barbie’s extraordinary lack of flexibility was my main issue with her back in the 1980s, when I was playing with dolls. Barbie’s fixed limbs meant that she effectively couldn’t ride her horse, only balance above it like a plastic A-frame, giving the impression that she was wing-walking rather than riding her steed. In my 10-year-old world, in which I lived and breathed all things horse-related, this was a massive let-down.

Critics of the all-new neurodivergent Barbie have pointed out that autism is an invisible, highly diverse spectrum that cannot be captured by one set of external traits or accessories. While this is arguably an issue for all representation, some people worry that relying on visible markers to represent women with ASD will reinforce simplistic or stereotypical ideas about what autism “looks like.” The debate about the new Barbie doll is, of course, part of a wider conversation about corporate “diversity” initiatives and the commercialisation of identity, with some seeing the doll as meaningful representation and others questioning whether it reduces a complex human experience to design features.

This is not a new debate, merely the current iteration of a discussion that has been evolving since Barbie’s inception over 60 years ago. When I was a child, more than forty years ago, feminists were raging about Barbie. My mother, a reasonably committed feminist herself, was nevertheless comfortable with me having a Barbie. Indeed, I had the Barbie horse (which actually did have articulated limbs, unlike its owner, but was a ridiculously stylised fantasy creature) and I also had the Barbie car, which was frankly hideous. Personally, I found the Sindy products more appealing: the horses were more realistic (of paramount importance) and her car was a sensible beach buggy, which seemed infinitely more usable when compared to Barbie’s insane mega-pink sportsmobile.

So, when and where did the Barbie doll originate, you may wonder? Well, Barbie burst onto the scene in New York in 1959, and at the time she was pretty unique. She was created by Ruth Handler, co-founder of Mattel, who had noticed her daughter playing with paper dolls, imagining them as grown women with jobs, romances and social lives. At the time, the dolls that were marketed to girls were baby dolls, designed to encourage domestic play that mimicked nurturing and motherhood. Handler realised there was space for something radically different: a doll that allowed girls to imagine themselves not as mothers but as independent adults, with working lives and hobbies. In terms of an aspirational start-point for a girl’s toy, it was actually quite progressive.

What the world ended up with was arguably anything but that. Mattel designed the look of Barbie supposedly as a teenaged fashion model, and there is no escaping the fact that she was overtly sexualised and designed around an unobtainable body ideal. Despite (or perhaps because of?) this, Barbie sold spectacularly well, becoming a cultural phenomenon almost overnight, but she also drew criticism from parents and feminist commentators, who pointed out that her figure was unrealistic and inappropriate. Her tiny waist, elongated legs and prominent bust sparked debates that would dog her image for decades. To be honest, when I was 10 I’m not sure that I saw her as a representative human, since nobody I knew looked like that. I think I saw her as an imaginary creature that was a bit like humans but not actually human: an entity designed purely for fantasy. My mother’s only comment on Barbie’s physique was on her rigid arms, fixed permanently with elbows at a 90-degree angle: “probably years of carrying a tray,” she said.

As Barbie expanded through the 1960s and 1970s, Mattel worked hard to position her as a girlboss. Barbie acquired careers, first as a fashion model (sigh), then as a nurse, then a flight attendant, then eventually as an astronaut. These additions expanded her image for sure. Arguably, Barbie could be seen as wholly progressive, presenting girls with visions of independence and professional ambition, summarised in the slogan still linked to the doll: “you can be anything”. On the other hand, Barbie remained bound during this period to narrow beauty standards, with the same unobtainable body type, youthful face and a consumerist lifestyle to boot. Feminist responses to Barbie during the second wave in the 1970s were largely critical. Many women argued that Barbie taught girls to value appearance above all else and promoted a passive, male-oriented ideal of femininity: the introduction of Ken as Barbie’s boyfriend further fuelled this narrative. But a counter-narrative argued that Barbie represented autonomy and independence: she remained unmarried, child-free, financially solvent and capable of holding almost any job. What a woman! Except she still couldn’t ride a horse.

Inevitably, Barbie’s commercial success prompted other manufacturers to get in on the act. In the UK, the Sindy doll was introduced in 1963 and quickly became known as “the girl next door” in contrast to Barbie’s glamorous American swagger. Personally, as a sensible shoe-wearer from childhood to the present day, I found Sindy the girl for me. She had a softer face, a smaller bust and, broadly speaking, more realistic proportions. She was deliberately marketed as more relatable and was certainly less overtly sexualised. Sindy’s lifestyle emphasised hobbies and everyday fashion rather than aspiration and luxury. Many parents understandably viewed Sindy as a more wholesome option, and some feminist commentators later pointed to her as an example of how dolls could and should reflect a broader, less idealised version of womanhood. Much more importantly for 10-year-old me, she had articulated limbs and could ride a horse properly.

There were other dolls of course. The Pippa doll, launched in the UK in 1966, occupied a different cultural space again. I had one Pippa doll and from memory I wasn’t keen. Smaller and thus cheaper than Barbie, Pippa was marketed primarily as another teenaged fashion doll, closely tied to the aesthetics of London in the swinging ’60s. No wonder I wasn’t interested: she was far too trendy for me. Pippa reflected contemporary youth culture rather than adulthood or career ambition, but like Barbie and Sindy, she drew attention to how dolls function as cultural reflection, the encoding of our ideas about age, class and identity.

Representation became an increasingly central issue as Barbie’s reach grew globally. The first black-skinned Barbie appeared all the way back in 1980, followed by dolls representing various ethnicities and cultures. While these moves were broadly welcomed, they were rightly criticised for being superficial, as the early supposedly “diverse” Barbies shared the same facial features and body moulds as the original, differing solely in skin tone and costume and thus rendering them a frankly grotesque parody of the women they were purported to represent. Ken, too, was “diversified” over time, although he rarely attracted the same level of scrutiny, this very fact reflecting the inescapable truth that society’s response to representations of the female body is always more highly-charged.

Disability representation, body diversity and realistic ageing were largely absent for much of Barbie’s history. By the 1990s and 2000s, long after my own toys had been banished to the loft, Barbie’s cultural dominance had begun to wane and criticism of her image grew louder. Discussions linked the doll to unrealistic beauty ideals and society became more and more concerned with the unnatural and hugely limiting image she presented. In response to falling sales, Mattel undertook a series of reinventions. In 2016, the company introduced a new line of Barbies with explicitly named body types — tall, petite and curvy (I kid you not) — alongside the original stretched form that represented nobody who has actually walked this planet. These new dolls had different proportions, altered clothing fits and a range of silhouettes that disrupted the long-standing elongated form of Barbie. Mattel also expanded its facial representation, introducing varied nose shapes, jawlines and eye placements; they also significantly broadened hair textures to include natural curls, afros and braids. Later additions included dolls with prosthetic limbs, wheelchairs, hearing aids, vitiligo and the visible medical devices we find today. These changes were accompanied by marketing that explicitly framed Barbie as a reflection of “real women” and “diverse lived experiences”. Critics remain sceptical, and many people question whether such brand rehabilitation can ever meaningfully counter decades of cultural messaging to the contrary.

Throughout her history, Barbie has functioned both as a mirror and as a mould for cultural ideas about gender and adulthood. Feminist responses to Barbie and her contemporaries continue to be mixed, reflecting broader tensions within modern intersectional feminism about choice, agency, beauty and capitalism. Whether she is seen as a symbol of oppression or progressivism, Barbie reveals how deeply children’s toys can be entangled with social values. More than six decades after her launch, the debate surrounding Barbie and her rivals endures because it is ultimately a debate about how society sees women and the futures that young girls are encouraged to imagine for themselves.

Photo by Sean Bernstein on Unsplash

A general lack of guidance

I struggle to understand why so little guidance is given in many schools about how students should go about the process of learning. To be clear, I’m not talking about school assemblies on “study skills”, which I realise most teenagers will zone out during. No, guidance needs to come directly from each individual classroom teacher, the subject expert; it also needs to be explicitly taught, modelled and demonstrated on a regular basis. Schools need to agree what methods they are going to recommend and this needs to be reflected right across the school in all subjects, tailored specifically to what works best in each academic discipline.

Startlingly often, students are still being told: here is your Latin set text, now off you go and learn the first section. I was guilty of this in my first few years of teaching — rote-learning comes relatively easily to me and I didn’t really grasp that most students need to be shown how to go about engaging with the process of committing something to memory. Furthermore, I was working in a very high-achieving grammar school, where we were not really encouraged to support students proactively with their learning; it was assumed that all the students in the school could cope well in academia without such support. This was a foolish assumption, but it was the one we were subliminally encouraged to make.

When it comes to the literature element of the Latin GCSE, whether or not a student knows the translation of the set text off by heart, and whether they can relate that knowledge to the Latin version in front of them, is without doubt the single most important differentiator between a student’s success and failure in the exam. Despite this inescapable fact, few Latin teachers appear willing to dedicate classroom time to the learning process, so wedded are they to the conviction that students can manage the learning in their own time. Many of my tutees have been told time and again that they don’t know the text well enough, that they need to learn it, that they need to spend more time doing so. Yet when I ask them, “what methods have you practised in class?” they stare at me, blankly. I have come to realise that most students are not being taught how to learn things off by heart, beyond the most rudimentary of suggestions.

Now, I am not naive. Having taught in secondary schools for 21 years, 13 of those years in a comprehensive setting, I am more than well aware of students’ uncanny ability to claim that they have “never been taught” something that they have in fact been told on multiple occasions. However, the extreme cluelessness of so many of my clients when it comes to what to do and their apparent awe when they are taught some very basic methods such as colour-coding and the first-letter technique do leave me increasingly convinced that many classroom teachers are simply not dedicating enough (or in some extreme cases any) classroom time to learning methodologies. I’ll bet most of them are doing what I used to do in my first few years of teaching — giving students a few bullet points of advice on how to go about learning the texts, then assuming that those students will remember this going forward. But why do we believe such nonsense? We would not (I hope) present them with the endings of the 1st declension in one lesson then assume that they will remember those endings for the rest of time — so why on earth should that be the case when it comes to study skills?

One possible reason is teachers’ anxiety about time. One of the greatest strains that GCSE Latin teachers are under is time pressure. Very few schools offer enough space on the timetable for our subject and I am fully aware that making it through both set texts within the time available is a mammoth task. I rarely finished the second set text prior to the end of March; on the few occasions that I managed to do so, it was real cause for celebration. Yet despite this, as my career progressed, I allocated an ever-increasing amount of classroom time to teaching students how to go about the learning process and also to giving them short bursts of learning time to actually get on with it in silence. Whenever I found myself in possession of a spare 10 minutes at the end of a new section or a new concept, I would allow them to bow their heads and use the first-letter technique to get a few sentences of the text under their belts. I wonder whether classroom teachers are afraid of allowing students this time, as if it somehow undermines the importance of our teaching role. I used to remind students that I was painfully aware how much pressure I was putting them under, asking them to rote-learn a new chunk of text almost every single week. So part of the deal I made with them was that — whenever I could — I would let them have a few minutes of classroom time to kick-start the process.

The benefits of allocating this time are twofold. Firstly, it literally does get the children started on the process and is an opportunity to remind them once again of the methods that have been recommended: I used to put them up on a summary slide, even when they could all recite the methods without hesitation. Secondly, while students are studying, a teacher can circulate the room and check whether they are actually using the recommended methods — there will always be a few determined recalcitrants, who claim that the recommended methods “don’t work for them”. This is when a teacher needs to be strong. The evidence for what works and what doesn’t work in terms of how we learn is overwhelming, and unless that child can perform perfectly in every test you give them, they need to get on board with the methods.

As for what the methods should be, I recommend a variety but one is definitely stand-out brilliant and so far has worked for every student I have ever met. So if you haven’t read my old post on how to use the first-letter technique then do so straight away — you will never look back! For broader guidance on effective study I would recommend looking at the work of Dr. Paul Penn, Professor of Psychology and author of The Psychology of Effective Studying. His book is fantastic, as is his YouTube channel.

Photo by Nick Morrison on Unsplash

Off my trolley

It seemed simple enough. It even seemed like a good idea. Something I had done before and not struggled with, an easy way to earn my Good Citizen badge for the day.

On my regular route to a local megastore, I pass a garage with a carwash. For some inexplicable reason, the footpath next to the carwash has become a dumping ground for supermarket trolleys. Without fail, every time I make my way through this underpass, there is an abandoned trolley, standing askew. I have puzzled as to why this particular place is where someone consistently no longer has need of the trolley that they apparently did need to take out of the supermarket car park, but the logic escapes me. Still, given that I am able-bodied and on my way to the very megastore to which the abandoned trolleys belong, I always take hold of the forsaken four-wheeler and push it back to its home.

On this particular occasion, in the hiatus between Christmas and New Year, I happened upon no fewer than four of them, nosing each other like abandoned dogs in the underpass. Can I manage four? I mused to myself. Of course I can, beamed my gym-going, over-confident self. Four trolleys will be a breeze. Of course, I hadn’t factored in the treacherous nature of supermarket trolley-wheels combined with a sharp corner, heavy traffic and a steep slope: nevertheless, I eventually made it to the trolley park, breathless but triumphant. The park was completely empty of trolleys. I kid you not: not one single person appeared to have returned their trolley to its rightful home that day. In smiling possession of four, I was thus immediately set upon by multiple shoppers, all of them making a grab for one of the trolleys I had brought. You’re welcome!

I grant you that it is all too easy to bemoan the state of modern Britain, but sometimes it’s the little things that get you down. I’m not sure if I can put a date upon when the shift occurred, but I’m sure that there were indeed halcyon days when people dutifully returned their trolleys to the trolley park for the benefit of others. At the risk of sounding a little deranged, I’ve been pondering this for a week or more: when and why did people stop thinking that they had to return their trolley? After much musing, I think I’ve hit upon the source of the problem. It isn’t a symptom of poor parenting, it isn’t the state of our schools and it isn’t that people have somehow become inherently worse than they used to be. The issue, I believe, is that very few of us do our shopping in anything that even remotely resembles a community any more.

When my parents speak of their youth (a timespan ranging from the mid 1930s to the post-war period), both of them talk about local shops and local tradespeople. Everyone knew everyone else’s business, for better or for worse, and local businesses were at the very heart of the community. Shops and services were run by people you knew and that meant that those shops and services were places that expected and demanded respect and acknowledgement. Shop-owners were not a faceless corporation, they were members of the inner circle. If some local scallywag caused trouble for a local shopkeeper, there would be consequences and those consequences would have an impact on family and friends.

In such a community, it was shameful to be caught doing something thoughtless, because reports of such behaviour would be shared with other members of the neighbourhood. Both my parents recall being known to all the adults in their area and they can acknowledge both the privileges and the responsibilities that came with that fact. The privileges included feeling safe and looked after in their community, the sense that they could knock on anyone’s door at any time and ask for help; the responsibilities included knowing that any misdemeanours would get straight back to their parents! It suddenly occurred to me that very few of us feel either looked after or indeed feel judged and monitored in this way any more.

Very few of us feel vulnerable to any sense of shame about our routine behaviours, because we move through the world so anonymously, or at least we feel as if we do. Small acts of selfishness such as dumping our trolley at the side of the street will rarely if ever receive any kind of direct challenge or lasting consequence. As a result, people have gradually and unconsciously learned that they can get away with such thoughtless behaviour without an impact on their own lives. I honestly don’t believe that we are any less innately thoughtful than we used to be — it doesn’t make sense for such a seismic change in human nature to have happened so quickly; rather, it is the case that we operate in a world that does not expect us to be thoughtful and in which there are no consequences for our thoughtless behaviour.

There is so much to regret with the loss of local shopping: when I think of all the hand-wringing that is done about the state of the environment, so much of that could be solved or at least mitigated if we simply went back to local stores. Out-of-town supermarkets started to become the norm somewhere between the 1960s and the 1980s and I would argue that this caused a shift in people’s attitude towards buying produce. At one stroke, we started to feel like we were giving our money to big corporations, nameless and faceless profiteers that we all began to resent whilst at the same time demanding more and more of their wares. Within the next two generations came the internet and online delivery, meaning we didn’t even have to leave our homes to give our money to invisible people. As a result of all of this, retail as a concept has taken on an identity of its own and is completely detached from humanity.

When we do leave our houses to circulate around the premises that such corporations set up for their customers, the distaste on all sides is palpable. Despite what the advertisements would have us believe, it is obvious that customers feel neither liking nor obligation towards such corporations, and likewise the companies themselves display barely-concealed disdain towards their customers. Buying and selling now operates in an open atmosphere of mutual contempt. If you think I’m exaggerating, then perhaps you’ve never shopped in a large megastore in one of the poorest parts of the country. To quote the words of Jarvis Cocker in his 1995 classic, Common People, “I can’t see anyone else smiling in here.” Nobody smiles and nobody talks to each other. Everyone beats a path to the automated check-outs so they don’t have to interact with a human being before they leave the store. Virtually everything is tagged because theft is so rife, another consequence of people feeling so detached from their store-merchants: research indicates that most people now believe shoplifting to be a victimless crime.

But before we get too depressed, let’s all resolve to do better. While we might indeed be forking out our money to a giant company we don’t care for, itself owned by one of the handful of global corporations that appear to own and control the entire universe, let us not forget that within those conglomerates there are hundreds of thousands of individual people like us, people who work and shop on their premises. Let us not lose sight of our individual humanity, which I believe we still possess in bucketloads: it is simply that we are operating in a world that makes us feel isolated and unmoored, disconnected from the sometimes bewildering number of other humans that move around us. As the population increases, there is a painful irony in the fact that we all seem to feel more and more alone inside it. But as a tiny drop of a potential antidote, how about this for a New Year’s resolution? Next time you see an abandoned trolley and you’re heading towards its homeland, why not pick it up and take it with you? You might be surprised how good it makes you feel.

Photo by James Watson on Unsplash

New Year celebrations: a Roman legacy

While people around the world have been engaging in the tradition of celebrating the New Year, have you ever wondered where this custom originated? To uncover the roots of New Year celebrations, we must (of course!) journey back to ancient Rome, where the calendar and many of the traditions we take for granted today began to take shape.

The Roman calendar initially bore little resemblance to the one we use today. In its earliest form, it was a 10-month system that began in March, a month named after Mars, the god of war. The year ended in December, with a winter period left unaccounted for in the calendar — a gap that made the year extraordinarily difficult to track. I have written before on the phenomenal mess that the Romans got themselves into with their calendar, so I shan’t re-hash it all here, but suffice to say they really did get it spectacularly wrong.

Back in 713 BCE, Numa Pompilius, the second king of Rome, introduced two additional months, Ianuarius (January) and Februarius (February), to bring the number of calendar months up to twelve. The month of January was placed at the beginning of the year and dedicated to the god Janus, making it an appropriate time for reflection and planning for the future. Janus was the god of beginnings, transitions and duality. Often depicted with two faces — one looking to the past and the other to the future — Janus symbolised the liminal space between old and new, making him the perfect patron of New Year’s celebrations. His domain included doorways (ianuae in Latin), thresholds and gateways, as well as moments of transition such as the beginning of a journey or a new phase in life.

In Roman religion, Janus was invoked at the start of any significant endeavour, whether it was the launching of a military campaign, the construction of a building or the start of the agricultural season. His presence at the beginning of the calendar year cemented the idea of looking both backward in gratitude and forward with hope. Naming the first month after Janus thus underpins the idea of the New Year as a moment for reflection and resolutions.

The start of January was a time for Romans to engage in rituals and festivities. Celebrations included exchanging gifts, such as coins or small tokens, which were thought to bring good fortune for the year ahead, and decorated laurel branches were also exchanged as symbols of prosperity and victory. The Romans adorned their homes with greenery and lit candles, symbolising the hope for illumination and guidance in the coming year. Sacrifices to Janus were made, and prayers were offered for peace and prosperity. The tradition of making New Year’s resolutions can trace its lineage back to this time, when Romans would pledge to improve themselves in the coming year, offering vows to Janus as part of their commitment. The Roman empire’s vast reach ensured that its calendar and traditions left a lasting imprint on the regions it governed. Even after the fall of Rome, the Julian calendar, introduced by Julius Caesar in 46 BCE, remained in use across much of Europe.

Caesar’s calendar reforms, which I discuss in my blog post on Roman calendars, were significant not only for standardising the length of the year but also for firmly establishing January as the beginning of it. This decision was partly practical and partly symbolic. By aligning the calendar with the solar year and by dedicating its beginning to Janus, Caesar reinforced the notion of January as a time for renewal. Over time, Christian Europe adopted the Julian calendar, and while some regions initially celebrated the New Year on different dates, January 1st eventually became the standard.

While many modern New Year customs have their roots in Roman practices, they have evolved over centuries and absorbed influences from various cultures and religions. For instance, the Christian Church initially resisted the celebration of January 1st as New Year’s Day, associating it with pagan rituals. However, by the Middle Ages, the Church had incorporated the date into its liturgical calendar, marking the Feast of the Circumcision of Jesus.

New Year’s Day stands as a testament to humanity’s enduring desire to mark the passage of time and embrace renewal. The Romans’ choice of January, their veneration of Janus and their customs of gift-giving and reflection have profoundly shaped the way we celebrate the New Year. Though centuries have passed and cultures have changed, the essence of New Year’s traditions — hope, renewal and connection — remains timeless.

So, as we ring in the New Year, we honour not only our aspirations for the future but also the rich tapestry of history that has brought us to this moment. In every resolution made and every toast raised, the spirit of Janus lives on, guiding us through the thresholds of time.

Photo by Tim Mossholder on Unsplash

Felix Nativitas

Christmas did not begin its story in a vacuum. It arose within the vast and vibrant Roman Empire, a place where countless gods, rituals and traditions were already woven into the rhythm of everyday life. When early Christians eventually shaped their own celebrations, they did so whilst living among people who already marked their calendar with festivals, feasts and customs. Christmas was a celebration which developed in conversation with the pagan world around it, and echoes of ancient Roman festivities can still be heard to this day.

Before Christmas ever graced a church calendar, the month of December belonged to Saturnalia, the most beloved festival in the Roman year. Dedicated to Saturn, the god of agriculture, Saturnalia was a season of feasting, public merriment, exchanged gifts and an inversion of ordinary social rules. Slaves were permitted to dine alongside their masters, ordinary citizens dressed in colourful clothing and laughter filled the streets. For the Romans, Saturnalia was a cherished invitation to joy and generosity, when daylight was at its shortest.

As Christianity spread across the empire, its followers could hardly avoid the fact that they were living beside these exuberant customs. They worked, traded and travelled among people who had long found comfort in Saturnalia’s festivities. Even while Christians rejected the worship of pagan gods, the rhythms of the culture around them could not simply be dismissed. The earliest believers did not yet celebrate Jesus’s birth. Easter, with its promise of resurrection, held far greater importance at that time, and still does in many parts of the world. But the season of Saturnalia left a deep imprint on the Roman imagination, an imprint that would shape the Christmas period in centuries to come.

Another celebration, emerging later but carrying immense symbolic power, prepared the ground for what would eventually become Christmas Day itself. On the 25th December, the Romans honoured Sol Invictus, the Unconquered Sun. This was the moment in the year when the sun, having reached its lowest point in the winter sky, began its slow ascent once more. Light returned, day by day, and darkness lost its hold. As a sufferer of mild Seasonal Affective Disorder, I am still somewhat obsessed with this, and track the progress of the sun’s re-emergence on an app on my phone. The emperor Aurelian was perhaps a fellow sufferer, for he elevated the sun god to renewed prominence in the third century, building a temple in his honour and giving the festival the stamp of imperial authority. The symbolism was unmistakable: the rebirth of the sun signalled renewed strength, hope and the promise of triumph.

The imagery of light returning to the world resonated with early Christians. Long before Christmas existed, Christian writers were describing Jesus as a radiant presence — a light that shines in the darkness, a sun of righteousness. When the time came to choose a date to mark the birth of Christ, an alignment with the festival of the Unconquered Sun carried a poetic logic. Winter solstice celebrations existed across many cultures, and Christians, surrounded by a world that rejoiced at the return of daylight, found in them a natural metaphor for their own faith.

Yet the decision to celebrate Christmas on December 25th did not happen quickly. For centuries, Christians debated whether Jesus’s birthday should be celebrated at all. Some early theologians went so far as to criticise such birthday celebrations as pagan excess. In the end, theological reasoning blended with cultural reality, and a compromise was reached. The celebration of Christ’s nativity was drawn into the orbit of Rome’s winter festivals.

Once Christianity gained legal recognition under Constantine in the 4th century, church leaders faced the challenge of guiding a vast and diverse population into a new religious identity. The empire still carried the customs of Saturnalia, the reverence for Sol Invictus and countless other local traditions. Abolishing such celebrations outright would have caused confusion and risked civil unrest. Instead, Christian leaders chose the path of least resistance: they recast familiar festivities with new meaning. They did not graft pagan worship onto Christianity, but they repurposed cultural habits — gift-giving, feasting and decorating homes — to fit the story that they wanted to tell. In doing so, they allowed people to continue the customs they loved whilst shifting the spiritual focus.

Christmas grew within this climate of adaptation and reinterpretation. Many of the customs that now feel inseparable from the holiday were once part of Roman winter traditions. The exchanging of gifts, once associated with Saturn’s festival, found a new home in the tale of wise men bearing offerings for a newborn child, and in the Christian emphasis on charity and care for the poor. Feasting and joyful gatherings continued, now wrapped in the language of celebration for Christ’s birth rather than Saturn’s agricultural blessings. Lights and candles, once meant to honour the returning sun, became symbols of the divine light that entered the world in Bethlehem according to Christian belief. Even the greenery that adorned Roman homes during winter — a symbol of life persisting in the cold — survived in later centuries as wreaths, boughs and eventually the Christmas tree.

Such continuities do not make Christmas a pagan holiday in disguise. Rather, they reveal how cultural transformation naturally unfolds. Christianity, growing from a small sect into the dominant religion of a sprawling empire, had to find ways to speak to the hearts and habits of its people. In Rome, this meant placing the celebration of Jesus’s birth in a season already rich with meaning, then slowly reshaping that meaning through worship, stories and symbolism.

As centuries passed, Christmas continued to evolve. Medieval Europeans added their own layers of tradition: plays, feasts and symbolic foods. Later still, modern customs from Victorian England and American culture reshaped the holiday yet again, giving us carols, cards, Santa Claus imagery and the commercial bustle that now defines the season, for better or for worse. But beneath all these layers, the ancient Roman foundations still flicker like candlelight. The joy of gathering with others in the dark of winter in anticipation of the increasing daylight to come; the encouragement to be generous and think of others in need; the glow of lights that promise warmth and renewal. All these traditions echo the old festivals that once marked December long before Christ was born.

Understanding this intertwined history should not diminish Christmas for anyone, Christians included. The holiday stands as a testament to humanity’s enduring desire to find meaning in the dark months, to celebrate hope’s return, and to bring warmth into the coldest part of the year. Through Christianity’s encounter with Rome’s festivals, the season became a bridge between worlds — between old gods and the new faith, between ancient customs and evolving traditions, between winter’s chill and the promise of returning light. In that sense, Christmas is not merely a date on the calendar, but a centuries-long story of cultural evolution, a process that is still unfolding each time December rolls around.

Photo by Mariana B. on Unsplash