Delayed gratification

This week I have found myself having a very stern conversation with one of my cats. Her name is Piglet. Piglet by name, piglet by nature. The animal simply cannot help herself when it comes to food. If she had her way, she’d be the size of a house, hauling her enormous belly around like a competitor in the World’s Strongest Man. Fortunately – or unfortunately, as far as she is concerned – she has mean old me controlling her food intake.

So, Piglet and I had to have a very serious conversation about her life choices. This is a cat that was in line to receive some small pieces of chicken as a treat. See, I’m not always mean: I had even taken the pieces out of the fridge to bring them up to room temperature. Piglet, however, elected that evening to wolf down the remaining supper of our other cat, Dolly, who is currently being rather delicate about her food intake. Dolly is in the early stages of renal failure and so is on a specialist prescription diet. My back was turned for a nanosecond and I failed to register that she had walked away from her food, so I turned around to find Piglet urgently inhaling the last scraps of Dolly’s prescription dinner.

“You could have had some chicken pieces this evening!” I admonished her. “As it is, you’ve made the choice to eat the prescription cat food, so now you’re not getting anything else.” She stared at me, unmoved and unimpressed, still cleaning her whiskers after the extra feed she had claimed for herself.

In reality, of course, the cat’s brain is not capable of understanding the point. She’s a very smart cat, but she has not yet mastered English, nor has she worked out that stealing the prescription cat food means missing out on her chicken treats. She is also – being a cat – incapable of exercising delayed gratification, something which human psychologists and the world in general like to cite as a crucial indicator of our future success as adults. Or is it?

I am quite a fan of The Studies Show, a podcast hosted by two science writers, Stuart Ritchie and Tom Chivers. In each episode, they debunk various stubborn myths that persist either as a result of poor science or as a result of the science being poorly reported or interpreted (or both). They investigate how science is at the mercy of human bias like any other discipline, and explain concepts such as confounding, publication bias and collider bias (I am still struggling to grasp the last one in full). In one particular episode, they explore the experiment nicknamed “the marshmallow test”, which was hailed as a groundbreaking study into impulse control in very young children, with some quite extraordinary claims made about how the findings were linked to future success in several walks of life – in education, in financial stability, in relationships and in health.

In various tests performed on 4-year-olds at Stanford University in the late 1960s and early 1970s, psychologists offered several hundred children a choice: take one sweet treat immediately, or wait for an unspecified amount of time – during which the psychologist left the room – and receive two. The waiting times varied but could be up to 20 minutes. One point, made hilariously by Tom Chivers during the discussion, is that some smart four-year-olds might already have had a sound understanding of the value of their own time. “You know what, one marshmallow isn’t worth 20 minutes of my time, mate!” he imagines them saying. Stuart Ritchie then ponders whether marshmallows were a significantly bigger deal in the 1970s than they are now – what kid in the mid-2020s is going to wait 15 or 20 minutes just for one extra marshmallow?

The issues with the study are many, but the most dubious are the claims extrapolated from two follow-up questionnaires, which were answered by only around 100 of the original 653 participants – meaning that more than 80% of the children were not included in the follow-up studies that looked at them in later life. Chivers and Ritchie also point out that the original test was confounded by the fact that different children were given different coping strategies to assist with the waiting time – some, for example, were encouraged to use distraction techniques, others to focus on the end reward. This is because the original purpose of the research at Stanford was to find out which coping strategies would best help children to delay gratification; the idea of following them up to see which children became more successful in later life came some time afterwards, which may explain why Stanford lost touch with so many of the participants.

However, it is the later follow-up studies that caused all the excitement, as they supposedly found a remarkably strong correlation between later success and the length of time that the children had managed to wait for their reward. The claim – of course – turns out to be nonsense. The correlation only held for children who had not been offered any coping strategies to help them delay the gratification, which rather raises the question of why the primary author of the study believed so strongly in the teaching of delayed gratification as a life strategy. Far more importantly, the correlation all but disappeared in replication studies, once controls were introduced for socio-economic background and previous academic success, both of which are far more obvious predictors of future academic attainment and overall success.

Chivers and Ritchie link the wild extrapolations from this particular study to similar attempts to introduce the concept of “growth mindset” in schools, another area of academic research that they take a sledgehammer to in a previous episode. I remember this particular fad very well: at the time, my school had one particular Senior Manager who had read Carol Dweck’s book Mindset: The New Psychology of Success and was a shiny, happy acolyte for the idea that the tiniest shift in rhetoric – basically, praising kids for working hard rather than for their smarts – would somehow revolutionise their success in the classroom. It may not surprise you to learn that it didn’t, and that the studies in this area have since been shown to prove nothing of the sort.

This is not to say that delaying gratification is not an important skill. It is, of course, an important part of growing up and becoming a successful adult to learn to place tasks in an order of importance and/or urgency, rather than focusing entirely on what you would most like to do in the moment. Studying for an exam, preparing for a competition or an interview, exercising and eating the right things for the benefit of your long-term health are all common goals which require this skill. In my experience, children acquire the ability to delay their gratification at different rates: while some teenagers have fully mastered the process, others are still grappling with their motivation and find it really hard to set aside the things they enjoy the most in order to focus on something important but less interesting. One of the greatest things that schools can do is thus to focus on assisting children in their ability to concentrate, as a lack of attention in class remains by far the biggest barrier to academic success for many of our most vulnerable students.

In the meantime, Piglet remains at the mercy of her desires and will no doubt continue to make a lunge for every tasty morsel she can find in her path. I have often said that one of the joys of keeping a cat is that they teach you how to live your life. Speaking as someone who doesn’t always remember to reward herself just for the hell of it, I find that Piglet serves as a feline reminder that sometimes making a dive for the thing you crave the most is to be recommended.

Piglet, who can only delay her gratification while sleeping

France is bacon and other misconceptions

When I was young, my father said to me: “Knowledge is power, France is bacon.” For more than a decade I wondered over the meaning of the second part and what was the surreal linkage between the two. If I said the quote to someone, “Knowledge is power, France is bacon,” they nodded knowingly. Or someone might say “Knowledge is power” and I’d finish the quote, “France is bacon” and they wouldn’t look at me like I’d said something very odd, but thoughtfully agree. I did ask a teacher what did “Knowledge is power, France is bacon” mean, and got a full 10 minute explanation of the “knowledge is power” bit but nothing on “France is bacon.” When I prompted further explanation by saying “France is bacon?” in a questioning tone, I just got a “yes”. At 12 I didn’t have the confidence to press it further. I just accepted it as something I would never understand. It wasn’t until years later I saw it written down, “Knowledge is power,” Francis Bacon, that the penny dropped.

Anonymous post on Reddit, 2011.

The ease with which such misconceptions can arise is something that all teachers should be aware of. Most likely, you can remember some of your own from childhood. For me, most memorably, it was the phrase “rich as Croesus”, which my mother used regularly. As a kid, unsurprisingly, I’d never heard of the ancient Lydian king of legendary wealth, so what I heard was “rich as creases”. For years I wondered what being rich had to do with having creases, or why creases were considered to be the same thing as being rich. I just put it down to one of those weird things that grown-ups say.

Before we berate children for a lack of intellectual curiosity (why on earth didn’t I just ask … ?), we should remind ourselves that pretty much everything adults say or do can seem puzzling on some level to very young children. It is not surprising, therefore, when they shrug and accept a saying that makes little obvious sense, or something that they are told is a truism: nothing makes obvious sense when you’re small.

Further to that, the account of the child who heard “France is bacon” illustrates the anxiety felt by most children: that they have at best missed something obvious, or are at worst inherently stupid. You can feel the child’s unease as they anxiously test the waters with the various ways in which they attempt to have the saying explained to them. Even the teacher completely missed the opportunity to correct the misconception, because they did not realise where the misconception lay. This illustrates our tendency as teachers to assume that we already know what it is that a child needs to have explained: in this case, the teacher assumed that the child was puzzled by the underlying message of the saying – in what sense can knowledge bring power? What the teacher actually needed to do was to ask the child why they were asking – what was puzzling them about the quotation? Had the teacher done so, the misconception would have been identified and rectified.

One of the things that I love about tutoring is the opportunity that the one-to-one setting brings to uncover such misconceptions or gaps in a child’s knowledge. This is partly because of the time and focused attention that it affords, but also because it offers a child the chance to ask all of those “stupid” questions that they’ve been bottling up for years. Nothing brings me greater joy than a tutee who develops the confidence to interrupt me and demand an explanation for something, or to ask me a question that I did not realise they needed to ask. That is when the relationship between the tutor and their student has really developed: when a child gains the confidence to demand the most out of their sessions.

Just recently, I was reminded how careful we need to be when assuming what a child knows. I showed my tutee the translation of a Latin poem by Catullus, which contains the metaphor “my purse is full of cobwebs”. Now, I went in with the assumption that the child might need encouragement to grasp the metaphor, as many children do not find these as easy as you might expect. During the discussion, however, I discovered that she did not in fact know what “a purse” was. There was no chance of her understanding the metaphor until that was rectified! It had not previously occurred to me that this might be a word that a 16-year-old might not know: but if your family have always used the word “wallet”, or your parents carry their change in their jeans, or – as is becoming increasingly the norm – they don’t really carry cash at all, then maybe it is simply not a word you have come across. We should never, ever assume.

Misconceptions that arise from mishearings such as “France is bacon” or “rich as creases” also illustrate the essential importance of dual coding. A couple of years ago, I realised that one of my tutees was convinced that the dative case had something to do with numbers. After a couple of minutes of trying to explore where this misconception had come from, I suddenly realised what had happened: his teacher had (quite rightly) taught his class that the dative case was to be translated as “to” or “for”. My tutee, however, had heard “two” or “four”. He heard numbers instead of words, and he had been understandably confused ever since. Yet had the teacher simply written the words “to” and “for” on the board as well as saying them out loud, this misconception would have been avoided. So many people confuse dual coding with the idea of simply putting a nice picture on their handouts, or with the ridiculous belief that illustrations are essential for basic vocabulary learning. Not a bit of it. Dual coding is the process of combining words with a visual stimulus, and it is used to help the brain to grasp a concept without misconceptions: using a visual representation of what you are explaining in written words, or writing down what you are explaining verbally.

Children will always form misconceptions and that fact is nothing to be feared. It does, however, mean that teachers must be particularly alert to them, and to the methods most likely to help resolve them or to prevent them from forming in the first place.

Photo by Daniele Levis Pelusi on Unsplash

The role of the translator?

The festive season would not be complete without an EduTwitter bust-up, and this year there was more than one. Pleasingly, the one that has rumbled on the longest is a controversy surrounding Homer’s Odyssey, sparking a rare burst of attention for Classics in the broader world of education. The debate started with some people arguing about whether school-teachers have read or taught this text in schools, or whether they should. Quite why anybody cares remains a puzzle. After a few days, however, the row spread to a much wider audience and mutated into reactions to a recent translation from the original Greek by Emily Wilson, who in 2018 became the first woman to publish a full English translation of the Odyssey, to much fanfare.

Predictably, people’s reactions to Wilson’s translation fall into clear political camps. The Guardian hailed it as “groundbreaking” and a feminist interpretation which will “change our understanding of it forever”, while critics more right of centre breathed anxiously in and out of a paper bag and muttered dire warnings about the decline of the West. It’s all very silly, but in amongst all the hysteria there has been some unintentionally thoughtful commentary: sometimes, people don’t even realise that they’ve said something interesting while they are trying to score a political point. “The job of a translator is not to attribute postmodern ideas of sex oppression to a writer who has been dead for 3000 years,” raged Charlie Bentley-Astor. Broadly, I don’t disagree with her, but I found myself pondering: what is the role of the translator, exactly?

Those who have not studied languages, particularly ancient languages, might find this question bizarre. The role of the translator, surely, is to reproduce the text as faithfully as possible in a different language? Well, yes. But you see, it depends what your priorities are, and which aspects of the text you believe it is most important to remain faithful to. The spirit and mores of the times? The lyrical qualities of the original? Its readability? And what is the purpose of your translation? To support the study of the text in the original language? Or to open up the text to a wider audience, who will never have the chance to study it in the original Greek? These are just a handful of the questions that a translator must ask themselves. The translations that I produce for students who are studying a text in the original Latin are clunky and unsuitable for publication. This is because their sole purpose is to facilitate the students’ understanding of the Latin text in front of them, on which they will be questioned in an examination: I do not produce my work for the pleasure of a general audience, so I am not aiming at fluidity, readability or beauty, all of which are potentially important when publishing a translation for a wider readership, reading purely for pleasure.

The power of the translator is immense, and those who are exercised by Wilson’s approach are upset by the fact that she has been credited with approaching the text from a more feminist standpoint, potentially imbuing it with a set of values that could not have been imagined by Homer himself. Yet I simply do not understand the hysterical reaction of some conservatives, who seem completely oblivious to the fact that this interpretative dance has gone on since the dawn of time. Every translator stamps a text with his or her own priorities, and every translator knows that. If you pick up a translation of an ancient text and honestly believe that it will give you a faithful, full and accurate rendering of the original author’s meaning and intention, then you are deeply naïve, for this is impossible. It is for this reason that I have never understood those who claim to understand the “word of God” when they have not studied their own religious texts in the original languages in which they were written. So, Christians, off you go to learn Hebrew and New Testament Greek!

Let’s just take one very simple example of the problem. Imagine that I were producing a translation of Virgil’s Aeneid. How would I render the phrase “imperium sine fine”, which is what Jupiter states that he will grant to the Romans? The word imperium has multiple meanings and can be translated as “power”, “command” or “empire”. It had some quite specific and technical meanings in relation to the command that a general had over a region, but it was also tied up with the Roman belief that the expansion of their dominion was a fundamentally good and noble thing: this included the exercising of power over other nations and the geographical expansion of their borders. The phrase sine fine could be rendered “without end” or “without borders” – it refers both to the physical extent of the Roman empire and to the Roman belief that their domination was unlimited not only in terms of their relationship with the world, but also in time. The phrase is therefore deeply resonant in the Roman mindset – that their empire, their military might, their control over the world was divinely granted: it had no borders and it would last forever.

I would argue further that it is not only the layered meanings that such a phrase had for the Romans that have to be considered when translating this phrase now. As readers from a modern perspective, in the full knowledge of the decline and fall of the Roman empire, the phrase imperium sine fine has a poignancy for us that Virgil could not have imagined. This does not mean that its meaning to a modern audience does not have value – quite the opposite. There would be little point in the survival of ancient texts if they were not to strike resonances within us as a result of the changes that have taken place since they were written.

The importance of capturing the spirit of a text, over and above remaining faithful to its construction, is a challenge faced by those who convert a classic novel into a film or a drama. The 1992 film version of John Steinbeck’s Of Mice and Men, directed and produced by Gary Sinise and starring John Malkovich, is – in my opinion – an absolute masterpiece of this spiritual capture. The opening scenes are entirely invented by the film-makers: a terrified woman in a torn red dress runs across farmland, then we see men on horseback who appear to be in pursuit of the woman’s assailant. Those of you who know the novel will understand exactly the background to George and Lennie’s situation that this represents, and in my view it was a brilliant leap of imagination to transfer the information to film in this way.

Conservatives who fear “incorrect” interpretations of a text fail to understand that the enduring appeal of a text lies in its interpretation. Believe you me, Aristophanes did not have a feminist slant in mind when he wrote Lysistrata, a comic play poking fun at the incompetence of the Athenian political intelligentsia, who were doing such a God-awful job that even the women could probably do it better! That was the joke, for the Athenian audience. Yet Lysistrata is – inevitably – read and performed as a feminist play in the modern setting, and I have enjoyed productions that have rendered it thus. So I do find myself chuckling at the rising hysteria expressed by many who seem so terrified by the fact that Homer has now been translated by a woman. I do wish they could understand that Homer will survive: he’s big enough and man enough to take it.

Photo by Becca Tapert on Unsplash

Stress? What stress?

For various reasons, I’ve been thinking about stress. More specifically, stress relating to the work that people do. As we bed down into the holiday season (for some, I have read, quite literally), there will be people reading this who find themselves wondering where they will find the strength to go back into work.

While everyone will experience work-related stress from time to time, it is a truth universally acknowledged that some jobs are apparently more stressful than others. This universally accepted truth is riffed upon beautifully in an old Mitchell and Webb sketch, which I won’t link to because it gets a bit post-watershed towards the end. The scenario drawn is one partner coming home from a tough day at work as a paediatrician, working with sick and dying children; the running gag is his earnest desire to reassure his partner, whose job entails tasting new products at an ice-cream factory, that their careers are both equally important and equally pressurised. “Just because I’m a paediatrician dealing with severely ill children, doesn’t mean that you can’t have a tough day tasting ice cream,” he says.

People have wildly varied takes on the levels of stress that they assume come with classroom teaching. Some people seem irrevocably wedded to the idea that teachers are work-shy layabouts who finish at 4.00pm on the days that they do work, and luxuriate in an almost unlimited supply of holiday time when they don’t. I lost count of the number of times someone hurled the “long holidays” line at me like it was a brilliant gotcha. After a while, I used to hurl it back. “Teaching is a fantastic job,” I would say. “Did you know that there is currently an enormous drive to get more people into teaching? Given how convinced you are of the benefits, shall I send you a link to the courses that are recruiting? You even get paid to train!” That usually shut them up.

There have always been people who think that teaching’s a breeze; there are plenty of others who believe that it is horribly stressful. At times, they were right. The average classroom teacher will not find themselves in charge of a multi-million pound budget, nor hiring and firing, nor indeed presenting their work to a roomful of demanding CEOs – but I’d like to see those same CEOs try their hand at managing a roomful of Year 10s on a hot afternoon when there’s a wasp in the room.

Let’s be honest: my subject, in the grand scheme of things, is relatively unimportant. While I can bang the drum about Latin being A Good Thing for all students, let’s not be silly about this: whether or not a student attains a respectable grade in their Latin GCSE is not going to affect their life-chances (unless their life-plan is to become a Professor of Classics, and even then there are ways around that particular problem). However, most Latinists who work – as I did – in the state sector will find themselves expected to earn their keep by offering at least one other mainstream subject. For me, that was English. As a result, I found myself solely responsible for the GCSE English grades of several cohorts. This included sets where there was an enormous focus on what used to be the C/D borderline, and sets whose chances of making it to that borderline were considered slim. In very real terms, this meant that I was directly responsible for students’ life chances.

I am not being over-dramatic, I don’t think. In all honesty, whether a child attains a pass grade in both English and Maths will shape their destiny in ways that few people outside education are fully aware of: a child who does not attain these GCSEs is largely condemned to a life on minimum wage. Of course, there are exceptions, including many successful entrepreneurs who take pride in citing their scholarly failures as a badge of honour. I’m glad for them that they overcame this hurdle, but a hurdle it is, and one which proves impossible for most to overcome. I have never cried more tears of joy than when my students who had been classified as unlikely to pass managed to do so. For them, it quite literally meant the difference between poverty and a fighting chance. These kids, by the way, fought me every step of the way, and if they’d had their way they would never have sat the exam in the first place. That, I would argue, is a considerable pressure, one faced by thousands of teachers across the country every year: helping kids to get over a barrier while they do everything in their power to remain behind it.

Another factor which many people fail to appreciate is the number of safeguarding concerns that the average teacher is exposed to during their career. I never specialised in pastoral care and did no training in safeguarding beyond that which is expected of anyone working with young people, yet in my time I came across cases of neglect, of child sexual exploitation, of child criminal exploitation, of illegal drug use and more besides. On the penultimate day of my 21 years at the chalkface, I became aware of what I feared might be a case of FGM and was urgently summoning Designated Safeguarding Leads to my classroom for advice, all while maintaining a calm demeanour and continuing to run the classroom and teach my lessons as if nothing were afoot. This is the kind of thing that teachers do every day, and I am not sure that people outside the profession realise it. We don’t talk about it much, partly because it’s not appropriate, but partly because it is – or has become – the norm. It is not unusual for teachers to be working with children who are experiencing genuine trauma; it is not unusual to be painfully aware of some deeply troubling circumstances that a child may be experiencing at home.

For most of my career, I loved my job. I also considered it considerably less stressful than the jobs of more high-powered friends, who managed large budgets or were responsible for people’s livelihoods in their businesses. Yet sometimes I would remind myself that I was, in many ways, responsible for people’s livelihoods too. A teacher can shape someone’s future in unimaginable ways, and their influence – for better or for worse – can dictate which doors are open and which ones are closed in the future. If you are a teacher, never underestimate that power.

Photo by Stormseeker on Unsplash

Responsive Tutoring

One of the most powerful tools for promoting student progress is what’s called assessment for learning (AfL). When I was first teaching and the phrase was all the rage, you wouldn’t have passed an interview without mentioning it. While the acronym AfL is heard less often these days, the idea still underpins modern teaching.

The thinkers credited with the founding principles behind the use of AfL in the classroom are on record as saying they wish they’d called it something else. Rather than “assessment for learning”, they wish they’d called it “responsive teaching” and I can see why. In many ways, AfL is about neither assessment nor learning – at least, not in isolation. AfL, or rather responsive teaching, is about what a teacher does differently in response to where their students are in terms of their understanding.

While summative assessments (such as a GCSE examination) focus on evaluating final outcomes, AfL is embedded in day-to-day teaching in order to gauge students’ progress, clarify misunderstandings and – most crucially – to guide further learning. Effective use in the classroom presents a unique set of challenges for teachers, especially when working with larger groups. The process is infinitely easier in a one-to-one setting, where the dynamic between the tutor and the tutee shapes the entire process.

Responsive teaching is meant to be a continuous loop: evidence is gathered, interpreted and used to shape a teacher’s instructional decisions. AfL can also be used to help students to recognise their own current level of understanding and set goals for improvement. It is meant to be an ongoing, dynamic process and requires teachers to have a nuanced understanding of each student’s needs, strengths and areas for improvement. To be effective, AfL requires not just frequent feedback but feedback that is individualised and actionable. In a one-to-one setting, a tutor can meet these requirements quite naturally; in a classroom with multiple students, the process becomes complex, requiring considerable skill and resourcefulness from the teacher.

When implementing AfL in the classroom, teachers encounter several challenges that are unique to managing large groups. In a classroom of 30 students, teachers must balance AfL with the demands of covering the curriculum, managing behaviour and addressing a multitude of diverse learning needs. The time constraints are significant. For each student, providing specific feedback and tailoring instructional adjustments is an ideal that is often close to impossible to achieve in practice. In any single lesson, a teacher may only have a minute or two to focus on each student. This time is rarely enough for comprehensive feedback, making it challenging to provide meaningful guidance on areas for improvement.

In larger classrooms, teachers have to rely on quick, general assessments, such as asking questions of the whole class or using hand-raising methods, but these approaches can miss individual nuances and provide only superficial insights into each student’s understanding. Real-time feedback is essential for the process to work, but logistical challenges mean that teachers sometimes delay feedback until they can examine students’ work. This delay can diminish the impact of the feedback and may hinder a student’s immediate progress. It also places a significant workload burden on the teacher: even schools that have understood and embraced the principles behind whole-class feedback still place a considerable assessment burden on the classroom teacher in terms of work that must be completed outside the classroom.

In any classroom, some students may actively participate and show enthusiasm, while others remain quiet or withdrawn. Unless a school has fully embraced and embedded the principles of “no excuses”, teachers will struggle to gauge the understanding of all students. Ensuring equal participation is challenging, and without specific engagement from each student, teachers may only get a partial view of the overall class understanding. Implementing AfL strategies requires significant time and energy, which teachers often need to dedicate instead to managing classroom behaviour. Students can become disengaged, especially if they don’t immediately understand a lesson or find it challenging. The need for behaviour management can take time away from delivering AfL, reducing the effectiveness of feedback and lesson adaptation.

By contrast, one-to-one tutoring provides an environment where AfL shapes and defines the entire process. In a one-to-one setting, the tutor’s focus is exclusively on a single student, and this individual attention means the tutor can tailor questions, feedback and guidance specifically for that student. Any misconceptions or gaps in knowledge are immediately identified and addressed, without the need for complex assessment. For example, a tutor might notice hesitation in a student’s response and immediately reframe the question to clarify understanding. This kind of personalised, immediate and dynamic intervention is impossible in a classroom.

In tutoring, feedback is instant. If a student misunderstands a concept, the tutor can pause and offer corrective feedback on the spot. There is no need to wait, no need to press ahead with the curriculum. This timely response to a student’s needs helps to solidify learning and build confidence, making AfL truly effective. Tutoring allows for a flexibility in pacing which simply cannot happen in the classroom. A tutor can spend as much time as necessary on a particular concept, adjusting the level of challenge to ensure that a student remains engaged. For example, if a student masters a topic quickly, the tutor can introduce more complex material. Conversely, if a student is struggling, the tutor can slow down, review foundational concepts, or use alternative explanations.

One-to-one tutoring fosters a relationship in which the student may feel more comfortable expressing misunderstandings or asking questions. I actively praise my students for interrupting me and asking questions, although I am careful to highlight for them that this is the right environment in which to do so; it is important to me that I support classroom teachers by clarifying to students that they cannot – nor should they – demand this level of individual attention and feedback in the mainstream classroom.

Photo by Element5 Digital on Unsplash

An actual Nazi on campus?

It’s been on my mind to write about this for a while, but I was waiting for the right trigger in current events. This week, news has broken that a student at Leeds University has been suspended from her work at the student radio station and investigated by the Students’ Union for, allegedly, “not acting in a duty of care,” putting the “health and safety” of members at risk, not “upholding the values” of Leeds Student Radio and the Student Union, and “bringing the reputation of the University, the [Student Union, or Leeds Student Radio] into disrepute.”

I’d already had some online contact with Connie Shaw, as she seemed to me to be a very impressive young woman who had been treated quite outrageously by her university, and I sent her a message to that effect. Connie was interrogated by the Union about her “gender critical views” (which are protected in law) and it seems pretty clear that the apparent complaints about her “conduct” arise from the fact that she has launched a podcast on which she interviewed Graham Linehan, Andrew Gold and Charlie Bentley-Astor; these are all people who have had personal experiences and/or hold views that do not align with the prevailing narrative on a typical university campus these days, so Connie has found herself in a whole heap of trouble. Unfortunately for Leeds, Connie is not somebody to be pushed around or silenced, and her predicament has now been highlighted in the national press, so many more people are aware of what has happened to her.

I wish my recollections of the situation I want to contrast this with were clearer, but when I was at university I really was not involved with Union politics. I made sure to vote for representatives, as I have always believed that voting is important. One of the things that has driven me absolutely wild over the many years that I have spent signed up to various Unions is that the average member rarely votes. The number of conversations I have had with people who bemoan the fact that their Union committee is dominated by political zealots while admitting that they don’t bother to vote makes me want to bash my head against the wall. I will point out until the end times that the reason why so many Unions are dominated by representatives with extreme or bizarre views is that people with extreme or bizarre views get off their butts and run for office, and people who support those views get off their butts and vote for them. The problem is rarely the extreme or bizarre views themselves (which are not held by the vast majority of Union members); it is the apathy of the majority which allows them to thrive. So, yes, I always voted. My only other involvement was as a volunteer for the Nightline service, a telephone support line manned by students and modelled on the service run by the Samaritans. But that was it. I didn’t go to hustings and I wasn’t involved with the day-to-day drama of Union politics.

Despite my lack of involvement, even I managed to hear about the fact that we had a Nazi on campus in 1992. “Nazi” is an over-used word these days and Connie Shaw has joked about being called “a Nazi” by those who disagree with her. It is beyond doubt that, in the current climate, this ridiculous insult is regularly rolled out by people on the political left when they don’t like what somebody else is saying. But this was university in 1992: there were no websites, no chat rooms, no social media, no hashtags and no mobile phones. We used to leave a note on our doors to tell friends where to find us. These were different times in every sense: I recall hearing another student making an openly homophobic remark about one of our lecturers within earshot of dozens of students (and the lecturer himself), and I was the only one to call him out on it. Even when I did so, nobody else backed me up. And again, when I say “homophobic” I really mean it: “Better keep your backs to the wall, lads” was what he actually said as the poor man walked past. Yeah, I know. This was how the world was in those days and believe me when I say that very, very few people were willing to step in and say something. At 19 years old I was already one of them and I’m proud of that.

So, the concept of labelling anyone who failed to meet the exacting liberal standards of a certain kind of Guardian journalist “a Nazi” had very much not taken off in 1992. Quite the contrary. Yet rumours abounded that we had a genuine, bona fide Nazi on campus and that he was causing trouble. I first became aware of the situation when I heard that this self-confessed Nazi had applied to speak publicly at a Union meeting and lots of people were very upset about it. From what I could gather, there was a lobby of students pushing for him to be barred: nobody wanted to hear what he had to say, and why should we have to put up with his revolting opinions being platformed and aired in our own Union? I had a considerable amount of sympathy with this view and understood the strength of reaction that his application to speak had sparked. However, after much discussion, everyone accepted that under the rules of the Union – of which this student was a member – Nazi Boy had the right to speak. Lots of people were very unhappy about it, but those were the rules.

On the day after the event, I spoke to one or two people who were present at the meeting when it happened. Apparently, the guy stood up and said his piece. Nobody shouted him down, because the decision had been made that under the rules he was allowed to speak. However, by the same token, nobody was interested in listening. His speech was not good: it was not articulate, it was not rational and it was, of course, offensive. After he sat down, nobody applauded. The meeting moved on. That was the sum total of his impact: zero. Following what turned out to be quite the non-event, the student in question did not last the year on campus: he left after a few months, and was quickly forgotten.

I am astonished at how quickly we have shifted from a committee of students in 1992, who reasoned that the right to free speech must prevail above all else – even if that meant sitting on their hands and grinding their teeth while the worst of all views were shared publicly – to so many of them believing that nobody has the right to say anything that might challenge a prevailing social narrative in 2024. Here’s the thing, kiddos – when you let people speak, they reveal the truth about themselves and their views. If those views are insane, offensive or irrelevant, perhaps it is all to the good that they are exposed for what they are. If I’m honest, I’m still not sure whether it truly was the right decision to allow a Nazi to speak in the Union, but I believe that the scenario is worth recalling, and I applaud the Union committee of 1992, who believed that the agreed democratic process was what mattered most despite the pressure that they were under to ban the guy from speaking.

We have moved from a situation in which the youngest of people were capable of grasping the dangers of curbing free speech in even the most challenging of circumstances, to one in which students refuse to even entertain a narrative which may jar with their own. Quite how these young people navigate their way through the world I struggle to understand. What a terrifying and dangerous place it must seem, when you cannot cope even with hearing some politely-spoken words you disagree with. It seems to be a frequent occurrence in many universities now, with students either refusing to platform certain speakers or protesting their very presence when they do appear. I defend anyone’s right to protest, but it seems to me that this important right is now exploited by people who simply do not wish to allow others to speak freely. Ask any student who protested the appearance of Kathleen Stock at the Oxford Union what their purpose was and I am quite sure that they will happily tell you that they wanted to drown her out, as they believed that her views were hateful.

Perhaps some students are terrified of any alternative narrative because deep down they are actually afraid that they might be persuaded by it. What if I start to believe what the other side has to say? Yet surely it says very little for the strength of anyone’s convictions if they are genuinely terrified of a conversation. I guess if you lack all moral fibre and courage then it’s easier to scream until you can no longer hear the other speaker. In that way, you also get to drown out the niggling voice inside your head: the voice that says maybe – just maybe – you’re the bad guy.

Photo by Kristina Flour on Unsplash

Call that a PhD?

You could be forgiven for thinking that Gregg Wallace’s video was the most explosive thing to happen on social media this week, but you would be wrong.

Picture the scene: a young female academic at Cambridge shares a happy picture of herself, smiling and clutching her freshly-completed PhD thesis in English literature. Ally Louks, now Dr Ally Louks, probably thought that her message of celebration that she was “PhDone” would be liked by a few and ignored by the majority. Yet at the time of writing her post has been seen by hundreds of thousands of people, and Ally has received torrents of abuse, some of which beggars belief. The whole storm has sparked outraged discussion on all sides – most of it thoroughly ignorant – about what a PhD is or should be.

Here’s the thing, for those of you that haven’t been there. A PhD is like going potholing: you wriggle down into some difficult spaces and explore the subterrain. Nobody will ever know those particular underground passages better than you, because nobody else is ever likely to go there or, indeed, even want to go there. The reason you’re awarded the PhD is because you have traversed new terrain and – in the judgement of the potholing community – you are the first to do so, or you have uncovered a sufficient number of nooks and crannies that previous potholers did not comment upon. Most of the time, you don’t find an underground palace, a glistening river of stalactites or a dazzling crystal chamber: you simply wriggle your way back up to the surface and get on with your life. Your thesis will sit on the shelf of whichever institution recognised it and – if you’re lucky – it will be consulted by a tiny handful of niche-hole specialists over the next few decades.

Personally, I blame Stephen Hawking. During his doctorate, he hit upon a leap of understanding so brilliant that it changed the direction of theoretical physics forever. Most of us don’t manage that. This does not mean that our PhDs are not worthy of the title: it simply means that most of us are – demonstrably – not geniuses like Hawking. There is a reason why Hawking has been laid to rest between Newton and Darwin: he is right up there with those two when it comes to the significance of his contribution to his field. Yet many people seem to assume that Hawking is an example of what is expected of a PhD candidate – a particularly famous example, perhaps, but an example nonetheless. In reality, most research is utterly banal and unimportant: it’s not going to shake up our understanding of the fabric of the universe.

Louks’ PhD sounds – to me – rather fun. Okay, I’m one of those wishy-washy artsy types who got a PhD in Classics, not theoretical physics, but I reckon her thesis, “Olfactory ethics: the politics of smell in modern and contemporary prose”, sounds like a more stimulating read than a huge number of the PhDs that have passed under my nose over the years (pun intended). In response to the unexpected interest in her work, Louks shared her abstract, which only made my nostrils twitch further. Her thesis explores “how literature registers the importance of olfactory discourse – the language of smell and the olfactory imagination it creates – in structuring our social world.” Her work looks at various authors and explores how smell is used in description to delineate class, social status and other social strata. I mean … fine? No? Quite why a certain type of Science Lad on the internet decided that this was a completely unacceptable thesis baffles me. Apparently, there is a certain type of aggressively practical chap who believes that exploring how things are represented in literature, and how that literature has in turn helped to shape our world, is utterly unworthy. Well, more fool them. They should read some literature. I suggest they start with Perfume by Patrick Suskind, a modern classic that is quite literally a novel about smell.

I’ll confess that the whole thing has left me feeling quite jumpy about my own thesis, which in 1999 was welcomed as an acceptable contribution to my very narrow, very obscure corner of the underground caves. Once I had seen the reaction to Louks’ abstract, I decided to re-read my own. Having done so, I concluded not only that it would sound utterly barking to the rest of the world, but that it sounded utterly barking to me! This was a field in which I was immersed at the time, but about which I have read nothing since I walked out of the room in which my viva took place.

The viva itself is something that most people do not really understand and is difficult to explain. It is not an examination. Short for viva voce, which is Latin for “with the living voice”, the viva is there in principle for the PhD candidate to demonstrate that they are the author of their own work. In practice, it is also an opportunity for the examiners to quiz the candidate and explore their hypothesis further. The examiners may have questions and it is common for them to advise corrections and amendments; often, the examiners make the passing of the thesis conditional on these amendments. Best case scenario (and one enjoyed by Ally Louks), the examiners pass your thesis with nothing more than a few pencil annotations, none of which require attention for the thesis to be accepted. Worst case scenario, they say that your thesis is a load of old hooey and that you should not – under any circumstances – re-submit it, corrected or otherwise.

The worst-case scenario is rare, and indicates a profound failure on the part of the candidate’s supervisor, who should never have allowed the submission, but it does happen. The last time I saw one of my old lecturers from my university days, he reported being fresh from a viva at which he had acted as an external examiner and had failed the thesis. This happens so rarely that I was agog. Having been so long out of the world of academia, it is impossible for me to express in simple terms the intellectual complexities that he explained were the reasons behind his decision, so I shall have to quote him directly: apologies if the language is too academic for you to follow. “Basically, it was b*****ks,” he said. “I mean, don’t get me wrong, it was kind of brilliant b*****ks: but it was b*****ks nevertheless.” That poor candidate. I ached for him. I also found myself recalling the gut-wrenching moment during which Naomi Wolf’s PhD thesis was exposed as fundamentally flawed by Matthew Sweet, live on Radio 3. If you’ve never listened to the relevant part of the interview, I highly recommend it: it is – especially for those of us who have submitted a thesis for judgement in the past – the most toe-curling listen imaginable. Wolf’s entire thesis appears to have been based on a misunderstanding of a legal term, which Sweet discovered simply by looking it up on The Old Bailey’s website. Wolf’s thesis had been passed at Oxford, an institution that would be hard to beat in terms of intellectual clout and reputation, so quite how this happened is mind-boggling and shameful.

The reaction to Louks’ thesis does, I suspect, have a great deal to do with the increasing suspicion with which academia is viewed, and in many ways I am not unsympathetic to people’s disquiet. There is, without question, a good deal of nonsense (or b*****ks, to use the technical term) talked in a lot of fields, particularly in the arts and social sciences. Yet the vitriol with which Louks was criticised has nothing to do with this. Her abstract, to anyone with even a grudging respect for the field of English literature, makes intellectual sense. No, the roasting of Louks and her work betrays a profound and depressing ignorance, as well as a nasty dose of good old-fashioned cruelty. Before people decide that an entire field of study is unworthy of merit, they should perhaps ask themselves whether there is even the tiniest possibility that they don’t know enough about it before they pounce. One can but hope that these people who value their rationality so much will next time run a more scientific test, rather than dunking the witch to see whether she floats.

Photo by Alex Block on Unsplash

Vocabulary acquisition

An essential challenge faced by students and teachers alike is the acquisition of vocabulary. I have written before on the best methods that students can employ when tackling vocabulary learning, so I do not plan to reiterate those here. What follows is rather a set of observations and musings about what we’re getting wrong in the Latin classroom when it comes to vocabulary acquisition, especially when compared to our counterparts in modern languages.

In my experience to date, supporting students in the accretion of vocabulary is a responsibility undertaken more effectively and proactively by modern language teachers than by those of us who specialise in Latin. It is possible that Latinists are under more time pressure in the curriculum and thus have no choice but to place the responsibility for vocabulary learning onto our students, but I think it more likely that we are simply less well trained in how to go about it than our colleagues in MFL. Classicists suffer from the fact that our training is somewhat broad – a qualified Classics teacher will necessarily have spread their training time across Ancient History and Classical Civilisation subjects, dramatically reducing the time that they spend focused purely on the teaching of the Latin language. I have little to no recollection of being given any significant guidance on how to help my students to develop their knowledge of vocabulary, so all my knowledge in this area has come later – through experience and through reading.

One of the many differences between the manner in which ancient languages are taught compared to modern ones is in the presentation of vocabulary to students. While modern linguists favour grouping words into themes or topics (e.g. “going to the shops” or “hobbies”), Latin teachers tend to present vocabulary in the following ways:

  1. By chapters in a text book (e.g. Cambridge Latin Course, Suburani, De Romanis or Taylor & Cullen). Sometimes these may have a loose theme, but it’s generally pretty tenuous.
  2. As one long alphabetical list (e.g. OCR GCSE or Eduqas GCSE).
  3. By part of speech. Some teachers invite students to learn the GCSE list by type of word, e.g. 1st declension nouns, 2nd declension nouns etc. 

Each of these approaches has its drawbacks, so let’s consider them one by one. First of all, let us consider learning vocabulary by textbook chapter. If one were to use Taylor & Cullen for this purpose, one would at least be learning the set vocabulary for OCR, so there is some long-term justification for the approach. The vocabulary also reflects what is being introduced in each chapter, so there is some pedagogical justification for students learning it as they go. All of that said, you wouldn’t believe how few schools are actually doing this: to date, I am not sure I have met a single student who is working systematically through the chapters of Taylor & Cullen and learning the vocabulary as they go. Some students are being tested on the chapters retrospectively, but I have not worked with any who are using the textbook as it was designed. This is most likely because Taylor & Cullen is an ab initio course, and thus the early chapters are not suitable for use with Year 10s who have studied Latin in Years 7-9. Why don’t schools use it during those years? Well, I’m assuming that its somewhat sombre presentation and lack of colour pictures put teachers off the idea of using it as a basis for KS3, when (to be frank) they are under pressure to recruit bums onto seats for KS4 or else find themselves out of a job. The upshot is that no textbook explicitly aimed at preparing students for a specific GCSE exam board is in wide use in schools.

None of the textbooks commonly used in schools at KS3 builds vocabulary that is explicitly and exclusively aimed at a particular GCSE course. While Suburani is supposedly linked to the Eduqas course, it departs from the vocabulary relevant to that course in favour of what suits its own narrative. For example, students of Suburani will be deeply familiar with the word popina as meaning “bar” (not on the GCSE list for either OCR or Eduqas, but used widely throughout the first few chapters), yet they are not introduced to the word taberna meaning “tavern” or “shop” (on the GCSE list for both boards) until chapter 12. Similar problems occur with the thematic focus of Suburani: because it focuses on the life of the poor in Rome, students are taught that insula means “block of flats”. While it does mean this, I have never seen it used in this way on a GCSE paper – the word is used exclusively by both boards in contexts in which the only sensible translation is “island”. I shall say more about the problem of words with multiple meanings later on.

Presenting words in an alphabetical list seems to be the practice used by most schools once students reach Years 10 and 11 and are embarking on their GCSE studies. Most students that I have worked with are told to learn a certain number of words from the alphabetical list and are thus tested on multiple words that have nothing in common, either in meaning or in grammatical form. One advantage of this is that students are forced to look closely at words of similar appearance but different meaning. However, several problems – worse ones, in my opinion – arise from this method. Students learning the vocabulary in alphabetical order give little thought to what type of word they are looking at (e.g. whether it is a noun or a verb) or to its morphology. This means that students do not learn the principal parts of their verbs, nor do they learn the stem changes of nouns and adjectives. This can cause considerable frustration and demotivation when students struggle to recognise the words that they have supposedly learnt when those words appear in different forms. Teachers could mitigate this by testing students on those forms, but most seem reluctant to do so. Do they think it’s too hard?

The method I used was to present the GCSE list by part of speech and invite students to learn different types of words in groups: all the 1st declension nouns, all the 2nd declension nouns etc. The advantage of this method is that it creates the opportunity to link the vocabulary to the grammar. For example, the first vocabulary learning task I used to set my Year 10s in September was to learn or revise all the 1st declension nouns (in theory they knew most of them already from KS3) and to revise the endings of the 1st declension. In the test, they were expected to give the meaning of the nouns I selected and to write out the declension’s endings. I felt (and still feel, on the whole) that this was the best approach, but it is not without its own disadvantages. Firstly, it made some learning tasks excessively onerous and others too easy: for example, the task of learning the 1st declension nouns was very easy (because most of the words were already familiar and the forms of the nouns are very simple), but the task of learning 3rd conjugation verbs was much harder (fewer of them were previously known and their principal parts are a nightmare). This meant that students were often hit with homework that turned out to be extremely difficult at what might not have been the ideal time for them. A second disadvantage was that it was impossible to give students a translation test, because one cannot create sentences out of a set of words which all belong to one category. Thirdly, and related to that point, testing according to parts of speech made it very difficult to link vocabulary learning to classroom teaching in any meaningful way: in class, we might be studying the uses of the subjunctive, and that could not necessarily be linked to the homework task that was next on the list. This disconnect between what students are learning in the classroom and the vocabulary they are invited to learn for homework is something I have come to see in recent years as a massive problem in Latin teaching. The more I think about it, the more I believe it is a fundamental problem which requires a complete curriculum re-think.

The difficulty of linking vocabulary learning to explicit classroom teaching is something that modern language teachers would probably find very puzzling: modern linguists are way ahead when it comes to tying vocabulary learning to what’s happening in the classroom and to the relevant grammar. Given this, imagine my excitement when one of my tutees shared with me that she had been presented with the OCR vocabulary list in themes! I was full of anticipation as to how her school was planning to test its students on those themes. For example, one theme might be “fighting and military language”, within which students learn nouns such as “battle” and “war” alongside verbs such as “fight” and “attack”. Call me daft, but I hoped and expected that she would be tested using some simple sentences, which would afford teachers the opportunity to observe students’ (hopefully) increasing understanding of grammar and morphology alongside their acquisition of the relevant vocabulary. Surely no teacher would have gone to the trouble of dividing up 450 words into a set of themes unless they were going to make use of some innovative testing methodologies? No? Well … actually, no. The school is testing the students on a list of words, with no link made between the meanings of those words and the learning that is going on in the classroom. I have absolutely no idea what the point of this is. Maybe somebody in the department has read somewhere that grouping by theme is a good way to classify vocabulary – and I am sure it is – but I’d place a hefty bet that there is no tangible pedagogical gain unless that learning is linked to the use of those words in sentence structures, the kind of approach favoured by Gianfranco Conti.

I said that I would come back to the issue of words with multiple meanings, and it is something I have noted with interest on my tutee’s themed list. Words with multiple meanings appear more than once across the different lists, with their meanings edited to suit the theme of each list. This is an interesting idea and I am still pondering whether or not I think it is a good one. Multiple meanings are a real menace, particularly when the most obvious meaning (i.e. the one that matches an English derivative) is the least essential. For example, on the GCSE list for both boards is the word imperium, which can mean “empire”, and all students immediately plump for that meaning as it is an obvious derivative. However, the word is more commonly used on language papers to mean “command” or “power” – it is therefore those meanings that must be prioritised when a student is learning the word. Similarly, all students need to be drilled on the fact that while imperator does come to mean “emperor” in time, it originally meant “general” and is usually used in that way on exam papers. Even worse is a nightmare word such as peto, which is listed by both boards with meanings ranging from “make for” and “head for” to “seek” and “attack”. Students really struggle to learn all of its possible meanings, and it is important to show them multiple sentences with the verb used in lots of different contexts so that they can grasp all of the possibilities: hostes castra petunt, for example, could be rendered either “the enemy attack the camp” or “the enemy head for the camp”, while auxilium peto can only sensibly mean “I seek help”.

As so often, I reach the end of my musings having criticised much and resolved little. I am thankful to be working in a one-to-one setting, in which I can support students with vocabulary learning in a proactive and detailed way – one which goes far beyond what is possible in the mainstream classroom and cannot reasonably be expected of a classroom teacher. I shall continue to ponder what I would do were I in a position to re-shape the curriculum all over again, but I fear that this would entail writing an entire textbook from scratch. Many have tried to do so, and even the textbooks that have made it to publication remain flawed: I have no conviction that I could do any better.

Photo by Olena Bohovyk on Unsplash