Delayed gratification

This week I have found myself having a very stern conversation with one of my cats. Her name is Piglet. Piglet by name, piglet by nature. The animal simply cannot help herself when it comes to food. If she had her way, she’d be the size of a house, hauling her enormous belly around like a competitor in the World’s Strongest Man. Fortunately – or unfortunately, as far as she is concerned – she has mean old me controlling her food intake.

So, Piglet and I had to have a very serious conversation about her life choices. This is a cat that was in line to receive some small pieces of chicken as a treat. See, I’m not always mean: I had even taken the pieces out of the fridge to bring them up to room temperature. Piglet, however, elected that evening to wolf down the remaining supper of our other cat, Dolly, who is currently being rather delicate about her food intake. Dolly is in the early stages of renal failure and so is on a specialist prescription diet. When my back was turned for a nanosecond, I failed to register that she had walked away from her food, and I turned around to find Piglet urgently inhaling the last scraps of Dolly’s prescription dinner.

“You could have had some chicken pieces this evening!” I admonished her. “As it is, you’ve made the choice to eat the prescription cat food, so now you’re not getting anything else.” She stared at me, unmoved and unimpressed, still cleaning her whiskers after the extra feed she had claimed for herself.

In reality, of course, the cat’s brain is not capable of understanding the point. She’s a very smart cat, but she has not yet mastered English, nor has she worked out that stealing the prescription cat food means missing out on her chicken treats. She is also – being a cat – incapable of delayed gratification, something which psychologists and the world in general like to cite as a crucial indicator of our future success as adults. Or is it?

I am quite a fan of The Studies Show, a podcast hosted by two science writers, Stuart Ritchie and Tom Chivers. In each episode, they debunk stubborn myths that persist either because of poor science or because the science has been poorly reported or interpreted (or both). They investigate how science is at the mercy of human bias like any other discipline, and explain concepts such as confounding, publication bias and collider bias (I am still struggling to grasp the last one in full). In one particular episode, they explore the experiment nicknamed “the marshmallow test”, which was hailed as a groundbreaking study into impulse control in very young children, with some quite extraordinary claims made about how the findings were linked to future success in several walks of life – in education, in financial stability, in relationships and in health.

In various tests performed on 4-year-olds at Stanford University in the late 1960s and early 1970s, psychologists offered several hundred children a choice between one or two sweet treats: they could take one treat immediately, or, if they waited for an unspecified amount of time while the psychologist left the room, they would be allowed two. Waiting times varied, but could be up to 20 minutes. One point, made hilariously by Tom Chivers during the discussion, is whether some smart four-year-olds might already have a sound understanding of the value of their own time. “You know what, one marshmallow isn’t worth 20 minutes of my time, mate!” he imagines them saying. Stuart Ritchie then ponders whether marshmallows were a significantly bigger deal in the 1970s compared to now – what kid in the mid-2020s is going to wait 15 or 20 minutes just for one extra marshmallow?

The issues with the study are many, but the most dubious are the claims extrapolated from two follow-up questionnaires, which were answered by only around 100 of the original 653 participants – meaning that roughly 85% of the candidates were not included in the two follow-up studies, which looked at the children in later life. Chivers and Ritchie also point out that the original test was confounded by the fact that different children were given different coping strategies to assist with the waiting time – for example, some were encouraged to use distraction techniques, others to focus on the end reward. This is because the original purpose of the research at Stanford was to find out which coping strategies would help children most with delaying gratification – the idea of following them up to see which children became more successful in later life came some time afterwards, which may explain why Stanford lost touch with so many of the participants.

However, it is the later follow-up studies that caused all the excitement, as they supposedly found a remarkably strong correlation between later success and the length of time that the younger children had managed to wait before receiving their reward. The claim – of course – turns out to be nonsense. The correlation only held for children who had not been offered any coping strategies to help them delay gratification, which rather raises the question of why the primary author of the study believed so strongly in the teaching of delayed gratification as a life-strategy. Far more importantly, the correlation all but disappeared in replication studies once controls were introduced for socio-economic background and prior academic success, both of which are far more obvious candidates as predictors of future attainment and overall success.

Chivers and Ritchie link the wild extrapolations taken from this particular study to similar attempts to introduce the concept of “growth mindset” in schools, another area of academic research that they take a sledgehammer to in a previous episode. I remember this particular fad very well: at the time, my school had one particular Senior Manager who had read Carol Dweck’s book Mindset: The New Psychology of Success and was a shiny, happy acolyte for the notion that the tiniest shift in rhetoric – basically, praising kids for working hard rather than for their smarts – would somehow revolutionise their success in the classroom. It may not surprise you to know that it didn’t, and that the studies in this area have since been shown to prove nothing of the sort.

This is not to say that delaying gratification is not an important skill. It is, of course, part of growing up and becoming a successful adult that one learns, to some extent, to place tasks in an order of importance and/or urgency, rather than focusing entirely on what one would most like to do in the moment. Studying for an exam, preparing for a competition or an interview, and exercising and eating the right things for the benefit of your long-term health are all common goals which require this skill. In my experience, children acquire the ability to delay their gratification at different rates: while some teenagers have fully mastered the process, others are still grappling with their motivation and find it really hard to set aside the things they enjoy the most in order to focus on something important but less interesting. One of the greatest things that schools can do, therefore, is to focus on assisting children in their ability to concentrate, as a lack of attention in class remains by far the biggest barrier to academic success for many of our most vulnerable students.

In the meantime, Piglet remains at the mercy of her desires and will no doubt continue to lunge for every tasty morsel she can find in her path. I have often said that one of the joys of keeping a cat is that they teach you how to live your life. Speaking as someone who doesn’t always remember to reward herself just for the hell of it, I find that Piglet serves as a feline reminder that sometimes making a dive for the thing you crave the most is to be recommended.

Piglet, who can only delay her gratification while sleeping

France is bacon and other misconceptions

When I was young, my father said to me: “Knowledge is power, France is bacon.” For more than a decade I wondered over the meaning of the second part and what was the surreal linkage between the two. If I said the quote to someone, “Knowledge is power, France is bacon,” they nodded knowingly. Or someone might say “Knowledge is power” and I’d finish the quote, “France is bacon” and they wouldn’t look at me like I’d said something very odd, but thoughtfully agree. I did ask a teacher what did “Knowledge is power, France is bacon” mean, and got a full 10 minute explanation of the “knowledge is power” bit but nothing on “France is bacon.” When I prompted further explanation by saying “France is bacon?” in a questioning tone, I just got a “yes”. At 12 I didn’t have the confidence to press it further. I just accepted it as something I would never understand. It wasn’t until years later I saw it written down, “Knowledge is power,” Francis Bacon, that the penny dropped.

Anonymous post on Reddit, 2011.

The ease with which such misconceptions can arise is something that all teachers should be aware of. Most likely, you can remember some of your own from childhood. For me, most memorably, it was the phrase “rich as Croesus”, which my mother used regularly. As a kid, unsurprisingly, I’d never heard of the ancient Lydian king of legendary wealth, so I heard “rich as creases.” For years I wondered what being rich had to do with having creases, or why creases were considered to be the same thing as being rich. I just put it down to one of those weird things that grown-ups say.

Much of what adults say is inherently puzzling to young children. Before we berate them for a lack of intellectual curiosity (why on earth didn’t I just ask … ?), we should remind ourselves that pretty much everything adults say or do can seem puzzling on some level when you are very young. It is not, therefore, surprising when children shrug and accept a saying, or something they are told is a truism, that makes little obvious sense: nothing makes obvious sense when you’re small.

Further, the account of the child who heard “France is bacon” illustrates the anxiety that most children carry: that they have, at best, missed something obvious or, at worst, are inherently stupid. You can feel the child’s unease as they anxiously test the waters with the various ways in which they attempt to have the saying explained to them. Even the teacher completely misses the opportunity to correct the misconception, because they did not realise where the misconception lay. This illustrates our tendency as teachers to assume that we already understand what it is that a child needs explained: in this case, the teacher assumed that the child was puzzled by the underlying message of the saying – in what sense can knowledge bring power? What the teacher actually needed to do was to ask the child why they were asking – what was puzzling them about the quotation? Had the teacher done so, the misconception would have been identified and rectified.

One of the things that I love about tutoring is the opportunity that the one-to-one setting brings to uncover such misconceptions or gaps in a child’s knowledge. This is partly because of the time and focused attention that it affords, but it is also because of the opportunity it offers a child to ask all of those “stupid” questions that they’ve been bottling up for years. Nothing brings me greater joy than a tutee who develops the confidence to interrupt me and demand an explanation for something, or to ask me a question that I did not realise they needed to ask. That’s when the relationship between the tutor and their student has really developed: when a child gains the confidence to demand the most out of their sessions.

Just recently, I was reminded how careful we need to be when assuming what a child knows. I showed my tutee the translation of a Latin poem by Catullus, which contains the metaphor “my purse is full of cobwebs”. I went in with the assumption that the child might need encouragement to grasp the metaphor, as many children do not find these as easy as you might expect. During the discussion, however, I discovered that she did not in fact know what “a purse” was. There was no chance of her understanding the metaphor until that was rectified! It had not previously occurred to me that this is a word a 16-year-old might not know: but if your family have always used the word “wallet”, or your parents carry their change in their jeans, or – as is becoming increasingly the norm – they don’t really carry cash at all, then maybe it is simply not a word you have come across. We should never, ever assume.

Misconceptions that arise from mishearings such as “France is bacon” or “rich as creases” also illustrate the essential importance of dual coding. A couple of years ago, I realised that one of my tutees was convinced that the dative case had something to do with numbers. After a couple of minutes of trying to explore where this misconception had come from, I suddenly realised what had happened: his teacher had (quite rightly) taught his class that the dative case was to be translated as “to” or “for”. My tutee, however, had heard “two” or “four”. He heard numbers instead of words, and he had been understandably confused ever since. Yet had the teacher simply written the words “to” and “for” on the board as well as saying them out loud, this misconception would have been avoided. So many people confuse dual coding with the idea of simply putting a nice picture on their handouts, or with the ridiculous belief that illustrations are essential for basic vocabulary learning. Not a bit of it. Dual coding is the process of combining words with visual stimuli, and it helps the brain to grasp a concept without misconceptions: using a visual representation of what you are explaining in written words, or writing down what you are explaining verbally.

Children will always form misconceptions, and that fact is nothing to be feared. It does, however, mean that teachers must be particularly alert to them, and to the methods most likely to resolve them or to prevent them from forming in the first place.

Photo by Daniele Levis Pelusi on Unsplash

Stress? What stress?

For various reasons, I’ve been thinking about stress. More specifically, stress relating to the work that people do. As we bed in to the holiday spell (for some, I have read, quite literally), there will be people reading this who find themselves wondering where they will find the strength to go back into work.

While everyone will experience work-related stress from time to time, it is a truth universally acknowledged that some jobs are apparently more stressful than others. This universally-accepted truth is riffed upon beautifully in an old Mitchell and Webb sketch, which I won’t link to because it gets a bit post-watershed towards the end. The scenario drawn is one partner coming home from a tough day at work as a paediatrician, working with sick and dying children; the running gag is his earnest desire to reassure his partner, whose job entails tasting new products at an ice-cream factory, that their careers are both equally important and equally pressurised. “Just because I’m a paediatrician dealing with severely ill children, doesn’t mean that you can’t have a tough day tasting ice cream,” he says.

People have wildly varied takes on the levels of stress that they assume come with classroom teaching. Some people seem irrevocably wedded to the idea that teachers are work-shy layabouts who finish at 4.00pm on the days that they do work, and luxuriate in an almost unlimited supply of holiday time when they don’t. I lost count of the number of times someone hurled the “long holidays” line at me as if it were a brilliant gotcha. After a while, I used to hurl it back. “Teaching is a fantastic job,” I would say. “Did you know that there is currently an enormous drive to get more people into teaching? Given how convinced you are of the benefits, shall I send you a link to the courses that are recruiting? You even get paid to train!” That usually shut them up.

There have always been people who think that teaching’s a breeze. There are plenty of others who believe that it is horribly stressful. At times, they have been right. While the average classroom teacher will not find themselves in charge of a multi-million pound budget, nor in a position where they are hiring and firing, nor indeed presenting their work to a roomful of demanding CEOs, I’d like to see those same CEOs try their hand at managing a roomful of Year 10s on a hot afternoon when there’s a wasp in the room.

Let’s be honest: my subject, in the grand scheme of things, is relatively unimportant. While I can bang the drum of what A Good Thing Latin is for all students, let’s not be silly about this: whether or not a student attains a respectable grade in their Latin GCSE is not going to affect their life-chances (unless their life-plan is to become a Professor of Classics, and even then there are ways around that particular problem). However, most Latinists who work, as I did, in the state sector will find themselves expected to earn their keep by offering at least one other mainstream subject. For me, that was English. As a result, I have found myself solely responsible for the GCSE English grades of several cohorts. This has included sets where there was an enormous focus on what used to be the C/D borderline, and sets whose chances of making it to that borderline were considered slim. This, in very real terms, meant that I was directly responsible for a student’s life chances.

I am not being over-dramatic. Whether a child attains a pass grade in both English and Maths will shape their destiny in ways that few people outside education fully appreciate: a child who does not attain their GCSE English and Maths is largely condemned to a life on minimum wage. Of course, there are exceptions, including many successful entrepreneurs who take pride in citing their scholarly failures as a badge of honour. I’m glad for them that they overcame this hurdle, but a hurdle it is, and one which proves impossible for the majority to overcome.

I have never cried more tears of joy than when my students who had been classified as unlikely to pass managed to do so. For them, it quite literally meant the difference between poverty and a fighting chance. These kids, by the way, fought me every step of the way, and if they’d had their way they never would have sat the exam in the first place. That, I would argue, is a considerable pressure, one faced by thousands of teachers across the country every year: helping kids to get over a barrier while they do everything in their power to remain behind it.

Another factor which many people fail to appreciate is the number of safeguarding concerns that the average teacher is exposed to during their career. I never specialised in pastoral care and did no training in safeguarding beyond that which is expected of anyone working with young people, yet in my time I came across cases of neglect, of child sexual exploitation, of child criminal exploitation, of illegal drug use and more besides. On the penultimate day of my 21 years at the chalkface, I became aware of what I feared might be a case of FGM and was urgently summoning Designated Safeguarding Leads to my classroom for advice, all while maintaining a calm demeanour and continuing to run the classroom and teach my lessons as if nothing were afoot. This is the kind of thing that teachers do every day, and I am not sure other people realise it. We don’t talk about it much, partly because it’s not appropriate, but partly because it is – or has become – the norm. It is not unusual for teachers to be working with children who are experiencing genuine trauma; it is not unusual to be painfully aware of some deeply troubling circumstances that a child may be experiencing at home.

For most of my career, I loved my job. I also considered it considerably less stressful than the jobs of more high-powered friends, who managed large budgets or were responsible for people’s livelihoods. Yet sometimes I would remind myself that I too was, in many ways, responsible for people’s livelihoods. A teacher can shape someone’s future in unimaginable ways, and their influence – for better or for worse – can dictate which doors are open and which ones are closed in the future. If you are a teacher, never underestimate that power.

Photo by Stormseeker on Unsplash

An actual Nazi on campus?

It’s been on my mind to write about this for a while, but I was waiting for the right trigger in current events. This week, news has broken that a student at Leeds University has been suspended from her work at the student radio station and investigated by the Students’ Union for, allegedly, “not acting in a duty of care,” putting the “health and safety” of members at risk, not “upholding the values” of Leeds Student Radio and the Student Union, and “bringing the reputation of the University, the [Student Union, or Leeds Student Radio] into disrepute.”

I’d already had some online contact with Connie Shaw, as she seemed to me a very impressive young woman who had been treated quite outrageously by her university, and I sent her a message to that effect. Connie was interrogated by the Union about her “gender critical views” (which are protected in law), and it seems pretty clear that the apparent complaints about her “conduct” arise from the fact that she has launched a podcast on which she interviewed Graham Linehan, Andrew Gold and Charlie Bentley-Astor. These are all people who have had personal experiences and/or hold views that do not align with the prevailing narrative on a typical university campus these days, so Connie has found herself in a whole heap of trouble. Unfortunately for Leeds, Connie is not somebody to be pushed around or silenced, and her predicament has now been reported in the national press, so many more people are aware of what has happened to her.

I wish my recollections were clearer for the situation I wish to contrast this with, but when I was at university I really was not involved in Union politics. I made sure to vote for representatives, as I have always believed that voting is important. One of the things that has driven me absolutely wild over the many years I have spent signed up to various Unions is that the average member rarely votes. The number of conversations I have had with people who bemoan the fact that their Union committee is dominated by political zealots while admitting that they don’t bother to vote makes me want to bash my head against the wall. I will point out until the end times that the reason so many Unions are dominated by representatives with extreme or bizarre views is that people with extreme or bizarre views get off their butts and run for office, and people who support those views get off their butts and vote for them. The problem is rarely the extreme or bizarre views themselves (which are not held by the vast majority of Union members); it is the apathy of the majority which allows them to thrive. So, yes, I always voted. My only other involvement was as a volunteer for the Nightline service, a telephone support line manned by students and modelled on the Samaritans. But that was it. I didn’t go to hustings and I wasn’t involved in the day-to-day drama of Union politics.

Despite my lack of involvement, even I managed to hear about the fact that we had a Nazi on campus in 1992. “Nazi” is an over-used word these days and Connie Shaw has joked about being called “a Nazi” by those who disagree with her. It is beyond doubt that, in the current climate, this ridiculous insult is regularly rolled out by people on the political left when they don’t like what somebody else is saying. But this was university in 1992: there were no websites, no chat rooms, no social media, no hashtags and no mobile phones. We used to leave a note on our doors to tell friends where to find us. These were different times in every sense: I recall hearing another student making an openly homophobic remark about one of our lecturers within earshot of dozens of students (and the lecturer himself), and I was the only one to call him out on it. Even when I did so, nobody else backed me up. And again, when I say “homophobic” I really mean it: “Better keep your backs to the wall, lads” was what he actually said as the poor man walked past. Yeah, I know. This was how the world was in those days and believe me when I say that very, very few people were willing to step in and say something. At 19 years old I was already one of them and I’m proud of that.

So, the concept of labelling anyone who failed to meet the exacting liberal standards of a certain kind of Guardian journalist “a Nazi” had very much not taken off in 1992. Quite the contrary. Yet rumours abounded that we had a genuine, bona fide Nazi on campus, and that he was causing trouble. I first became aware of the situation when I heard that this self-confessed Nazi had applied to speak publicly at a Union meeting and that lots of people were very upset about it. From what I could gather, there was a lobby of students pushing for him to be refused: nobody wanted to hear what he had to say, and why should we have to put up with his revolting opinions being platformed and aired in our own Union? I had a considerable amount of sympathy with this view and understood the strength of reaction that his application to speak had sparked. However, after much discussion, everyone accepted that under the rules of the Union – of which this student was a member – Nazi Boy had the right to speak. Lots of people were very unhappy about it, but those were the rules.

On the day after the event, I spoke to one or two people who were present at the meeting when it happened. Apparently, the guy stood up and said his piece. Nobody shouted him down, because the decision had been made that under the rules he was allowed to speak. However, by the same token, nobody was interested in listening. His speech was not good: it was not articulate, it was not rational and it was, of course, offensive. After he sat down, nobody applauded. The meeting moved on. That was the sum total of his impact: zero. Following what turned out to be quite the non-event, the student in question did not last the year on campus: he left after a few months, and was quickly forgotten.

I am agog at how quickly we have shifted from a committee of students in 1992, who reasoned that the right to free speech must prevail above all else – even if that meant sitting on their hands and grinding their teeth while the worst of all views were shared publicly – to so many students in 2024 believing that nobody has the right to say anything that might challenge a prevailing social narrative. Here’s the thing, kiddos: when you let people speak, they reveal the truth about themselves and their views. If those views are insane, offensive or irrelevant, perhaps it is all to the good that they are exposed for what they are. If I’m honest, I’m still not sure whether it truly was the right decision to allow a Nazi to speak in the Union, but I believe that the scenario is worth recalling, and I applaud the Union committee of 1992, who believed that the agreed democratic process was what mattered most, despite the pressure they were under to ban the guy from speaking.

We have moved from a situation in which the youngest of people were capable of grasping the dangers of curbing free speech in even the most challenging of circumstances, to one in which students refuse to even entertain a narrative which may jar with their own. Quite how these young people navigate their way through the world I struggle to understand. What a terrifying and dangerous place it must seem, when you cannot cope even with hearing some politely-spoken words you disagree with. It seems to be a frequent occurrence in many universities now, with students either refusing to platform certain speakers or protesting their very presence when they do appear. I defend anyone’s right to protest, but it seems to me that this important right is now exploited by people who simply do not wish to allow others to speak freely. Ask any student who protested the appearance of Kathleen Stock at the Oxford Union what their purpose was and I am quite sure that they will happily tell you that they wanted to drown her out, as they believed that her views were hateful.

Perhaps some students are terrified of any alternative narrative because deep down they are actually afraid that they might be persuaded by it. What if I start to believe what the other side has to say? Yet surely it says very little for the strength of anyone’s convictions if they are genuinely terrified of a conversation. I guess if you lack all moral fibre and courage then it’s easier to scream until you can no longer hear the other speaker. In that way, you also get to drown out the niggling voice inside your head: the voice that says maybe – just maybe – you’re the bad guy.

Photo by Kristina Flour on Unsplash

Call that a PhD?

You could be forgiven for thinking that Gregg Wallace’s video was the most explosive thing to happen on social media this week, but you would be wrong.

Picture the scene: a young female academic at Cambridge shares a happy picture of herself, smiling and clutching her freshly-approved PhD thesis in English literature. Ally Louks, now Dr Ally Louks, probably thought that her message of celebration that she was “PhDone” would be liked by a few and ignored by the majority. Yet her post has, at the time of writing, been seen by hundreds of thousands of people, and Ally has received torrents of abuse, some of which beggars belief. The whole storm has sparked outraged discussion on all sides – most of it thoroughly ignorant – about what a PhD is or should be.

Here’s the thing, for those of you who haven’t been there. A PhD is like going potholing: you wriggle down into some difficult spaces and explore the subterrain. Nobody will ever know those particular underground passages better than you, because nobody else is ever likely to go there or, indeed, even want to go there. The reason you’re awarded the PhD is that you have traversed new terrain and – in the judgement of the potholing community – you are the first to do so, or you have uncovered a sufficient number of nooks and crannies that previous potholers did not comment upon. Most of the time, you don’t find an underground palace, a glistening river of stalactites or a dazzling crystal chamber: you simply wriggle your way back up to the surface and get on with your life. Your thesis will sit on the shelf of whichever institution recognised it and – if you’re lucky – it will be consulted over the next few decades by a handful of niche-hole specialists whose number you could count on one hand.

Personally, I blame Stephen Hawking. During his doctorate, he hit upon a leap of understanding so brilliant that it changed the direction of theoretical physics forever. Most of us don’t manage that. This does not mean that our PhDs are not worthy of the title: it simply means that most of us are – demonstrably – not geniuses like Hawking. There is a reason why Hawking has been laid to rest between Newton and Darwin: he is right up there with those two when it comes to the significance of his contribution to his field. Yet many people seem to assume that Hawking is an example of what is expected of a PhD candidate – a particularly famous example, perhaps, but an example nonetheless. In reality, most research is utterly banal and unimportant: it’s not going to shake up our understanding of the fabric of the universe.

Louks’ PhD sounds – to me – rather fun. Okay, I’m one of those wishy-washy artsy types who got a PhD in Classics, not theoretical physics, but I reckon her thesis, “Olfactory ethics: the politics of smell in modern and contemporary prose”, sounds like a more stimulating read than a huge number of PhDs that have passed under my nose over the years (pun intended). In response to the unexpected interest in her work, Louks shared her abstract, which only made my nostrils twitch further. Her thesis explores “how literature registers the importance of olfactory discourse – the language of smell and the olfactory imagination it creates – in structuring our social world.” Her work looks at various authors and explores how smell is used in description to delineate class, social status and other social strata. I mean … fine? No? Quite why a certain type of Science Lad on the internet decided that this was a completely unacceptable thesis baffles me. Apparently, there is a certain type of aggressively practical chap who believes that exploring how things are represented in literature, and how that literature has in turn helped to shape our world, is utterly unworthy. Well, more fool them. They should read some literature. I suggest they start with Perfume by Patrick Suskind, a modern classic that is quite literally a novel about smell.

I’ll confess that the whole thing has left me feeling quite jumpy about my own thesis, which in 1999 was welcomed as an acceptable contribution to my very narrow, very obscure corner of the underground caves. Once I had seen the reaction to Louks’ abstract, I decided to re-read my own. Having done so, I concluded not only that it would sound utterly barking to the rest of the world, but that it sounded utterly barking to me! This was a field in which I was immersed at the time but about which I have read nothing since I walked out of the room in which my viva took place.

The viva itself is something that most people do not really understand, and it is difficult to explain. It is not an examination. Short for viva voce, Latin for “with the living voice”, the viva exists in principle so that the PhD candidate can demonstrate that they are the author of their own work. In practice, it is also an opportunity for the examiners to quiz the candidate and explore their hypothesis further. The examiners may have questions, and it is common for them to advise corrections and amendments; often, they make the passing of the thesis conditional on these amendments. In the best-case scenario (the one enjoyed by Ally Louks), the examiners pass your thesis with nothing more than a few pencil annotations, none of which require attention for the thesis to be accepted. In the worst-case scenario, they say that your thesis is a load of old hooey and that you should not – under any circumstances – re-submit it, corrected or otherwise.

The worst-case scenario is rare and indicates a profound failure on the part of the candidate’s supervisor, who should never have allowed the submission; but it does happen. The last time I saw one of my old lecturers from my university days, he reported being fresh from a viva at which he had acted as an external examiner and had failed the thesis. This happens so rarely that I was agog. Having been so long out of the world of academia, it is impossible for me to express in simple terms the intellectual complexities that he explained were the reasons behind his decision, so I shall have to quote him directly: apologies if the language is too academic for you to follow. “Basically, it was b*****ks,” he said. “I mean, don’t get me wrong, it was kind of brilliant b*****ks: but it was b*****ks nevertheless.” That poor candidate. I ached for him. I also found myself recalling the gut-wrenching moment when Naomi Wolf’s PhD thesis was exposed as fundamentally flawed by Matthew Sweet, live on Radio 3. If you’ve never listened to the relevant part of the interview, I highly recommend it: it is – especially for those of us who have submitted a thesis for judgement in the past – the most toe-curling listen imaginable. Wolf’s entire thesis appears to have been based on a misunderstanding of a legal term, which Sweet discovered simply by looking it up on the Old Bailey’s website. Wolf’s thesis had been passed at Trinity College, Oxford, an institution that would be hard to beat in terms of intellectual clout and reputation, so quite how this happened is mind-boggling and shameful.

The reaction to Louks’ thesis does, I suspect, have a great deal to do with the increasing suspicion with which academia is viewed, and in many ways I am not unsympathetic to people’s disquiet. There is, without question, a good deal of nonsense (or b*****ks, to use the technical term) talked in a lot of fields, particularly in the arts and social sciences. Yet the vitriol with which Louks was criticised has nothing to do with this. Her abstract, to anyone with even a grudging respect for the field of English literature, makes intellectual sense. No, the roasting of Louks and her work betrays a profound and depressing ignorance, as well as a nasty dose of good old-fashioned cruelty. Before people decide that an entire field of study is unworthy of merit, they should perhaps ask themselves whether there is even the tiniest possibility that they don’t know enough about it before they pounce. One can but hope that these people who value their rationality so much will next time run a more scientific test, rather than dunking the witch to see whether she floats.

Photo by Alex Block on Unsplash

Fulfilling your destiny

“Life is like a game of cards. The hand you are dealt is determinism; the way you play it is free will.”

Jawaharlal Nehru

Currently, I am obsessively plugged in to an audiobook, the latest release from my favourite author, Liane Moriarty. Moriarty writes what is often scathingly referred to as “chick lit”: a genre which at its worst can be undeniably vacuous, but no more so than the two-dimensional thrillers churned out by authors marketed to men. The withering contempt with which “chick lit” is viewed says a lot more about how society treats the everyday lives and concerns of women than it does about this particular genre of popular fiction.

It is undeniable, although perhaps a little depressing, that Moriarty is an author unlikely to be read by vast numbers of male readers. Her stories revolve around people – mainly suburban women – and the thoughts inside their heads. Often there is an unfolding plot, but the focus is on the development of character and relationships rather than on action or suspense. Moriarty is an absolute master of the genre and writes with an effortless charm that belies her talent; the best authors make it look easy when it isn’t. It’s a great shame that more men aren’t interested in some of the things which interest women, a truth whose reasons I have pondered on and off. I speak as someone who has read quite broadly and has flirted with books categorised in modern times as “lad lit”: I am a huge fan of Martin Amis, and if you haven’t read David Baddiel’s forays into novel-writing in this genre then you should – they are annoyingly good. So if I, as a woman, can enjoy books written from a male perspective and read by men, I find it somewhat irksome that so few men have the desire to show any kind of interest in the fiction favoured by women. Anyway, I digress.

While many of Moriarty’s books (perhaps most famously Big Little Lies) focus on the lives of suburban women, some are intricately plotted and follow a complex set of characters, all of whom cross paths in various ways and with a myriad of consequences. Because of this, I was greatly surprised to hear the author reveal in an interview that she writes without a plan. Her previous novel, Apples Never Fall, followed the tensions and anguish within a family whose matriarch has disappeared: we spend most of the novel wondering what has happened to this character (including whether she has merely walked out of her life or has been horribly murdered by someone within it), and Moriarty reports that she too spent much of her writing time wondering the same thing. She had not, by her own account, decided what had actually happened to this key character when she began to write the book. She started with the idea of the disappearance and discovered the truth behind it along with her characters. It is perhaps this very unconventional approach to plotting that enables her to write with such authenticity – she’s not dropping hints or planting red herrings in relation to the real outcome, for she has no idea what that outcome will eventually be.

I am around one third of the way through Moriarty’s latest and am gripped as ever by her writing. Here One Moment is perhaps her most ambitious novel yet, as it circles around the ideas of free will and destiny. In summary, the scenario is that a group of people on a flight from Hobart to Sydney are each pointed at by a woman on board and told the supposed time and manner of their death. Some passengers are given what amounts to welcome news by most people’s standards (heart failure, age 95); others – inevitably – are told that they will die very young. Some are even told that their death will be the result of violence or self-harm. The rest of the novel is about the fall-out from this thoroughly alarming and unscheduled piece of in-flight entertainment.

One of the ideas explored in the novel is the impact that such an experience might have, not only on the feelings of those receiving the predictions but on their actions too. One of the passengers pays a visit to another “psychic” after the flight, and this “psychic” points out that his client will not be the same person after the reading as he was before it: whatever the psychic says will make the client act differently, and that in turn may have an impact on the outcome of his life. Moriarty refers constantly to the idea of chaos theory throughout her writing – the idea that one small event can ripple outwards and have huge effects elsewhere. At the point in the novel where I am right now, a mother who has been told that her baby son will die by drowning while still a child has elected to take him to swimming lessons. He takes to the lessons like the proverbial duck to water, and it becomes clear that he is going to become a huge lover of swimming. As readers, we now sit with our hearts in our mouths and await the inevitable: will the mother’s decision to take her child to swimming lessons, sparked solely by the psychic’s so-called prediction, end up leading to the death of her child in the future?

The same thought experiment was run by the Greek playwright Sophocles almost 2,500 years ago. He wrote what I would argue is perhaps the most influential work of literature ever produced, the tragedy Oedipus Rex. Most people know the name “Oedipus” only as a result of Freud’s early 20th-century ramblings about motherhood and sexual repression; very few people have any idea what a frankly brilliant and chilling story that of Oedipus was when it was written. It is emphatically not a story about motherhood, nor is it a story about sexual repression; to be honest, I don’t think I can ever forgive Freud for making it so. Oedipus Rex is a story about destiny, about free will, and about the extent to which we have control over either of those things. If you don’t know the story, it can be summarised as follows …

In ancient Greece, a king and queen are horrified to be told by an oracle that their baby son will grow up to murder his father and marry his mother. Terrified by this ghastly prediction, they send the baby away to be exposed on the hillside to die. The kindly old shepherd charged with the unhappy task cannot quite bring himself to do the dreadful deed, so he ends up passing the baby to a ruler and his wife in a far-distant land who are childless, and they bring the baby up as their own. The baby is named Oedipus. He has no idea that he is adopted.

When Oedipus grows up, like all curious young men he too consults the oracle and asks his destiny. The oracle tells him that he is destined to kill his father and marry his mother. Horrified, he does the only sensible thing: he removes himself from his family home and goes off on his travels, thus eliminating any possible risk of somehow murdering his father and marrying his mother. Oedipus believes that he has taken control: he is the master of his own destiny and he has cheated the oracle. Trouble is, remember … he doesn’t know he is adopted.

Several months into his lonely travels, Oedipus gets into an altercation on the road with an arrogant older man who tries to tell him what’s what. Long story short, Oedipus does the only thing any decent red-blooded young male would do: he kills the old fool. Afterwards, he continues on his travels and eventually comes to a kingdom which is in a bit of trouble because it’s being harassed by a nasty monster. Clever Oedipus defeats the monster by solving its riddle and – wouldn’t you know it – it turns out that the king of this particular dominion has recently died and they’re in need of a chap to take over. What a stroke of luck! Oedipus marries the widowed queen – who is, granted, a little older than him, but still young enough to bear children – and becomes King of Thebes. The rest, as they say, is a truly horrible history.

The whole point of Oedipus’ story is exactly the thought experiment that Moriarty is playing out in her novel. To what extent does a sense of destiny itself predetermine our actions? To what extent do people inevitably fulfil the path that they are told lies in front of them? It is easy to point out that if the oracle had not said what it said – on either occasion – the story of Oedipus would not have unfolded as it did. In the ancient world, the story was taken as a morality tale about man’s arrogance: humans are convinced that they can outwit the gods and cheat their destiny, and that arrogance begins and ends with asking the question. If nobody had asked, would nothing have happened? Does the asking trigger the event?

It is easy to assume that these big philosophical questions don’t affect our lives on a day-to-day basis, but in fact this loop of thought is inescapable and resonates in daily life. During my career, a trend came and (thankfully) went of sharing what were laughably called “predicted grades” with students. These grades were not teacher predictions (although teachers are indeed asked to make such psychic predictions, and that nightmare continues) but were based on a crushing weight of data that looks at “people like Student A” and attempts to make a mathematical prediction about how “a person like Student A” is most likely to perform in an exam. All sorts of data get included in the mix, from prior academic performance to socio-economic background. The happy news that a bunch of data analysis that hardly anybody fully understands “predicts” that Student A is likely to get a Grade 3 or below was – until alarmingly recently – shared with Student A. What an absolute travesty. I will never forgive the system for sitting a child down and telling them that the computer says they’re likely to fail. Likewise, I have seen children who are “predicted” a line of top grades spiral out of control under the pressure. For heaven’s sake, stop telling kids what “the data” (our new name for the divine oracle) says about their destiny. It’s a seriously grotesque thing to do.
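For readers curious what sits behind such “predictions”, the underlying logic can be caricatured in a few lines of code. This is a deliberately crude sketch with invented names and numbers – the real systems schools buy in use far more elaborate statistical models – but the principle of averaging the outcomes of “people like Student A” is much the same:

```python
# A deliberately crude caricature of data-driven "predicted grades":
# average the eventual grades of past pupils with a similar profile.
# All figures here are invented for illustration.

historical = [
    # (prior attainment score, disadvantaged flag, eventual GCSE grade)
    (95, True, 3), (97, True, 4), (96, True, 3), (99, False, 5), (94, True, 3),
]

def predict(prior: int, disadvantaged: bool) -> float:
    """Mean grade of past pupils whose profile resembles this student's."""
    similar = [grade for p, d, grade in historical
               if abs(p - prior) <= 2 and d == disadvantaged]
    return sum(similar) / len(similar)

print(predict(96, True))  # 3.25 - the "computer says Grade 3" moment
```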

For similar reasons, I know parents who are understandably jumpy about their children being labelled as anything. Who doesn’t remember, well into middle age, having “he’s shy” or “she’s anxious” said over their head while they were going through an entirely normal phase of being wary of strangers? Before you know it, the label of “shy” or “anxious” or whatever the grown-ups have decided befits you becomes you. I am absolutely in support of my friends who will not have their children referred to in this way: if history teaches us anything, it’s that people tend to fulfil their destiny. So be careful what path you pave.

Photo by Johannes Plenio on Unsplash

Reading their minds?

Classroom teachers are expected to be psychics. According to the Teachers’ Standards, which are many and complex, every classroom teacher must not only understand how children think and learn but must know when and how to differentiate appropriately, using approaches which enable pupils to be taught effectively; they must have a secure understanding of how a range of factors can inhibit pupils’ ability to learn and how best to overcome these; they must have a clear understanding of the needs of all pupils, including those with special educational needs, those of high ability, those with English as an additional language and those with disabilities; they must be able to use and evaluate distinctive teaching approaches to engage and support all of these different young people … and all of this must happen while there are 30 of these diverse learners in the same room.

Much of what is demanded of the average classroom teacher is impossible. I say this not to be a doom-monger or to preach the acceptance of mediocrity – far from it. Throughout my career I strived to be the best teacher I could possibly be. Yet in reality, we cannot be all things to all men and we cannot possibly fathom the inner workings of every single one of the minds that are sat in front of us.

I have written numerous times on the differences between classroom teaching and tutoring, but this week something hit me that had not occurred to me before. While I have always been aware that one-to-one sessions give me an insight into the misconceptions each child may have, and thus the ability to address those, it had not previously dawned on me that tutoring a large number of students in the way that I do has given me a broader insight into how children think and learn – one that I could not have gained as a classroom teacher. Working one-to-one means that I get to listen to how my students think and reason in real time.

It is often said by modern cognitive scientists that education has placed too much focus on the diversity of learners in the past. While every parent likes to think that their child has a unique set of needs that can only be met in a unique way, the reality is that there is far more that unites young learners than divides them. We now know a great deal about how memory works and how best to support students with the learning process: this is not to say that some will not find it harder than others and require more time and effort, but broadly speaking the approaches that work for those with special educational needs work well for the mainstream classroom as a whole. If you tailor your classroom towards providing the best learning support for your neediest learners, everyone benefits.

Working one-to-one with the huge number of students that I do has furnished me with a real insight into how students tackle the process of translating and what the common pitfalls are when they are doing so. It has also provided some perhaps surprising insights into which constructions children tend to be able to translate on instinct, without a full grasp of the underlying grammar. This information is actually gold dust, and it links to what I blogged about last week – the necessity of designing a curriculum around the learners sat in front of you, in relation to the time you have available as well as the end goal when it comes to examinations. I have realised in the last year or two that there are some complex constructions on which many classroom teachers tend to focus too much time, to the detriment of the basics, when in fact many students could translate those constructions without difficulty so long as they had a grasp of their verb and noun forms and their vocabulary.

Working one-to-one has given me more of an insight into what doesn’t need to be taught as well as what does. While most of my students have gaping holes in their basic knowledge, many of them have spent an unnecessary amount of time being taught things that they do not need to understand in detail. Sometimes, a construction has been so over-taught that a child has been left in complete confusion; their natural grasp of it, one which they would in all likelihood have stumbled upon if given the right basic tools and a decent dose of confidence, has been lost forever.

I am still pondering what to do with these insights as it occurs to me that they could quite honestly be of enormous use to any classroom teacher who is willing to listen. For now, my understanding of how children go about acquiring the skills that they need to do well in Latin is ever-increasing and remains endlessly fascinating to me.

Photo by Danaisa Rodriguez on Unsplash

The best use of curriculum time

“Time is the most valuable thing a man can spend.”

Theophrastus

On Wednesday, I had my regular fortnightly meeting with the new teacher who has taken over the teaching of Latin in the school where I used to work. This teacher is an ECT (in her first year of teaching), and while she will of course have a professional, in-house mentor to oversee her development within the school, the Head was conscious of, and rightly concerned about, the fact that she will have no subject expert in the building to offer her support. That’s where I come in. This week, I found my young protégé in a bit of a flap about one particular part of the language curriculum, and since reflecting on our time together I realise that I was less helpful than I could have been. Rather than letting our conversation linger on the grammar at a granular level, what I needed to do was get her to reflect on which aspects of the curriculum actually deserve the most time. Next time I see her, I shall do so.

One of the most frustrating things about leaving teaching is at last having the time to see and understand how one could completely re-write the curriculum to reflect more accurately the way that the exam papers are written. What those outside the profession will find difficult to understand is that it is left in the hands of teachers – often new and inexperienced ones – to design an entire curriculum to prepare for an exam they did not write. No real guidance is shared by the exam boards (and on the odd occasion when some guidance is offered, it is usually either unrealistic or unworkable in some or most settings). What we really need is for exam-setters to work alongside schools to build an appropriate curriculum, but that’s never going to happen.

As we talked, my instincts were telling me that this teacher was becoming unnecessarily bogged down by her worries about a particular construction and was planning to spend a huge amount of time on it. I need to make sure that she does not do this. The reason? Well, I have just reviewed the 8 separate past and specimen papers that we have from the exam she is entering her students for, and this particular construction appears either once or twice in each language paper. Around half of the time, its appearance is supported by comprehension questions, which guide the candidate towards the correct interpretation. The rest of the time, the examples used are almost exclusively ones which most students would be able to translate on instinct, even if they had never been taught the existence of this particular construction. Compare that to another kind of construction, which most teachers skim over very briefly, but which in fact appears multiple times in every single exam paper. Which would you focus on? Sounds obvious now, doesn’t it? But you wouldn’t believe how few teachers go through this thought-process when designing their curriculum and planning their lesson time.
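For anyone tempted to repeat this exercise with their own exam board, the tallying is mechanical enough to script. Here is a minimal sketch in Python – the paper names and constructions are invented for illustration, and a real tally would of course come from reading the papers themselves:

```python
from collections import Counter

# Invented data for illustration: which constructions appeared in which
# past or specimen paper. A real tally would come from reading the papers.
papers = {
    "Specimen 1": ["indirect statement", "ablative absolute", "purpose clause"],
    "2022": ["indirect statement", "purpose clause"],
    "2023": ["indirect statement", "ablative absolute", "gerundive"],
}

# Count the number of papers in which each construction appears.
counts = Counter(c for constructions in papers.values() for c in constructions)
for construction, n in counts.most_common():
    print(f"{construction:20s} appears in {n} of {len(papers)} papers")
```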

Having made the switch from the classroom to private tutoring, I am in contact with dozens of students from many different types of school. Something I have come to realise is that almost all teachers over-teach the aspects of the curriculum that they believe to be difficult. It is not that their beliefs are incorrect; what they get wrong is the amount of curriculum time that they dedicate to these concepts as a result of their relative complexity. It’s a common assumption in education that one must spend more time on something because it is difficult. In fact, this must be weighed against three crucial realities: firstly, the nature, knowledge and curriculum history of the students that we have in front of us; secondly, the amount of time that we actually have with them; thirdly – and perhaps most crucially – the relative weighting that this difficult concept carries when it comes to final outcomes. This requires an understanding of how much, how often and in how much depth that difficult concept is tested, as well as how many marks that testing carries. Once you start trying to balance this equation, it can lead to some surprising conclusions, which might not seem obvious to anyone but the most experienced in curriculum design.

If a concept or construction is so difficult that its full understanding will require multiple hours of curriculum time, yet that very construction is only likely to add up to three marks on one paper, which converts to 1.5% of the student’s overall score … is that concept actually worth teaching at all? It’s something to think about, at least. Perhaps one could teach it in a very condensed form, teach some broad strategies that work in the majority of cases, and leave it at that. Certainly, what one should not do is spend hours and hours of precious curriculum time trying to bring students to the point of full understanding whilst neglecting other concepts which we might consider simpler but which appear multiple times on the paper and are thus integral to success. That simply isn’t a sensible approach, given the huge constraints that all schools face when it comes to curriculum time.
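The balancing act can even be put into rough numbers. Below is a toy calculation, again with invented figures (the 200-mark total is simply what makes three marks equal 1.5%, as in the arithmetic above), comparing “marks gained per hour of teaching” across two hypothetical constructions:

```python
# A toy "marks per teaching hour" comparison. All figures are invented;
# real ones would come from past-paper tallies and your own timetable.

TOTAL_MARKS = 200  # implied by the arithmetic above: 3 marks = 1.5%

constructions = {
    # name: (hours needed to teach it fully, marks it is likely to earn)
    "hard, rarely-tested construction": (5.0, 3),
    "simple, frequently-tested construction": (2.0, 10),
}

for name, (hours, marks) in constructions.items():
    share = 100 * marks / TOTAL_MARKS
    print(f"{name}: {marks} marks ({share:.1f}% of total), "
          f"{marks / hours:.1f} marks per teaching hour")
```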

The tendency for teachers to labour what’s difficult is something which I share openly with my tutees. I am very careful not to criticise or undermine the school’s curriculum; I simply explain that it is natural for teachers to spend lots of time on the things that they know are difficult, as they are setting the bar high for their students. Children of the age that I work with are perfectly capable of understanding that this might be a noble and understandable approach, but perhaps not the best strategy to help them if they are struggling with the basics. Even the most able students, who are aiming at the highest grades, can be reassured by the knowledge that the most challenging aspects of the curriculum are of less importance than they perhaps thought; it actually frees them up to grapple with those aspects, once they have been released from the anxiety that full understanding is absolutely essential for success. Knowing that you’re working on something that might gain you an extra mark or two is very freeing, and it enables the students who are aiming high to make sensible decisions about how to spend their own time, which is often very stretched.

In Latin, it is not only the language paper that requires this frankness of approach and a realistic analysis of where one’s time should be directed. I have written before about the extent to which teachers over-teach the stylistic analysis of the literature texts, when the overwhelming majority of marks in the exam are gained through students simply knowing the text off by heart. I emphasise this over and again to the students I work with, many of whom come to me because they are scoring very low marks in this aspect of the examination. Students can score at least 80% simply by knowing the text like the back of their hand, so this should be the overwhelming focus of the lesson.

Despite this, I have so far come across only one school where I would say this is happening – where the focus is on drilling and on making it clear to students that they must be learning the text in detail. I shall not name the school, but I will say that it is a very high-achieving one, where the Latin department produces results of almost exclusively 8s and 9s at GCSE every single year. This is not because the school avoids the trickiest concepts – there is no way a student could score a Grade 9 without a decent score in the style questions – but because it understands how to balance its curriculum and focus its efforts on what gains students the biggest advantage. The emphasis must be on knowledge, with the complex skills supplementary to that. The final clincher, which again I share with my students, is that the high-level questions become infinitely easier and more doable once you know the text. A student who has already gained a solid knowledge of the text in front of them has a much better chance of understanding and applying the ideas he or she is taught in order to gain those elusive extra marks.

Photo by Morgan Housel on Unsplash