Fulfilling your destiny

“Life is like a game of cards. The hand you are dealt is determinism; the way you play it is free will.”

Jawaharlal Nehru

Currently, I am obsessively plugged in to an audiobook, the latest release from my favourite author, Liane Moriarty. Moriarty writes what is often scathingly referred to as “chick lit”: a genre which at its worst can be undeniably vacuous, but no more so than the two-dimensional thrillers churned out by authors marketed to men. The withering contempt with which “chick lit” is viewed says a lot more about how society treats the everyday lives and concerns of women than it does about this particular genre of popular fiction.

It is undeniable, although perhaps a little depressing, that Moriarty is an author unlikely to be read by vast quantities of male readers. Her stories revolve around people – mainly suburban women – and the thoughts inside their heads. Often there is an unfolding plot, but the focus is on the development of character and relationships rather than on action or suspense. Moriarty is an absolute master of the genre and writes with an effortless charm that belies her talent; the best authors make it look easy when it isn’t. It’s a great shame that more men aren’t interested in some of the things which interest women – a truth whose causes I have pondered on and off. I speak as someone who has read quite broadly and has flirted with books categorised in modern times as “lad lit”: I am a huge fan of Martin Amis and if you haven’t read David Baddiel’s forays into novel writing in this genre then you should – they are annoyingly good. So if I, as a woman, can enjoy books written from a male perspective and read by men, I find it somewhat irksome that so few men show any kind of interest in the fiction favoured by women. Anyway, I digress.

While many of Moriarty’s books (perhaps most famously Big Little Lies) focus on the lives of suburban women, some of them are intricately plotted and follow a complex set of characters, all of whom cross paths in various ways and with a myriad of consequences. Because of this, I was greatly surprised when I heard the author interviewed and she revealed that she writes without a plan. Her previous novel, Apples Never Fall, followed the tensions and anguish within a family from which the matriarch has disappeared: we spend most of the novel wondering what has happened to this character (including whether she has merely walked out of her life or has been horribly murdered by someone within it), and Moriarty reports that she too spent much of her writing time wondering the same thing. She had not, by her own account, decided what had actually happened to this key character when she began to write the book. She started with the idea of the disappearance and discovered the truth behind it along with her characters. It is perhaps this very unconventional approach to plotting that enables her to write with such authenticity – she’s not dropping hints or trying to plant red herrings in relation to the real outcome, for she has no idea what that outcome will eventually be.

I am around one third of the way through Moriarty’s latest and am gripped as ever by her writing. Here One Moment is perhaps her most ambitious novel yet as it circles around the idea of free will and destiny. In summary, the scenario is that a group of people on a flight from Hobart to Sydney are each pointed at by a woman on board the flight and told the supposed time and manner of their death. Some passengers are given what amounts to welcome news by most people’s standards (heart failure, age 95), others – inevitably – are told that they will die very young. Some are even told that their death will be as a result of violence or self-harm. The rest of the novel is about the fall-out from this thoroughly alarming and unscheduled in-flight entertainment.

One of the ideas explored in the novel is the impact that such an experience might have, not only on the feelings of those receiving the predictions but on their actions too. One of the passengers pays a visit to another “psychic” after the flight, and this “psychic” points out that his client will not be the same person after the reading as he was before it: whatever the psychic says will make the client act differently, and that in turn could alter the outcome of his life. Moriarty refers constantly to chaos theory throughout her writing – the idea that one small event in nature has a ripple effect with huge consequences elsewhere. At the point in the novel where I am right now, a mother who has been told that her baby son will die by drowning while still a child has elected to take him to swimming lessons. He takes to the lessons like the proverbial duck to water and it becomes clear that he is going to become a huge lover of swimming. As readers, we now sit with our hearts in our mouths and await the inevitable: will the mother’s decision to take her child to swimming lessons, sparked solely by the psychic’s so-called prediction, end up leading to the death of her child in the future?

The same thought experiment was run by the Greek playwright Sophocles almost 2,500 years ago. He wrote what I would argue is perhaps the most influential work of literature ever written, in the form of the tragedy called Oedipus Rex. Most people know the name “Oedipus” only as a result of Freud’s early 20th century ramblings about motherhood and sexual repression; very few people have any idea what a frankly brilliant and chilling story that of Oedipus was when it was written. It is emphatically not a story about motherhood, nor is it a story about sexual repression; to be honest I don’t think I can ever forgive Freud for making it so. Oedipus Rex is a story about destiny, about free will and about the extent to which we have control over either of those things. If you don’t know the story, it can be summarised as follows …

In ancient Greece, a king and queen are horrified to be told by an oracle that their baby son will grow up to murder his father and marry his mother. Terrified by this ghastly prediction, they send the baby away to be exposed on the hillside and die. The kindly old shepherd gifted with the unhappy task cannot quite bring himself to do the dreadful deed, so he ends up passing the baby to another ruler and his wife in a far-distant land who are childless, and they bring the baby up as their own. The baby is named Oedipus. He has no idea that he is adopted.

When Oedipus grows up, like all curious young men, he too consults the oracle and asks his destiny. The oracle tells him that he is destined to kill his father and marry his mother. Horrified, he does the only sensible thing: he removes himself from his family home and goes off on his travels, thus removing any possible risk of somehow murdering his father and marrying his mother. Oedipus believes that he has taken control: he is the master of his own destiny and he has cheated the oracle. Trouble is, remember … he doesn’t know he is adopted.

Several months into his lonely travels, Oedipus gets into an altercation on the road with an arrogant older man who tries to tell him what’s what. Long story short, Oedipus does the only thing any decent red-blooded young male would do: he kills the old fool. Afterwards, he continues on his travels and eventually comes to a kingdom which is in a bit of trouble because it’s being harassed by a nasty monster. Clever Oedipus defeats the monster by solving its riddle and – would you know it – it turns out that the king of this particular dominion has recently died and they’re in need of a chap to take over. What a stroke of luck! Oedipus marries the widowed queen – who is, granted, a little older than him but still young enough to bear children – and becomes King of Thebes. The rest, as they say, is a truly horrible history.

The whole point of Oedipus’ story is exactly the thought experiment that Moriarty is playing out in her novel. To what extent does a sense of destiny itself predetermine our actions? To what extent do people inevitably fulfil the path that they are told lies in front of them? It is easy to point out that if the oracle had not said what it said – on either occasion – the story of Oedipus would not have unfolded as it did. In the ancient world, the story was taken as a morality tale about man’s arrogance: humans are convinced that they can outwit the gods and cheat their destiny, and that arrogance begins and ends with asking the question. If nobody had asked, would nothing have happened? Does the asking trigger the event?

It is easy to assume that these big philosophical questions don’t affect our lives on a day-to-day basis, but in fact this loop of thought is inescapable and resonates in daily life. During my career, a trend of sharing what were laughably called “predicted grades” with students came and (thankfully) went. These grades were not teacher predictions (although teachers are indeed asked to make such psychic predictions and that nightmare continues) but based on a crushing weight of data that looks at “people like Student A” and attempts to make a mathematical prediction about how “a person like Student A” is most likely to perform in an exam. All sorts of data get included in the mix, from prior academic performance to socio-economic background. The happy news that a bunch of data analysis that hardly anybody fully understands “predicts” that Student A is likely to get a Grade 3 or below was – until alarmingly recently – shared with Student A. What an absolute travesty. I will never forgive the system for sitting a child down and telling them that the computer says they’re likely to fail. Likewise, I have seen children who are “predicted” a line of top grades spiral out of control under the pressure. For heaven’s sake stop telling kids what “the data” (our new name for the divine oracle) says about their destiny. It’s a seriously grotesque thing to do.

For similar reasons, I know parents who are understandably jumpy about their children being labelled as anything. Who doesn’t remember, well into middle age, having “he’s shy” or “she’s anxious” said over their head while they were going through an entirely normal phase of being wary of strangers? Before you know it, the label of “shy” or “anxious” or whatever the grown-ups have decided befits you becomes you. I am absolutely in support of my friends who will not have their children referred to in this way: if history teaches us anything, it’s that people tend to fulfil their destiny. So be careful what path you pave.

Photo by Johannes Plenio on Unsplash

In Praise of Idleness

When I was around 13, my grandfather advised me to read an essay by Bertrand Russell called In Praise of Idleness. I don’t recall what his point was at the time, but it was probably a side-swipe at what he rightly saw as my privileged middle-class upbringing compared to his own. Well, better late than never, so almost 40 years later and 30 years after his death, I have taken my grandfather’s advice and read the essay.

Since giving up my full-time career at the chalkface, I have been plagued by more or less the same question from everyone. “What are you going to do with yourself?” asked my mother. “What have you been up to?” asks my sister, almost every time we speak. “Doing anything today?” asks my hairdresser every six weeks. “What are you doing for the rest of the day?” asked the fellow tutor I met for a Zoom coffee yesterday. Now that I am no longer working in a job that is universally acknowledged to be all-consuming, people have suddenly become fascinated by what I must be doing with my time. The pressure is on to come up with something life-affirming that I can cite as evidence for the validity of my existence on earth. Usually, I come up blank.

Partly, I think, it’s because I struggle with this kind of small talk. While I literally cannot bear to outline to someone else the uninteresting activities that will, inevitably, form part of my day, most people seem only too happy to share the most mundane aspects of their lives under the apparent assumption that everyone else is fascinated by them. In the modern world, this is evidenced by the quite remarkable plethora of social media posts in which people inform everyone else of every single unremarkable act they perform. Doing was always the point: if you recall, when Facebook first came up with the idea that people should post updates on their own lives, the status bar read “Emma Williams is …” and you had to fill in the rest. There was a huge campaign to remove “the mandatory is” and Facebook listened. The rest is history, if you can bear to read it. I can’t.

You see, I simply cannot be bothered to say, “well, today I’ll go to the gym, then I’ll come back and write my weekly blog post, then I might do 5 minutes of mad dancing because that’s what I do for my regular dose of HIIT to get my heart rate up, then I’ll make coffee and I might treat myself to an episode of The Mentalist on Amazon Prime as I’m really into that, then I’ll finish my work on the last OCR set text, which I need to translate and put onto Quizlet for my students, then after lunch I’ll make sure my evening’s lessons are prepared. Oh, and there’s a load of washing to do, Sainsbury’s are coming with our groceries and David wants me to put the lawn sprinklers on and I might also go to Morrisons at some point.” Are you bored yet? I mean … who gives a rat’s ar*e? And that’s all before I actually start tutoring in the evening, when I will do the work I am paid to do. See, I am perfectly happy with my day today, indeed I am really quite looking forward to it: that does not mean it’s interesting to anybody else.

Russell would argue, I think, that I have not had sufficient education to make the most of my abundant leisure time. According to his essay, an education is required for the wise use of leisure, and without it we are prone to time-wasting, examples of which include watching the football. Despite this, Russell is actually trying enormously hard not to be a snob, and I love the fact that the whole point of his essay was to challenge the assumption that “workers should work” and to float the idea that everyone, including the working classes, should be working significantly fewer hours: his model actually argues for everyone working four hours per day instead of eight. He attacks the futility of unfettered capitalism (although, by the by, he’s got some spectacularly naïve views on the equity of Russia’s economy) and takes a very pleasing swipe at the way in which the Christian work ethic has been used as a mechanism of control, to keep the workers in their place. He ruefully observes that “Athenian slave-owners … employed part of their leisure in making a permanent contribution to civilization which would have been impossible under a just economic system”, although he later goes on to show some insight into the fact that not all of the intelligentsia are deserving, remarking that, for every Darwin, “against him had to be set tens of thousands of country gentlemen who never thought of anything more intelligent than fox-hunting.” Bravo, Bertrand.

All in all, I really enjoyed the essay and am drawn to read the next one in his collection, entitled “Useless Knowledge”. I am still not sure what reason my grandfather had for recommending it to me, but it is rather nice that his recommendation has come in handy some 40 years later, helping me express my current thoughts on my relatively free and easy life compared to the one I was leading a few years ago. One of the things that I have taken back with both hands is the opportunity to read, which I all but lost during my busiest times. Would I have found the time to read a philosophical essay when I had a full day’s teaching ahead of me? Like heck I would. Such time in itself is a luxury and one which I value enormously.

Photo by Clay Banks on Unsplash

Reflections on Failure

Well over two years ago, I resolved to write a blog post every single week. So far, I have managed to do so. One of the many ways that this has been possible is that I forgive myself when the writing and/or the idea I come up with in one particular week is not exactly going to set the world on fire. If I am going to achieve the goal of writing something every week, I need to accept that not every single post is going to be a work of art. I can’t even imagine the pressure of coming up with a weekly Op Ed for a respected newspaper or journal. Indeed, the only paid writing gig I ever had was a fortnightly one, and even that I had to resign from after a while; the expectation of producing a well-researched, top-quality piece of writing on a topic of interest relevant to the right readers was something I simply couldn’t cope with. And by the way, the going rate for writing of this sort is utterly dismal – well below minimum wage if you calculate your earnings by the hour.

One of my earliest blog posts remains one of the ones that I am most fond of. It’s called “The one that got away” and was a reflection on the student that I remember with the most regret from my career at the chalkface. A student I felt I had failed. I’m a huge believer that one should acknowledge one’s failures and reflect on them. Too often we are encouraged not to even use the word “failure” but I think it’s important. All of us fail. It’s not a dirty word, it’s a part of a full life well-lived and an ambitious career. “Show me a man who has never made a mistake and I’ll show you one who has never tried anything” is a viral internet quote which – in various forms – has been attributed to pretty much everyone including Albert Einstein, Theodore Roosevelt and – my personal favourite – Joan Collins. Whoever said it (and today I truly cannot be bothered to try and find out who did so) was absolutely right.

My failures in tutoring have been few and far between. I say this not to boast about how great I am at what I do but rather to demonstrate how much easier and more powerful one-to-one tutoring is compared to classroom teaching. If you are an expert in your subject (by which I mean the academic content and the expectations of the relevant examinations), plus if you’re used to communicating with students of the age you’re trying to work with, tutoring is a breeze. One-to-one work is so phenomenally powerful that you really don’t need to be a genius at it for it to have a tangible impact. I like to think that I am good at what I do, but compared to the ambition of being a good classroom teacher, being a really good tutor is remarkably easy. Being a really good classroom teacher? Oh my goodness it’s hard. Like you wouldn’t believe. I cannot emphasise this enough. You wonder why teachers are leaving the profession in droves? I’ll give you a hint. It isn’t the salary.

Being good at what you do does not mean you will not fail sometimes. I keep a record of students who have discontinued (as opposed to those who have simply reached the end of their time with me because they have completed the course or finished their exams). There are not many, but given the sheer volume of students that I work with there are always going to be a few. This week I decided to reflect on each case and try to glean what – if anything – can be learned from them. It turns out, they all have one thing in common.

Generally speaking, the underlying reason why a student will discontinue working with me is that they remain reluctant to engage with the sessions. This is sometimes because the tutoring has been foisted on them, rather than something they have asked for themselves, or sometimes because they realise that they will have to do some work during the sessions – a student may have asked for help, but the process is not going to work unless they are up for a challenge. I have worked with scores of students who are deeply reluctant to work independently outside of the sessions, and I always make it clear to the bill-payer that the impact of what I do will be limited when this is the case; yet so long as a student engages with the sessions during our one-to-one time together then it is still possible to have some kind of impact on them. By contrast, a student who really won’t engage with the learning process will not progress. It is often because they are afraid of failure, and while I’m pretty experienced at helping a disaffected student overcome this barrier, I accept that I simply cannot win them all.

So, what can I do to mitigate such failures? After all, there is no point in reflecting on failure unless it is to improve. Well, something I have got better at is the early identification of students who are not responding well to the process. I would much rather get in touch with home and have this frank conversation than continue to take someone’s money when I believe that I am unlikely to have much of an impact on that student’s outcomes. Sometimes, that very frank conversation can jolt a student into realising that they have been resistant to the process and if they actually do wish to continue with the tutoring then it’s usually the catalyst towards engagement and progress – a turnaround in what might otherwise have been a failure. If the student does not want to keep working with me, it gives them the opportunity to say so, which is fine too.

Beyond that, another way in which I have tried to mitigate the risk of failure is to specialise more and more in the areas I know best. I am a GCSE expert and, now that I am so much in demand, that’s what I offer. I work with students who are preparing for the GCSE or who have it in their sights and am no longer advertising myself as a tutor who works outside of this field: my expertise at working with that material and that age-range is greatest, and the more I stay within my field of expertise, the more likely the process is to succeed. My advice would be to be wary of tutors who offer a bounteous range of subjects and/or levels: the best tutors hone their skills in one particular offering and become a genuine expert in what they do.

One of the things I tell my students is that mistakes are important. They inform me of their misunderstandings and misconceptions, so they’re a hugely important part of the tutoring process. Mistakes and failures make us better at what we do and we should embrace them and learn from them, not see them as a reflection on us as a person or a professional. It is not the failures that define us, but rather how we respond to them. Failures can make us more likely to succeed in the future.

Photo by Kind and Curious on Unsplash

Following the Herd

At primary school, I rarely played with other children. For me, playtime usually meant a walk around the edges of the playground, observing others and thinking to myself. There were lots of reasons why I found it difficult to connect with my childhood peers, none of them particularly interesting or unusual, but I have always wondered whether my early childhood experiences have shaped my temperament: to this day, I’m not much of a joiner.

More recently, I have begun to ponder whether in fact my own biology has had more influence on my personality than I would like to admit: as someone who suffers with extremely poor eyesight and less-than-perfect hearing, I am naturally quite cut off from much of the world. In recent years, I have begun to realise how this has in many ways defined how I relate to others and in turn how others respond to me. Motivated by a desire for acceptance, I have always tried to disguise my disabilities, to the extent that many people are genuinely surprised when I admit to them. The price I have paid for this – ironically – is that I have gained a reputation of being “stand-offish”, with many people firmly convinced that I have ignored or blanked them over the years. So, for anyone reading this who is convinced that I have overlooked them in the street or in the corridor (especially to whomever it was that made me aware of it by writing a rather nasty comment on this blog): the truth is, I probably didn’t see you or hear you. I’m sorry. It wasn’t deliberate.

Large scale groups have always made me feel uncomfortable and I hate the idea of “losing myself” in a crowd. The thought of going to a football match terrifies me. I did a few big concerts in my youth but struggled with the sheer number of people around me and I would not do it again now I’m older. A crowd takes on a mind-set and a force of its own, one that’s both independent from and beyond the control of the individuals it contains. Recent events have served as a horrific and tangible reminder that herd mentality – in all its forms, both ancient and modern – is something that should frighten us all.

Experience has certainly taught me that being part of a group is not in my nature and broadly speaking I am proud of the fact that I won’t play ball for the sake of staying on the team. It may not be my most attractive quality, but it’s one that will drive me to raise the alarm whilst everyone else stays silent. It makes me the kid who will shout that the emperor’s got no clothes on. Some employers have thanked me for this, others have not: it takes a robustly confident leader to tolerate being told that they’re naked in front of the world. There are times when I have reflected that I could have led a somewhat easier life – certainly professionally – had I been more willing to march in time, but generally speaking I quite like being an outsider. This is not to say that my failure to merge cohesively with a group has not caused me some anguish over the years – it can be a lonely existence. In the past, it has meant being kicked out of a group of writers with whom I shared many values, due to my innate inability to agree with them on everything – or at least, to pretend that I did. It meant the Editor of the magazine blocking all contact with me as “no longer an ally” because I asked questions and defended other people’s right to do so. As a lifelong supporter of social justice, the increasing phenomenon of these kinds of activists, who denounce all forms of debate or discussion, has come as a genuine shock to me.

Until a few years ago, I believed that the fight for equality would usher in a new era of empathy, diversity and understanding – a new age, in which our ability to relate to each other would be improved by our ever-evolving understanding of how human rights intersect and – at times – conflict. It is what being a liberal is all about. Yet it seems to me that most of my so-called liberal allies have been taken over by a collective fear of rejection. Like the teenagers I have worked with over the years, they constantly check in with each other to affirm whether or not what they think is acceptable – and who can blame them? The consequence of dissent these days is excommunication from the tribe. Man, as Aristotle said, is a social animal: rejection is frightening and dangerous.

In the past, I found myself briefly drawn to people who described themselves as “libertarians” – only to find once again that there was a hymn sheet of horrors that I was expected to sing from if I wished to be initiated into the tribe. According to most of the Americans that I met online, to be accepted as a “libertarian” one must be in favour of guns. Lots of guns. One must agree that the act of carrying a gun is a liberating experience (I mean – what?) and certainly that the act of carrying one is none of the government’s business. Every time I tried to propose a different line of thinking (held by most sane individuals on this side of the Atlantic), I was simply told that I was “not a libertarian”. So there we are. Another crowd to watch from the sidelines as they descend into madness.

Another “libertarian” approach that I struggled to respect was the puerile desire to offend, bolstered by the dubious claim that this is somehow a noble and worthwhile antidote to the equally tedious culture of taking offence. Certainly, I relish challenge and debate, and I also believe that free speech is more important than the inevitable risk of causing offence to some. As Salman Rushdie said following the horrifying attacks on the staff at Charlie Hebdo in 2015, “I … defend the art of satire, which has always been a force for liberty and against tyranny, dishonesty and stupidity.” But in an article on what he has termed “cultural libertarianism,” Breitbart author Allum Bokhari argued that “deliberate offensiveness plays an important role in the fight against cultural authoritarianism, … showing that with a little cleverness, it’s possible to express controversial opinions and not just survive but become a cult hero.” This surely sums up the unambitious and self-seeking aims of the internet-famous shock-jocks, who make it their business to offend – preening contrarians, whose sole function is to cause shock and awe, their online communications a heady mix of clickbait, worthless insults and self-aggrandisement. There is no evidence whatsoever that anyone’s personal liberty is furthered by such infantile sneering, yet swarms of self-proclaimed free-speech advocates rejoice in this toxic effluence with excited applause.

Maybe I’m still that little girl on the edges of the playground, the one with the problem joining in – but as I stand at the periphery, I see the herd mentality all around me. At its best, it gives us a sense of solidarity as we strive for the greater good or find our feet in the world. At its worst, it gives us mindless savagery, the kind of collective violence exemplified and explored in William Golding’s Lord of the Flies. On a day-to-day level, however, it results in something much more mundane and insidious: it endorses mediocrity and prevents us from thinking.

Photo by Steffen Junginger on Unsplash

This is an updated and adapted version of an article I wrote originally for Quillette magazine in 2016.

Shooting the Moon

During the period when I was writing my PhD, my main source of temptation and distraction was an electronic card game called Hearts. This was before the turn of the 21st century and while there were indeed some strange men in some of the science departments talking about a mysterious and abstract notion called “The Internet”, most of us had not discovered it yet. So, in 1998, I had neither cat videos nor social media to distract me, but I did have Hearts. Traditional card games such as Hearts and Solitaire (which I have always called Patience) were included along with the Microsoft software on my laptop, and it turned out to be a genuinely powerful temptation when the alternative was doing some work.

Hearts is a simple game for four players (or you plus three players driven by the computer). It is an evasion game, in which you try to avoid collecting any cards in the suit of hearts and, above all, to avoid the Queen of Spades, which carries a heavy penalty. Generally speaking, the more hearts you end up stuck with at the end of a round, the worse your score, and if you end up with the Queen of Spades you are in particular trouble. I discovered all of this gradually: the motto in my family has always been, “as a last resort, read the instructions”, so in the style to which I had become accustomed, I plunged into the game and learnt the rules through trial and error.

One day, I was having such a bad round that it became clear that I was going to lose every single trick. Amused, I continued on my losing streak, keen in fact to make sure that I did indeed lose every single one, purely for entertainment. (Please remember – the alternative was neoplatonic metaphysics.) It was through this throwing in of the towel that I discovered the phenomenon of “shooting the moon”: it turns out that in Hearts, if you lose every single trick and thus collect every card in the suit of hearts as well as the Queen of Spades, you actually win that round. It’s an all-in move, like placing all your chips on a single roll of the dice. I never managed to replicate the phenomenon and so only ever won by shooting the moon on that one, accidental occasion.
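For anyone who prefers their card games spelled out in code, here is a minimal sketch in Python of how a round of Hearts might be scored, shooting the moon included. The function name and structure are my own invention, not anything from the Microsoft version; the point values (one point per heart, thirteen for the Queen of Spades, and 26 dumped on everyone else when one player collects the lot) follow the standard rules as I understand them.

```python
# A hedged sketch of Hearts round scoring. Lower scores are better:
# each heart taken costs 1 point and the Queen of Spades costs 13.
# But if one player takes ALL 13 hearts plus the Queen, they have
# "shot the moon": they score 0 and every other player takes 26.

def score_round(hearts_taken, queen_taker):
    """hearts_taken: dict mapping player name -> hearts collected (0-13).
    queen_taker: the player stuck with the Queen of Spades."""
    # Did anyone collect every penalty card in the round?
    shooter = None
    for player, hearts in hearts_taken.items():
        if hearts == 13 and player == queen_taker:
            shooter = player

    scores = {}
    for player, hearts in hearts_taken.items():
        if shooter is not None:
            # Losing every trick flips into a win for the shooter.
            scores[player] = 0 if player == shooter else 26
        else:
            scores[player] = hearts + (13 if player == queen_taker else 0)
    return scores

# A losing streak turned triumph: one player takes everything.
print(score_round({"Emma": 13, "P2": 0, "P3": 0, "P4": 0}, "Emma"))
# {'Emma': 0, 'P2': 26, 'P3': 26, 'P4': 26}
```

The all-or-nothing shape of the rule is what makes it such a gamble: take 12 hearts and the Queen and you have a dreadful score of 25, but take the thirteenth heart as well and your score drops to zero.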

In the last couple of years, I have become aware of an increasing number of people who are keen for their children to “complete the syllabus early”. Some parents have expressed their wish that the entire specification be covered by the end of Year 10 (good luck with that!) and others are adamant that they want the most complex concepts taught early or taught from the beginning. I have no idea where this notion has come from, but it wouldn’t surprise me if it found its origins on some online parent forum somewhere. Some high-achieving schools used to push this kind of rhetoric, but with the shift in 2018 to specifications which are far more content-heavy, most schools find themselves struggling to complete the entire syllabus on time in some subjects, never mind early. The desire to push ahead also fails to take into account the rapid development that children are undergoing in their mid-teens. What a child is capable of towards the end of Year 11 may be poles apart from what they were capable of at the start of Year 10. On the other hand, it may not. It’s impossible to predict and – lest we forget – children are not machines.

One or two parents I have spoken to are so utterly wedded to the idea that the syllabus must be completed months ahead of the exam that they simply cannot be persuaded otherwise. Sometimes they claim that their child is vastly ahead in another subject – often mathematics – and express frustration that this is not the case in all. In the past, I might have accepted their take that their child was indeed in this position and argued that languages are different. But now I am married to a man with a mathematics degree, one who rues the fact that – on reflection – he did not have the intellectual maturity to cope with the more nebulous fields of study he was exposed to during his degree, and it gives me pause. Is there honestly any subject in which a child or a young adult, however intelligent, can advance so rapidly without paying a price further down the line? Do they really understand what they are doing, or will it all come crashing down like the proverbial house of cards when they get a little further down the road? My feeling is that unless your child is some kind of savant (and to date I have never met one of those, so I’m telling you your child isn’t one of them) then you’re taking quite a risk with this approach.

Many parents who want their children to do well are concerned about the trickiest concepts in the syllabus. Sometimes they have feedback from their child’s schoolteacher that they have struggled with one or more of these more complex concepts. What some people find difficult to accept is that much of the time, it is not the tricky concept that is the problem – the problem lies deeper, in the foundational studies through which their child may have been whisked at high speed, leaving tiny, often imperceptible gaps in their knowledge. Like the invisible holes in the enamel of a tooth, these gaps store up trouble for the future and before you know it you’ve got a gaping cavity in front of you. It is the rarest of occasions when this is not the case; indeed, it is often the children who have historically done well in a subject who are most at risk. The better a child appears to be doing in a subject, the harder and faster they are pushed and the greater the number of tiny, undetectable cracks that form, which will make their presence known in the future. It’s the nature of the beast and nobody’s fault, but parents do need to trust a tutor who tells them that it’s time to go back to basics.

The overwhelming joy of what I do now is having the one-to-one time in which to genuinely test and shore up a child’s fundamental understanding. Asking them the same question in multiple different ways to ensure that they possess a genuine grasp of the topic, not a superficial ability to provide a text-book answer to an anticipated question worded in a style that they recognise. Asking them to define a grammatical term and give an example. Most of all, asking them to explain why a phrase or a sentence translates the way it does – does their translation stem from the ability to skate on thin ice or from a genuine grasp of the underlying principles?

You see, shooting the moon is exciting. But risking it all on one turn of pitch and toss is – as any recovered gambler will tell you – a seriously bad idea. Success comes from baby steps, strong foundations and a genuine grasp of how things are put together. Success in study is a marathon, not a sprint, and if a marathon runner started the race with the speed of a 100-metre sprinter, they would never make it to the end, never mind win. Early and fast does not mean better – quite the opposite. It can mean failure. So be patient and trust in the process. Shooting the moon is both elusive and risky and there are infinitely safer ways to win a round of cards.

Photo by Sam Tan on Unsplash

See you in three weeks

This week, at a garden party, I chatted to a man in his 80s who reminisced about a school trip he went on in the 1950s. My neighbour’s father was given the opportunity to visit Dubrovnik in what was then Yugoslavia, when another local school had a few spare places for students to join the trip. Nothing seemed unusual or particularly surprising about his story until he reported their arrival in the city.

“So, the teacher pointed out some features at the train station and said that we should meet at the same spot for our return journey,” he explained. “After that, the teacher said, we’ll see you in three weeks.”

There was a pause, while my husband and I stared at Geoff in silence.

“I’ve got no idea what he and the other staff did from that point on,” he continued, “but we didn’t see them until it was time to go home.”

I then had to check in with him that I had heard him correctly.

“Wait … they just left you there to get on with it? For three whole weeks?”

“Absolutely,” he said.

Well. Needless to say, my flabber was gasted. Geoff went on to talk about his memories of the trip, which boiled down to basic survival. He and his friends bought some eggs from a local farm and discovered that every single one of them was bad. He expressed regret that some diaries he had kept at the time had gone missing during a recent move. Let’s hope they turn up at the bottom of an unpacked box somewhere, as they will surely make for fascinating reading when lined up next to the experiences of children today in a school trip setting.

Anyone who knows anything about school trips in a modern setting will be equally struck by the difference between Geoff’s experience and that of students now. I have written before on the pressures of running school trips, most especially school trips abroad, and indeed that piece of writing remains my most-read blog by a considerable margin: it’s been read tens of thousands of times and clearly resonates with teachers who are still faced with the challenge of working in loco parentis. In summary, the original post was an exploration of a case in which teachers on a school trip abroad were unjustly charged with “manslaughter as a result of gross negligence” in a French court, seven years after a child had died in an accident on a trip while in their care. Fortunately, the judge threw out the case, but the distress and suffering undergone by those three young professionals can only be imagined. The post also explained how I made the decision several years ago to stop running school trips abroad, purely because I could no longer cope with the stress and anxiety of doing so.

I would never suggest that Geoff’s experience is one we should try to replicate in the modern world; it displays a level of naivety and foolishness on the part of the staff back in those days that I can only wonder at. But it has got me thinking again about what was expected of teachers in the past compared to what is expected of us now. It has also caused me to think deeply about the vast chasm between the day-to-day experiences that were once readily available to young people and what we assume is appropriate for them now.

One of the things that Jonathan Haidt explores in his recent book The Anxious Generation is the degree to which children now experience near-permanent adult supervision (to the extent that one might call it surveillance) and thus increasingly less real-world freedom and independence as they grow up; he contrasts this with the complete lack of supervision which most youngsters have when it comes to the online world, which is where – he argues – the worst dangers actually lie. He calls the effect on Generation Z – the generation who grew up with smartphones in their pockets – “the great rewiring” and urges society to roll back the online freedoms we have grown used to and to replace them with more real-world freedom and risk. Haidt is a professor at New York University and collaborates often with the American psychologist Jean Twenge, who was one of the first psychologists to argue that the rising rates of poor mental health among Generation Z can be attributed to smartphones. Sceptics of such research argue that young people simply have more things to feel anxious and depressed about, but in my opinion Haidt makes his case pretty persuasively. Earlier generations also grew up in the shadow of war and global instability, he points out, yet such collective crises in the past did not manifest themselves in psychological distress; quite the opposite, they often engendered a sense of greater social solidarity and purpose, a net positive for mental health. By contrast, the evidence linking mental illness to smartphones and the inescapable and thus addictive access they bring to social media is genuinely alarming.

Haidt’s argument builds upon a case he made in his previous book, The Coddling of the American Mind, that overprotectiveness has contributed to the mental health crisis. He argues that children are what he calls “antifragile”: they need exposure to varied experiences and manageable risks in order to develop resilience, and it is precisely this exposure that Generation Z has been denied. Haidt argues in both books that children ought to be given greater freedom to play unsupervised, free from adult surveillance.

In my previous blog post I mentioned that in my final few years at the chalkface it was quite normal to walk down the school corridor and find a child outside every classroom – not necessarily because they had been thrown out of class, but because they were refusing to enter it in the first place due to the extreme level of their anxiety. I have no concrete answer as to why this is happening, but happening it is. There is an emerging school of thought that the well-meaning work increasingly done in schools to address the issue of children’s mental health has in fact done more harm than good. I have recently read Bad Therapy by Abigail Shrier and her research most definitely gives cause for concern. Shrier is a journalist and a controversial figure for some, but her concerns echo those raised by numerous psychologists, who talk of our modern tendency to pathologise normal feelings (who didn’t feel genuinely overwhelmed with fear and at times bone-crushingly miserable during their teenage years?) and push children down a path of sickness rather than allowing them to negotiate their way through their feelings and trust that the storms will pass. These concerns are summarised quite nicely here, in a piece from 2022 in the Telegraph.

So, Geoff’s brief and cheerful reminiscence has left me with much to think about. While none of us would dare to send our children out to a foreign country to fend for themselves for three weeks, perhaps we can learn from what was presumably the innocence of our forefathers. There was an enormous plus side to growing up and living without fear; if that kind of life and freedom produced the vibrant man that Geoff remains, then perhaps they weren’t getting it so wrong in the 1950s.

Photo by Dino Reichmuth on Unsplash

Endtimes

The toe-curling indignity of Joe Biden’s current situation is a lesson to us all. A lesson in what happens when a system favours old guys and then wonders why those old guys won’t move over when it’s time. A system that appears not to have considered what might happen if it’s desperately obvious that one of those old guys should take a back seat, but the dude wants to stay behind the wheel. A system so unwieldy and expensive that the only people who can afford to play the game are – as a general rule – those same old white guys, the ones who don’t want to take their hands off the wheel.

How does anybody know when it’s time to stop? Biden’s painful crumbling in front of the world has reminded me how as a youngster I promised myself fiercely that I would know when my time was done. To me, this does not just apply to when it’s time to retire, but throughout your career when you’re done with a particular role. Whatever I took on in education, I gave it my best shot and then handed it over. I made whatever changes I felt were needed, led people and adjusted systems to what I felt worked best, but always handed over the role when I had run out of ideas. Every. Single. Time. Quite literally my worst nightmare was the idea that people were saying behind my back “why doesn’t she just go?” The thought genuinely filled me with dread. Happily, due to my overwhelming desire to avoid this situation, I’m pretty sure it’s never happened.

On this side of the Atlantic, whatever your politics, I think it’s fair to say that our outgoing government was running out of ideas. Our system is based on a pattern of rotation, ensuring that nobody gets too stale in their role: when a cabinet and the government in general is fresh out of new proposals, we vote them out. The whole process runs on a cycle and – broadly speaking – it works for the best. Only the most partisan (and those who haven’t lived very long) really believe that seismic change will come with a change of government, but everyone can get behind the idea that a fresh line of buttocks on seats in the cabinet office can only be a good thing. Time for something different, for those who are not worn down by cynicism to give it a go. Nothing could be more true this time, when it’s fair to say that the outgoing government has had some issues.

Although not a great follower of any kind of sport, I did smile to myself this time last year when the 20-year-old Carlos Alcaraz smashed Novak Djokovic’s bid for his 8th Wimbledon title. You see, however outstanding you are in your field, there will always be the next youngster snapping at your heels. That’s just as it should be. Personally, I find it inspiring and comforting that there is always somebody coming up through the ranks who is likely to do your job better than you did. I do not find this a threat. I am at peace with the contribution that I made at the chalkface and continue to make as a tutor in extremely high demand – experience counts. But I am genuinely delighted to have met the next person who will be doing my old job in the comprehensive school I left two years ago and to find that she is enthusiastic, passionate and bursting with ideas. Nothing would give me more joy than to see the role flourish and grow. It is not my possession, it is my legacy – and a legacy only works when there are new people keen to do something even better than you did.

Will Biden finally realise that it’s time to step back and spend more time on his sofa – one that isn’t in the Oval Office? One can only hope that he is surrounded by advisers with courage, not the usual troupe of sycophants that great world leaders tend to find themselves hemmed in by. Will he listen? The message seems to be that it’s unlikely. The strongest and best leaders I have ever known are the ones who listen to the things they do not want to hear. As someone who is quite good at opening their mouth when others tend to keep theirs closed, I have often found myself to be the reluctant Cassandra in the room. In my experience, the best leaders will listen, nod and thank you for having the gumption to challenge them. The worst will destroy you for speaking the truth. Quite how and why the Democrats have ended up in this position is for those who understand US politics in depth to explain, but I suspect that it’s inertia that has brought them here. Nothing is worse than doing things as they’ve always been done for no other reason than the fact that they’ve always been done that way. Presidents always run for a second term, even if they’re in their 80s and showing clear signs of deterioration despite the best healthcare that their capacious wallet can buy.

Photo by Wonderlane on Unsplash

eligo, eligere, elegi, electus

Given the undeniable unfairness baked into Roman society, it might be a surprise to some that the Romans embraced a democracy of sorts. Only a small fraction of people living under Roman control could actually vote, but male citizens during the period when Rome was a Republic did have the opportunity to cast their vote for various administrative positions in government. The Latin verb “to choose”, which forms the title of this blog post, is what produced the participle electus and gives us the modern word election.

In the 6th century BCE, with the overthrow of the Roman monarchy, the city-state of Rome was re-founded as a Republic and by the 3rd century BCE it had risen to become the dominant civilisation in the Mediterranean world. The ruling body known as the Senate was made up of the wealthiest and most powerful patricians, men of aristocratic descent. These men oversaw both the military campaigns that brought expansion and wealth to Rome and the political structures that managed its society. At the beginning of the Republic, only the Consuls were elected, but in later years Roman free-born male citizens could vote for officials in around 40 public offices which formed a complex hierarchical structure of power. Yet this public performance of voting did not offer the citizens any kind of real choice. If you’re feeling depressed about the choices offered to you in your polling booth today, take heart: things were considerably worse two thousand years ago (even if you were a man).

Candidates for office under the Roman Republic were originally selected by the Senate and were voted for by various Assemblies of male citizens. These Assemblies were stratified by social class and the weighting was heavily skewed in favour of the aristocracy. In the early years of the Republic, candidates were banned from speaking or even appearing in public. The Senate argued that candidates should be voted for on the merit of their policies, rather than through rhetoric and personality; in truth, it meant the general public had no real opportunity to hear candidates’ arguments or indeed to hold them to account. In the later Republic the ban on public oracy was lifted and the empty promises so familiar to us today abounded, alongside some good old-fashioned bribery which – while theoretically illegal – was widespread. As the practice of electoral campaigning developed, things did begin to change, with the pool of candidates no longer tightly limited to a select group of aristocrats under Senatorial control. In the long term, however, this led to even greater misery for the citizens. They lost what little democracy they had during the Roman revolution, when what should have been a righteous and deserved uprising against the ruling oligarchy ended up turning into something arguably worse. Rome’s first ruling emperor, Augustus, claimed that voting was corrupt and had been rigged by the Senate for years in order to perpetuate the power of a handful of aristocratic families. His neat solution was to abolish voting altogether. Be careful what you wish for?

Once the early ban on public oracy was lifted, a key component of public campaigning during the Republic was canvassing for votes in the Forum. A candidate would walk to this location surrounded by an entourage of supporters, many of whom were paid, in order to meet another pre-prepared gathering of allies in the central marketplace. Being seen surrounded by a gaggle of admirers was hugely important for a candidate’s public image and was worth paying for. Once in the Forum, the candidate would shake hands with eligible voters aided by his nomenclator, a slave whose job it was to memorise the names of all the voters, so that his candidate could greet them all in person. The man running for office stood out in the crowd by wearing a chalk-whitened toga called the toga candida: it is from this that we get the modern word candidate.

To further attract voters among the ordinary people, candidates gave away free tickets to the gladiatorial games. To pay for such a display a candidate either had to be extremely wealthy, or had to secure the sponsorship of wealthy friends. Cases are documented of men ending up in ruinous debt as a result of their electoral campaigning. Several laws were passed attempting to limit candidates’ spending on banquets and games, which evidences the fact that the Senate didn’t like electoral corruption except when it was in charge of it.

Democracy under the Roman Republic was very much controlled by the select few male members of the aristocracy who held seats in the Senate. They essentially held all of the power, having been born into wealthy patriarchal families. The majority of people who inhabited the Roman world, including women and slaves, were not allowed to vote. It is striking, not to say infuriating, how many modern sources on Roman voting talk about “citizens” and “people” without seeming to feel any need to clarify that they are talking about male citizens and male people only. We do have evidence that women in the wealthiest families put their money and their energy behind their preferred male candidates, usually because they were members of the same family. Electioneering in the form of visible graffiti in Pompeii evidences women’s support of their husbands, fathers and brothers, but this was all produced by women of considerable means; what the poorest women in society thought and felt about the men who controlled their lives is anybody’s guess.

Cicero denounces Catiline in the Senate by Cesare Maccari (1840–1919).
Palazzo Madama, Rome