Reflections on Failure

Well over two years ago, I resolved to write a blog post every single week. So far, I have managed to do so. One of the many ways that this has been possible is that I forgive myself when the writing and/or the idea I come up with in one particular week is not exactly going to set the world on fire. If I am going to achieve the goal of writing something every week, I need to accept that not every single post is going to be a work of art. I can’t even imagine the pressure of coming up with a weekly Op Ed for a respected newspaper or journal. Indeed, the only paid writing gig I ever had was a fortnightly one, and even that I had to resign from after a while; the expectation to produce a well-researched, top-quality piece of writing on a topic of interest that was relevant to the right readers was something I simply couldn’t cope with. And by the way, the going rate for writing of this sort is utterly dismal – well below minimum wage if you calculate your earnings by the hour.

One of my earliest blog posts remains one of the ones that I am most fond of. It’s called “The one that got away” and was a reflection on the student that I remember with the most regret from my career at the chalkface. A student I felt I had failed. I’m a huge believer that one should acknowledge one’s failures and reflect on them. Too often we are encouraged not to even use the word “failure” but I think it’s important. All of us fail. It’s not a dirty word; it’s part of a full life well-lived and an ambitious career. “Show me a man who has never made a mistake and I’ll show you one who has never tried anything” is a viral internet quote which – in various forms – has been attributed to pretty much everyone including Albert Einstein, Theodore Roosevelt and – my personal favourite – Joan Collins. Whoever said it (and today I truly cannot be bothered to try and find out who did so) was absolutely right.

My failures in tutoring have been few and far between. I say this not to boast about how great I am at what I do but rather to demonstrate how much easier and more powerful one-to-one tutoring is compared to classroom teaching. If you are an expert in your subject (by which I mean the academic content and the expectations of the relevant examinations), plus if you’re used to communicating with students of the age you’re trying to work with, tutoring is a breeze. One-to-one work is so phenomenally powerful that you really don’t need to be a genius at it for it to have a tangible impact. I like to think that I am good at what I do, but compared to the ambition of being a good classroom teacher, being a really good tutor is remarkably easy. Being a really good classroom teacher? Oh my goodness it’s hard. Like you wouldn’t believe. I cannot emphasise this enough. You wonder why teachers are leaving the profession in droves? I’ll give you a hint. It isn’t the salary.

Being good at what you do does not mean you will not fail sometimes. I keep a record of students who have discontinued (as opposed to those who have simply reached the end of their time with me because they have completed the course or finished their exams). There are not many, but given the sheer volume of students that I work with there are always going to be a few. This week I decided to reflect on each case and try to glean what – if anything – can be learned from them. It turns out, they all have one thing in common.

Generally speaking, the underlying reason why a student will discontinue working with me is that they remain reluctant to engage with the sessions. This is sometimes because the tutoring has been foisted on them, rather than something they have asked for themselves, or sometimes because they realise that they will have to do some work during the sessions – a student may have asked for help, but the process is not going to work unless they are up for a challenge. I have worked with scores of students who are deeply reluctant to work independently outside of the sessions, and I always make it clear to the bill-payer that the impact of what I do will be limited when this is the case; yet so long as a student engages with the sessions during our one-to-one time together then it is still possible to have some kind of impact on them. By contrast, a student who really won’t engage with the learning process will not progress. It is often because they are afraid of failure and while I’m pretty experienced with helping a disaffected student to overcome this barrier, I accept that I simply cannot win them all.

So, what can I do to mitigate such failures? After all, there is no point in reflecting on failure unless it leads to improvement. Well, something I have got better at is the early identification of students who are not responding well to the process. I would much rather get in touch with home and have this frank conversation than continue to take someone’s money when I believe that I am unlikely to have much of an impact on that student’s outcomes. Sometimes, that very frank conversation can jolt a student into realising that they have been resistant to the process and if they actually do wish to continue with the tutoring then it’s usually the catalyst towards engagement and progress – a turnaround in what might otherwise have been a failure. If the student does not want to keep working with me, it gives them the opportunity to say so, which is fine too.

Beyond that, another way in which I have tried to mitigate the risk of failure is to specialise more and more in the areas I know best. I am a GCSE expert and, now that I am so much in demand, that’s all I offer. I work with students who are preparing for the GCSE or who have it in their sights and am no longer advertising myself as a tutor who works outside of this field: my expertise with that material and that age-range is greatest, and the more I stay within my field of expertise, the more likely the process is to succeed. My advice would be to be wary of tutors who offer a bounteous range of subjects and/or levels: the best tutors hone their skills in one particular offering and become a genuine expert in what they do.

One of the things I tell my students is that mistakes are important. They inform me of their misunderstandings and misconceptions, so they’re a hugely important part of the tutoring process. Mistakes and failures make us better at what we do and we should embrace them and learn from them, not see them as a reflection on us as a person or a professional. It is not the failures that define us, but rather how we respond to them. Failures can make us more likely to succeed in the future.

Photo by Kind and Curious on Unsplash

Following the Herd

At primary school, I rarely played with other children. For me, playtime usually meant a walk around the edges of the playground, observing others and thinking to myself. There were lots of reasons why I found it difficult to connect with my childhood peers, none of them particularly interesting or unusual, but I have always wondered whether my early childhood experiences have shaped my temperament: to this day, I’m not much of a joiner.

More recently, I have begun to ponder whether in fact my own biology has had more influence on my personality than I would like to admit: as someone who suffers with extremely poor eyesight and less-than-perfect hearing, I am naturally quite cut off from much of the world. In recent years, I have begun to realise how this has in many ways defined how I relate to others and in turn how others respond to me. Motivated by a desire for acceptance, I have always tried to disguise my disabilities, to the extent that many people are genuinely surprised when I admit to having them. The price I have paid for this – ironically – is that I have gained a reputation of being “stand offish”, with many people firmly convinced that I have ignored or blanked them over the years. So, for anyone reading this who is convinced that I have overlooked them in the street or in the corridor (especially to whomever it was that made me aware of it by writing a rather nasty comment on this blog): the truth is, I probably didn’t see you or hear you. I’m sorry. It wasn’t deliberate.

Large groups have always made me feel uncomfortable and I hate the idea of “losing myself” in a crowd. The thought of going to a football match terrifies me. I did a few big concerts in my youth but struggled with the sheer number of people around me and I would not do it again now I’m older. A crowd takes on a mindset and a force of its own, one that’s both independent from and beyond the control of the individuals it contains. Recent events have served as a horrific and tangible reminder that herd mentality – in all its forms, both ancient and modern – is something that should frighten us all.

Experience has certainly taught me that being part of a group is not in my nature and broadly speaking I am proud of the fact that I won’t play ball for the sake of staying on the team. It may not be my most attractive quality, but it’s one that will drive me to raise the alarm whilst everyone else stays silent. It makes me the kid who will shout that the emperor’s got no clothes on. Some employers have thanked me for this, others have not: it takes a robustly confident leader to tolerate being told that they’re naked in front of the world. There are times when I have reflected that I could have led a somewhat easier life – certainly professionally – had I been more willing to march in time, but generally speaking I quite like being an outsider. This is not to say that my failure to merge cohesively with a group has not caused me some anguish over the years – it can be a lonely existence. In the past, it has meant being kicked out of a group of writers with whom I shared many values, due to my innate inability to agree with them on everything – or at least, to pretend that I did. It meant the Editor of the magazine blocking all contact with me as “no longer an ally” because I asked questions and defended other people’s right to do so. As a lifelong supporter of social justice, the increasing phenomenon of these kinds of activists, who denounce all forms of debate or discussion, has come as a genuine shock to me.

Until a few years ago, I believed that the fight for equality would usher in a new era of empathy, diversity and understanding – a new age, in which our ability to relate to each other would be improved by our ever-evolving understanding of how human rights intersect and – at times – conflict. It is what being a liberal is all about. Yet it seems to me that most of my so-called liberal allies have been taken over by a collective fear of rejection. Like the teenagers I have worked with over the years, they constantly check in with each other to affirm whether or not what they think is acceptable – and who can blame them? The consequence of dissent these days is excommunication from the tribe. Man, as Aristotle said, is a social animal: rejection is frightening and dangerous.

In the past, I found myself briefly drawn to people who described themselves as “libertarians” – only to find once again that there was a hymn sheet of horrors that I was expected to sing from if I wished to be initiated into the tribe. According to most of the Americans that I met online, to be accepted as a “libertarian” one must be in favour of guns. Lots of guns. One must agree that the act of carrying a gun is a liberating experience (I mean – what?) and certainly that the act of carrying one is none of the government’s business. Every time I tried to propose a different line of thinking (held by most sane individuals on this side of the Atlantic), I was simply told that I was “not a libertarian”. So there we are. Another crowd to watch from the sidelines as they descend into madness.

Another “libertarian” approach that I struggled to respect was the puerile desire to offend, bolstered by the dubious claim that this is somehow a noble and worthwhile antidote to the equally tedious culture of taking offence. Certainly, I relish challenge and debate, and I also believe that free speech is more important than the inevitable risk of causing offence to some. As Salman Rushdie said following the horrifying attacks on the staff at Charlie Hebdo in 2015, “I … defend the art of satire, which has always been a force for liberty and against tyranny, dishonesty and stupidity.” But in an article on what he has termed “cultural libertarianism,” Breitbart author Allum Bokhari argued that “deliberate offensiveness plays an important role in the fight against cultural authoritarianism, … showing that with a little cleverness, it’s possible to express controversial opinions and not just survive but become a cult hero.” This surely sums up the unambitious and self-seeking aims of the internet-famous shock-jocks, who make it their business to offend – preening contrarians, whose sole function is to cause shock and awe, their online communications a heady mix of clickbait, worthless insults and self-aggrandisement. There is no evidence whatsoever that anyone’s personal liberty is furthered by such infantile sneering, yet swarms of self-proclaimed free-speech advocates rejoice in this toxic effluence with excited applause.

Maybe I’m still that little girl on the edges of the playground, the one with the problem joining in – but as I stand at the periphery, I see the herd mentality all around me. At its best, it gives us a sense of solidarity as we strive for the greater good or find our feet in the world. At its worst, it gives us mindless savagery, the kind of collective violence exemplified and explored in William Golding’s Lord of the Flies. On a day-to-day level, however, it results in something much more mundane and insidious: it endorses mediocrity and prevents us from thinking.

Photo by Steffen Junginger on Unsplash

This is an updated and adapted version of an article I wrote originally for Quillette magazine in 2016.

Shooting the Moon

During the period when I was writing my PhD, my main source of temptation and distraction was an electronic card game called Hearts. This was before the turn of the 21st century and while there were indeed some strange men in some of the science departments talking about a mysterious and abstract notion called “The Internet”, most of us had not discovered it yet. So, in 1998, I had neither cat videos nor social media to distract me, but I did have Hearts. Traditional card games such as Hearts and Solitaire (which I have always called Patience) were included along with the Microsoft software on my laptop, and it turned out to be a genuinely powerful temptation when the alternative was doing some work.

Hearts is a simple game for four players (or you plus three players driven by the computer). It is an evasion game: you must try to avoid collecting any cards in the suit of hearts, and above all avoid collecting the Queen of Spades, which carries a heavy penalty. Generally speaking, the more hearts you end up stuck with at the end of a round, the worse your score, and if you end up with the Queen of Spades you are in particular trouble. I discovered all of this gradually: the motto in my family has always been, “as a last resort, read the instructions”, so in the style to which I had become accustomed, I plunged into the game and learnt the rules through trial and error.

One day, I was having such a bad round that it became clear that I was going to lose every single hand. Amused, I continued on my losing streak, keen in fact to make sure that I did indeed lose every single hand, purely for entertainment. (Please remember – the alternative was neoplatonic metaphysics). It was through this throwing in of the towel that I discovered the phenomenon of “shooting the moon” – it turns out that in Hearts, if you lose every single hand and thus collect every single card in the suit of hearts and you collect the Queen of Spades, you actually win that round. It’s a slam-dunk, all-in move, like placing all your chips on one roll of the dice. I never managed to replicate the feat, so that one accidental occasion remains my only win by shooting the moon.
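For the curious, the conventional scoring rule behind all this can be sketched in a few lines of Python. This is only an illustration of the standard convention (one penalty point per heart, thirteen for the Queen of Spades, and the full twenty-six dumped on your opponents if you shoot the moon) – the function name and card representation are my own invention, not anything from the Microsoft version:

```python
def score_round(cards_taken):
    """Score one round of Hearts.

    cards_taken maps a player's name to the list of cards they took,
    where each card is a (rank, suit) tuple, e.g. ("Q", "spades").

    Conventional scoring: each heart costs 1 penalty point and the
    Queen of Spades costs 13, so the worst possible haul is 26.
    A player who collects *all* 26 penalty points "shoots the moon":
    they score 0 and every opponent is lumbered with 26 instead.
    """
    penalties = {}
    for player, cards in cards_taken.items():
        hearts = sum(1 for _, suit in cards if suit == "hearts")
        queen = 13 if ("Q", "spades") in cards else 0
        penalties[player] = hearts + queen

    # Shooting the moon: one player took every penalty card in the deck.
    for player, points in penalties.items():
        if points == 26:
            return {p: (0 if p == player else 26) for p in penalties}
    return penalties
```

Losing every hand flips the scores: the player stuck with all thirteen hearts and the Queen walks away with nothing against them, and everybody else pays the full penalty.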

In the last couple of years, I have become aware of an increasing number of people who are keen for their children to “complete the syllabus early”. Some parents have expressed their wish that the entire specification be covered by the end of Year 10 (good luck with that!) and others are adamant that they want the most complex concepts taught early or taught from the beginning. I have no idea where this notion has come from, but it wouldn’t surprise me if it found its origins on some online parent forum somewhere. Some high-achieving schools used to push this kind of rhetoric but with the shift in 2018 to specifications which are far more content-heavy, most schools find themselves struggling to complete the entire syllabus on time in some subjects, never mind early. The desire to push ahead also fails to take into account the rapid development that children are undergoing in their mid-teens. What a child is capable of towards the end of Year 11 may be poles apart from what they were capable of at the start of Year 10. On the other hand, it may not. It’s impossible to predict and – lest we forget – children are not machines.

One or two parents I have spoken to are so utterly wedded to the idea that the syllabus must be completed months ahead of the exam that they simply cannot be persuaded otherwise. Sometimes they claim that their child is vastly ahead in another subject – often mathematics – and express frustration that this is not the case in all. In the past, I might have accepted their take that their child was indeed in this position and argued that languages are different. Now that I am married to a man with a mathematics degree – one who feels, on reflection, that he did not have the intellectual maturity to cope with the more nebulous fields of study he was exposed to during his degree – I am given pause. Is there honestly any subject in which a child or a young adult, however intelligent, can advance so rapidly without paying a price further down the line? Do they really understand what they are doing, or will it all come crashing down like the proverbial house of cards when they get a little further down the road? My feeling is that unless your child is some kind of savant (and to date I have never met one of those, so I’m telling you your child isn’t one of them) then you’re taking quite a risk with this approach.

Many parents who want their children to do well are concerned about the trickiest concepts in the syllabus. Sometimes they have feedback from their child’s schoolteacher that they have struggled with one or more of these more complex concepts. What some people find difficult to accept is that much of the time, it is not the tricky concept that is the problem – the problem lies deeper, in the foundational studies that their child may have been whisked through at high speed and left with tiny, often imperceptible gaps in their knowledge. Like the invisible holes in the enamel of a tooth, these gaps store up trouble for the future and before you know it you’ve got a gaping cavity in front of you. It is the rarest of occasions when this is not the case and indeed it is often the children who have historically done well in a subject who are most at risk. The better a child appears to be doing in a subject, the harder and faster they are pushed and the greater the number of tiny, undetectable cracks are formed which will make their presence known in the future. It’s the nature of the beast and nobody’s fault, but parents do need to trust a tutor who tells them that it’s time to go back to basics.

The overwhelming joy of what I do now is having the one-to-one time in which to genuinely test and shore up a child’s fundamental understanding. Asking them the same question in multiple different ways to ensure that they possess a genuine grasp of the topic, not a superficial ability to provide a text-book answer to an anticipated question worded in a style that they recognise. Asking them to define a grammatical term and give an example. Most of all, asking them to explain why a phrase or a sentence translates the way it does – does their translation stem from the ability to skate on thin ice or from a genuine grasp of the underlying principles?

You see, shooting the moon is exciting. But risking it all on one turn of pitch and toss is – as any recovered gambler will tell you – a seriously bad idea. Success comes from baby steps, strong foundations and a genuine grasp of how things are put together. Success in study is a marathon, not a sprint, and if a marathon runner started the race with the speed of a 100-metre sprinter, they would never make it to the end, never mind win. Early and fast does not mean better – quite the opposite. It can mean failure. So be patient and trust in the process. Shooting the moon is both elusive and risky and there are infinitely safer ways to win a round of cards.

Photo by Sam Tan on Unsplash

See you in three weeks

This week, at a garden party, I chatted to a man in his 80s – my neighbour’s father – who reminisced about a school trip he went on in the 1950s. He was given the opportunity to visit Dubrovnik in what was then Yugoslavia, when another local school had a few spare places for students to join the trip. Nothing seemed unusual or particularly surprising about his story until he reported their arrival in the city.

“So, the teacher pointed out some features at the train station and said that we should meet at the same spot for our return journey,” he explained. “After that, the teacher said, we’ll see you in three weeks.”

There was a pause, while my husband and I stared at Geoff in silence.

“I’ve got no idea what he and the other staff did from that point on,” he continued, “but we didn’t see them until it was time to go home.”

I then had to check in with him that I had heard him correctly.

“Wait … they just left you there to get on with it? For three whole weeks?”

“Absolutely,” he said.

Well. Needless to say, my flabber was gasted. Geoff went on to talk about his memories of the trip, which boiled down to basic survival. He and his friends bought some eggs from a local farm and discovered that every single one of them was bad. He expressed regret that some diaries he had kept at the time had gone missing during a recent move. Let’s hope they turn up at the bottom of an unpacked box somewhere, as they will surely make for fascinating reading when lined up next to the experiences of children today in a school trip setting.

Anyone who knows anything about school trips in a modern setting will be equally struck by the difference between Geoff’s experience and those of students now. I have written before on the pressures of running school trips, most especially school trips abroad, and indeed that piece of writing remains my most-read blog by a considerable margin: it’s been read tens of thousands of times and clearly resonates with teachers who are still faced with the challenge of working in loco parentis. In summary, the original post was an exploration of a case where teachers on a school trip abroad were unjustly charged with “manslaughter as a result of gross negligence” in a French court, seven years after a child had died in an accident on a trip while in their care. Fortunately, the judge threw out the case, but the distress and suffering undergone by those three young professionals can only be imagined. The post also explained how I made the decision several years ago to stop running school trips abroad, purely because I could no longer cope with the stress and anxiety of doing so.

While I would never suggest that Geoff’s experience is one we should try to replicate in the modern world, as it displays a level of naivety and foolishness on the part of staff back in those days that I can only wonder at, it has got me thinking again about what was expected from teachers in the past compared to what is expected from us now. It has also caused me to think deeply about the vast chasm of difference between the day-to-day experiences that were once readily available to young people compared to what we assume is appropriate for them now.

One of the things that Jonathan Haidt explores in his recent book The Anxious Generation is the degree to which children now experience near-permanent adult supervision (to the extent that one might call it surveillance) and thus increasingly less real-world freedom and independence as they grow up; he contrasts this with the complete lack of supervision which most youngsters have when it comes to the online world, which is where – he argues – the worst dangers actually lie. He calls the effect on Generation Z – the generation who grew up with smartphones in their pockets – “the great rewiring” and urges society to roll back on the online freedoms we have grown used to and to replace them with more real-world freedom and risk. Haidt is a Professor at New York University and collaborates often with the American psychologist Jean Twenge, who was one of the first psychologists to argue that the rising rates of poor mental health among Generation Z can be attributed to smartphones. Sceptics of such research argue that young people simply have more things to feel anxious and depressed about, but in my opinion Haidt makes his case pretty persuasively. Earlier generations have also grown up in the shadow of war and global instability, he points out, yet such collective crises in the past did not manifest themselves in psychological distress; quite the opposite, they often engendered a sense of greater social solidarity and purpose, a net positive for mental health. By contrast, the evidence linking mental illness to smartphones and the inescapable and thus addictive access they bring to social media use is genuinely alarming.

Haidt’s argument builds upon a case he has made in his previous book, The Coddling of the American Mind, that overprotectiveness has contributed to the mental health crisis. He argues that children are naturally “antifragile” – they grow stronger through challenge – and that Generation Z children have been denied exposure to the varied experiences that are required in order to develop resilience. Haidt argues in both books that children ought to be given greater freedom to play unsupervised, free from adult surveillance.

In my last blog post I mentioned that in my last few years at the chalkface it was quite normal to walk down the school corridor and find a child outside every classroom – not necessarily because they had been thrown out of class, but because they were refusing to enter it in the first place due to the extreme level of their anxiety. I have no concrete answer as to why this is happening, but happening it is. There is an emerging school of thought that the well-meaning work that has increasingly been done in schools to address the issue of children’s mental health has in fact done more harm than good. I have recently read Bad Therapy by Abigail Shrier and her research most definitely gives cause for concern. Shrier is a journalist and a controversial figure for some, but her concerns echo those raised by numerous psychologists, who talk of our modern tendency to pathologise normal feelings (who didn’t feel genuinely overwhelmed with fear and at times bone-crushingly miserable during their teenage years?) and push children down a path of sickness rather than allowing them to negotiate their way through their feelings and trust that the storms will pass. These concerns are summarised quite nicely in a 2022 piece in the Telegraph.

So, Geoff’s brief and cheerful reminiscence has left me with much to think about. While none of us would dare to send our children out to a foreign country to fend for themselves for three weeks, perhaps we can learn from what was presumably the innocence of our forefathers. There was an enormous plus side to growing up and living without fear; if that kind of life and freedom produced the vibrant man that Geoff remains, then perhaps they weren’t getting it so wrong in the 1950s.

Photo by Dino Reichmuth on Unsplash

Endtimes

The toe-curling indignity of Joe Biden’s current situation is a lesson to us all. A lesson in what happens when a system favours old guys and then wonders why those old guys won’t move over when it’s time. A system that appears not to have considered what might happen if it’s desperately obvious that one of those old guys should take a back seat, but the dude wants to stay behind the wheel. A system so unwieldy and expensive that the only people who can afford to play the game are – as a general rule – those same old white guys, the ones who don’t want to take their hands off the wheel.

How does anybody know when it’s time to stop? Biden’s painful crumbling in front of the world has reminded me how as a youngster I promised myself fiercely that I would know when my time was done. To me, this does not just apply to when it’s time to retire, but throughout your career when you’re done with a particular role. Whatever I took on in education, I gave it my best shot and then handed it over. I made whatever changes I felt were needed, led people and adjusted systems to what I felt worked best, but always handed over the role when I had run out of ideas. Every. Single. Time. Quite literally my worst nightmare was the idea that people were saying behind my back “why doesn’t she just go?” The thought genuinely filled me with dread. Happily, due to my overwhelming desire to avoid this situation, I’m pretty sure it’s never happened.

On this side of the Atlantic, whatever your politics, I think it’s fair to say that our outgoing government was running out of ideas. Our system is based on a pattern of rotation, ensuring that nobody gets too stale in their role: when a cabinet and the government in general is fresh out of new proposals, we vote them out. The whole process runs on a cycle and – broadly speaking – it works for the best. Only the most partisan (and those who haven’t lived very long) really believe that seismic change will come with a change of government, but everyone can get behind the idea that a fresh line of buttocks on seats in the cabinet office can only be a good thing. Time for something different, for those who are not worn down by cynicism to give it a go. Nothing could be more true this time, when it’s fair to say that the outgoing government has had some issues.

Although not a great follower of any kind of sport, I did smile to myself this time last year when the 20-year-old Carlos Alcaraz smashed Novak Djokovic’s bid for his 8th Wimbledon title. You see, however outstanding you are in your field, there will always be the next youngster snapping at your heels. That’s just as it should be. Personally, I find it inspiring and comforting that there is always somebody coming up through the ranks who is likely to do your job better than you did. I do not find this a threat. I am at peace with the contribution that I made at the chalkface and continue to make as a tutor in extremely high demand – experience counts. But I am genuinely delighted to have met the next person who will be doing my old job in the comprehensive school I left two years ago and to find that she is enthusiastic, passionate and bursting with ideas. Nothing would give me more joy than to see the role flourish and grow. It is not my possession, it is my legacy – and a legacy only works when there are new people keen to do something even better than you did.

Will Biden finally realise that it’s time to step back and spend more time on his sofa – one that isn’t in the Oval Office? One can only hope that he is surrounded by advisors with courage, not the usual troupe of sycophants that great world leaders tend to find themselves hemmed in by. Will he listen? The message seems to be that it’s unlikely. The strongest and best leaders I have ever known are the ones who listen to the things they do not want to hear. As someone who is quite good at opening their mouth when others tend to keep theirs closed, I have often found myself to be the reluctant Cassandra in the room. In my experience, the best leaders will listen, nod and thank you for having the gumption to challenge them. The worst will destroy you for speaking the truth. Quite how and why the Democrats have ended up in this position is for those who understand US politics in depth to explain, but I suspect that it’s inertia that has brought them here. Nothing is worse than doing things as they’ve always been done for no other reason than the fact that they’ve always been done that way. Presidents always run for a second term, even if they’re in their 80s and showing clear signs of deterioration despite the best healthcare that their capacious wallet can buy.

Photo by Wonderlane on Unsplash

eligo, eligere, elegi, electus

Given the undeniable unfairness baked into Roman society, it might be a surprise to some that the Romans embraced a democracy of sorts. Only a small fraction of people living under Roman control could actually vote, but male citizens during the period when Rome was a Republic did have the opportunity to cast their vote for various administrative positions in government. The Latin verb “to choose”, which forms the title of this blog post, is what produced the participle electus and gives us the modern word election.

In the 6th century BCE, with the overthrow of the Roman monarchy, the city-state of Rome was re-founded as a Republic and by the 3rd century BCE it had risen to become the dominant civilisation in the Mediterranean world. The ruling body known as the Senate was made up of the wealthiest and most powerful patricians, men of aristocratic descent. These men oversaw both the military campaigns that brought expansion and wealth to Rome and the political structures that managed its society. At the beginning of the Republic, only the Consuls were elected, but in later years Roman free-born male citizens could vote for officials in around 40 public offices which formed a complex hierarchical structure of power. Yet this public performance of voting did not really offer the citizens any kind of real choice. If you’re feeling depressed about the choices offered to you in your polling booth today, take heart: things were considerably worse two thousand years ago (even if you were a man).

Candidates for office under the Roman Republic were originally selected by the Senate and were voted for by various Assemblies of male citizens. These Assemblies were stratified by social class and the weighting was heavily skewed in favour of the aristocracy. In the early years of the Republic, candidates were banned from speaking or even appearing in public. The Senate argued that candidates should be voted for on the merit of their policies, rather than through rhetoric and personality; in truth it meant the general public had no real opportunity to hear candidates’ arguments or indeed to hold them to account. In the later Republic the ban on public oracy was lifted and the empty promises so familiar to us today abounded, alongside some good old-fashioned bribery which – while theoretically illegal – was widespread. As the practice of electoral campaigning developed things did begin to change, with the pool of candidates no longer tightly limited to a select group of aristocrats under Senatorial control. In the long term, however, this led to even greater misery for the citizens. They lost what little democracy they had during the Roman revolution, when what should have been a righteous and deserved uprising against the ruling oligarchy ended up turning into something arguably worse. Rome’s first ruling emperor, Augustus Caesar, claimed that voting was corrupt and had been rigged by the Senate for years in order to perpetuate the power of a handful of aristocratic families. His neat solution was to abolish voting altogether. Be careful what you wish for?

Once the early ban on public oracy was lifted, a key component of public campaigning during the Republic was canvassing for votes in the Forum. A candidate would walk to this location surrounded by an entourage of supporters, many of whom were paid, in order to meet another pre-prepared gathering of allies in the central marketplace. Being seen surrounded by a gaggle of admirers was hugely important for a candidate’s public image and was worth paying for. Once in the Forum, the candidate would shake hands with eligible voters aided by his nomenclator, a slave whose job it was to memorise the names of all the voters, so that his candidate could greet them all in person. The man running for office stood out in the crowd by wearing a chalk-whitened toga called the toga candida: it is from this that we get the modern word candidate.

To further attract voters among the ordinary people, candidates gave away free tickets to the gladiatorial games. To pay for such a display a candidate either had to be extremely wealthy, or to secure the sponsorship of wealthy friends. Cases are documented of men ending up in ruinous debt as a result of their electoral campaigning. Several laws were passed attempting to limit candidates’ spending on banquets and games, which evidences the fact that the Senate didn’t like electoral corruption except when they were in charge of it.

Democracy under the Roman Republic was very much controlled by the select few male members of the aristocracy who held seats in the Senate. They essentially held all of the power, having been born into wealthy patriarchal families. The majority of people who inhabited the Roman world were not allowed to vote, including women and slaves. It is striking, not to say infuriating, how many modern sources on Roman voting talk about “citizens” and “people” without seeming to feel any need to clarify that they are talking about male citizens and male people only. We do have evidence that women in the wealthiest families put their money and their energy behind their preferred male candidates, most usually because they were members of the same family. Electioneering in the form of visible graffiti in Pompeii evidences women’s support of their husbands, fathers and brothers but this is all produced by women of considerable means; what the poorest women in society thought and felt about the men who controlled their lives is anybody’s guess.

Cicero denounces Catiline in the Senate by Cesare Maccari (1840-1919 CE).
Palazzo Madama, Rome

Is it original?

One of my most recent fiction reads is Yellowface by R. F. Kuang. I was absolutely blown away by this fierce and darkly hilarious examination of the publishing industry and its acolytes.

It is not giving anything away to explain the basic premise, for that is played out right at the start of the novel: contemporary young American authors June Hayward and Athena Liu are both supposed to be rising stars. But while the fabulous Asian-American Athena finds instant fame and recognition, June is a literary nobody and her first novel is a resounding flop. When June happens to be present at Athena’s death as a result of a freak accident, she acts on impulse and steals Athena’s latest novel, an experimental masterpiece exploring the unacknowledged contributions of Chinese workers to World War I. June decides to edit Athena’s novel and “make it her own”, immersing herself so deeply in the process of refining its prose that on some level she becomes convinced that the novel is indeed her own. She sends it to her agent as her own work and – at the eager publisher’s suggestion – rebrands herself with the culturally ambiguous author name of Juniper Song. The rest of the novel charts her rise and fall.

Yellowface explores the ethics of plagiarism and forces us to confront the question of originality: if an original work is heavily edited, does it remain the authentic work of the primary author, or can it be considered a collaboration? June/Juniper certainly convinces herself that it can. The novel also explores issues of friendship, race and diversity, painting the protagonist as a jealous and overlooked author with nothing fashionable to say, frustrated by the lack of interest in her “white stories” and then thwarted by an audience that questions her right to explore a history outside of her own cultural milieu. Hilariously, June/Juniper becomes aggressively and eloquently defensive of her right to such authorship, to be a white author writing about a forgotten part of Chinese history, at times seeming to forget completely that she did not – in fact – author the novel in the first place. At other times she is quite literally haunted by Athena and the truth of what she has done. There are heated debates played out in real time at book fairs and accounts of reviews on Goodreads, many of which had me laughing out loud at their accuracy. Yellowface is simply brilliant and one of the many reasons I know it’s brilliant is that it has seriously upset a lot of the chattering reviewers on Goodreads: nobody likes how it feels when a mirror is held up in front of them.

Like any good novel, Yellowface has stayed in my mind and got me thinking about some of the issues it explores. I have written before about the dangers that teachers and private tutors face when seeking to monetise their resources (as we are all encouraged to do), due to what I believe is their naivety when it comes to what truly constitutes original work. I am grateful for my background in academia here, a period during which an extreme fastidiousness about the risk of plagiarism was drummed into me. There have been numerous cases of teachers monetising resources that have turned out to be based on the work of others and – quite unbelievably – this is supported and facilitated by the Times Educational Supplement, which allows people to upload and sell resources on its own website without a single check as to their originality. Only this week I saw someone online who was able to prove categorically that monetised resources available on the site were cut and pasted from his own work.

Such flagrant stealing aside, I honestly believe that a great deal of plagiarism occurs through nescience rather than through deliberate action. The way that teachers traditionally work means that it can be genuinely difficult to remember where your work ends and that of another begins. Teachers are the curators of an ever-evolving bank of resources that many others will have influenced in different ways over the years. Thanks to an academic background and some experience in publishing, I am acutely aware of the fact that pretty much everything I produce as a working resource for my students started its life somewhere else – as a passage in an old text book, from a bank of files kindly shared by a colleague, on a dim and distant exam paper from days gone by. Virtually nothing that I produce, therefore, can be claimed as fully original and monetised. This is true of most teachers, but I’m not sure how many of them fully understand the implications when it comes to publishing their work.

Every time I read or hear the exhortation from the ever-growing chorus of business coaches that tutors should be monetising their resources to create a passive income, my blood runs cold for those who heed this advice. How sure can such tutors be that their work is 100% and exclusively their own? If they’re sure of it, then they’ve been working in a vacuum, which seems a pretty strange way to go about things: reworking other people’s ideas is how we teachers get by in the job and doing so for our own use is absolutely allowed. But packaging these things up and selling them on as if they are entirely our own work is not. We live in an age where “publishing” is something that everyone can do – I have “published” this blog post myself – no editor, no publisher, no agent. The ease with which it is possible to release our work into the world can cause those inexperienced in the realities of professional publishing to think that they can do whatever they like, without consequence. I genuinely worry for them. If you’re still not convinced that there is anything for us to be concerned about, then take a look at what happened on The Classics Library website, where resources being shared entirely for free fell foul of copyright law and had to be taken down when the site was challenged by Cambridge University Press. Published resources using the ideas, the stories, the images or even just the names of the characters contained in the Cambridge Latin Course were deemed an infringement and the CUP demanded that they be taken down. In summary, any resource that uses even just an abstract concept created by others is breaking copyright law: if you publish an entirely “original” Latin story but that story contains the characters of Caecilius, Metella and Quintus, you’re potentially in trouble. These characters and their images are the intellectual property of the CUP.

Originality was not valued in the ancient world in anything like the way it is now. The modern world is obsessed with originality and authenticity, a tendency which has spilled over into society’s prioritisation of the individual over the community. The ancient Greeks had no interest in original stories; rather, they liked to hear traditional or familiar stories told well. The Greek concept of story-writing arose out of the oral tradition, where stories were shared by word of mouth and were told and re-told a thousand times. Each teller would embellish the story and “make it their own” but none would claim (or indeed even wish to claim) that the story was original to them. For this and other reasons it is sometimes impossible to discern who was the original author of ideas in the ancient world and Homer, the oldest story-teller whose works we have in our possession, is considered by many to be an amalgamation of multiple authors over time, rather than one individual.

The Romans took the art of mimicry to a whole new level and due to the rapid and spectacular expansion of their empire had the opportunity to steal ideas from across much of the globe. They relished doing so. Their own art and literature were a kaleidoscope of colour from the regions they dominated and they certainly didn’t fret about cultural appropriation; quite frankly, they’d have been left with precious little culture without it. Furthermore, the Romans did not have the artistic prissiness we now harbour about owning the “original work”. Copies of Greek originals abounded and to be in possession of a good copy was considered not only acceptable but desirable. And it’s just as well. A multitude of Greek bronze originals are only known to us as a direct result of their Roman marble copy. (Bronzes don’t tend to survive – they get melted down and turned into more useful stuff!)

To return to the novel, I would highly recommend it. Few novels I have read this year have stayed with me as much as this one has and I loved its acerbic swipe at an industry and indeed an audience which can be cruel, unforgiving and hypocritical. I wonder how the agent felt about this when they first picked up the manuscript. Now that I would like to have seen.

Photo by Elisa Photography on Unsplash

Thank you, Doctor

To date, no celebrity’s death has affected me on any level beyond “oh, that’s a shame”. Throughout my life, I have watched with curiosity and at times bewilderment while others claim to be “deeply affected” by the passing of someone they have never met; if I’m honest, I thought I was largely immune to the phenomenon. But during the last week I found myself checking and re-checking online, simply frantic to hear news of Michael Mosley, who went missing on the Greek island of Symi last Wednesday. As the days passed and the chance that there would be reports of him found safe and well became more and more unlikely, it was nevertheless still so distressing to finally read the confirmation that his body had been found. My heart goes out to his wife and his four children.

Dr. Michael Mosley was a scientist with an innate likeability that seems to have endeared him to everyone he encountered. His warm, empathetic style gave him an instant rapport with his audience and his passion for his subject was palpable. Mosley made it his mission to make the science of good health and longevity comprehensible to all and he practised what I would describe as comprehensibility without compromise: he never dumbed things down, he simply made them intelligible to the layperson of average intelligence. I have seen some of his TV work but for me, it was his BBC podcast called Just One Thing that made him feel like a part of my life. There is something about the way we listen to podcasts, having someone’s voice deep inside our ears while we go about our daily business of taking a walk or doing the shopping, that makes for a kind of intimacy never achieved through the television. Nodding along to Mosley’s warm-hearted, practical advice had become an important staple for me, so his sudden and untimely passing feels like a genuine loss, for which my life will be the lesser.

Mosley’s own health journey was, we are told, inspired in part by watching his father deteriorate in old age. Mosley’s father died aged 74 and, according to Mosley, was very inactive in his final years. Both Mosley and his father developed Type 2 diabetes in later life but while his father’s health deteriorated and was exacerbated by inactivity, Mosley himself managed to put his condition into long-term remission through diet and exercise, a phenomenon that is well-recognised by medics as possible for many patients. Mosley is perhaps most famous for his advice on diet, but it is not this side of his work that held interest for me. Due to genetic good fortune, I have never struggled with my weight. Furthermore, Mosley’s research took him down the route of recommending diets that include bouts of fasting and no scientist on earth could convince me to give that a go, however much I respected their advice. Fasting is emphatically not for me: it makes me feel truly awful. The last time I tried it was when instructed to fast prior to a blood test. Already feeling ghastly as a result, I was then kept waiting for some considerable time at the surgery. By the time I did get to actually see the doctor I was the colour of parchment, shaking uncontrollably, covered in a film of cold sweat and dry-retching into a tissue. The somewhat bemused doctor then of course proceeded to quiz me on my family history of Type 1 diabetes. There isn’t one! This is simply the way that fasting makes me feel and it always has done. I have absolutely no intention of trying it as a lifestyle choice. Sorry, Dr. Mosley.

Yet Mosley’s recommendations went way beyond diet and it was his advice on exercise that had me hooked. He more than anyone first convinced me to try weight and resistance training in later life, a journey which I embarked upon around 6 months ago and first wrote about here. Something about Mosley’s no-nonsense approach combined with the fact that he was not your typical lycra-wearing gym fanatic convinced me to do some further research and reading which – of course, although somewhat to my irritation – proved that he was 100% right about the importance of such work. I finally started down that pathway in November, have never wavered from it and now see resistance training as a permanent, non-negotiable part of life. Mosley was open about the fact that he loathed environments such as the gym and could never see himself going to one, yet he talked enthusiastically about doing push-ups, planks and squats in his 60s, about the enormous importance of developing muscle strength and bone density to mitigate the ageing process and to promote independence in later life. He talked and I listened.

It says a great deal about the society in which we live that much was made by some of the fact that Mosley left his mobile phone back at the place where he and his wife were staying before embarking on his ill-fated walk. Yet those of us who have listened to him over the years know that he also advocated for doing exactly this: for leaving your digital attachments to the world behind and striding off alone, to listen to the birds, the waves, the crickets, whatever nature may provide as the soundtrack to your adventure. Mosley’s wife confirmed in her response to his passing that his fierce independence and sense of adventure were part of what defined him and it speaks volumes about how ridiculously addicted so many people are to their hand-held communication devices that they are puzzled by the very idea that a man could leave his smartphone behind to go striding off into the hills.

I for one shall remember this vibrant yet gentle man with great affection and will continue to take his advice throughout what remains of my life. I am monumentally grateful for the contribution that he has made to our world and to my own health in particular. Whether we make it to a ripe old age or leave this world far too soon like Mosley himself, few of us will make such an impact and be remembered as such a compassionate, unassuming force for good. I shall miss his wisdom greatly.

Image source: BBC