eligo, eligere, elegi, electus

Given the undeniable unfairness baked into Roman society, it might be a surprise to some that the Romans embraced a democracy of sorts. Only a small fraction of people living under Roman control could actually vote, but male citizens during the period when Rome was a Republic did have the opportunity to cast their vote for various administrative positions in government. The Latin verb “to choose”, which forms the title of this blog post, is what produced the participle electus and gives us the modern word election.

In the 6th century BCE, with the overthrow of the Roman monarchy, the city-state of Rome was re-founded as a Republic, and by the 3rd century BCE it had risen to become the dominant civilisation in the Mediterranean world. The ruling body known as the Senate was made up of the wealthiest and most powerful patricians, men of aristocratic descent. These men oversaw both the military campaigns that brought expansion and wealth to Rome and the political structures that managed its society. At the beginning of the Republic, only the Consuls were elected, but in later years Roman free-born male citizens could vote for officials in around 40 public offices, which formed a complex hierarchical structure of power. Yet this public performance of voting did not offer the citizens any real choice. If you’re feeling depressed about the choices offered to you in your polling booth today, take heart: things were considerably worse two thousand years ago (even if you were a man).

Candidates for office under the Roman Republic were originally selected by the Senate and were voted for by various Assemblies of male citizens. These Assemblies were stratified by social class and the weighting was heavily skewed in favour of the aristocracy. In the early years of the Republic, candidates were banned from speaking or even appearing in public. The Senate argued that candidates should be voted for on the merit of their policies, rather than through rhetoric and personality; in truth it meant the general public had no real opportunity to hear candidates’ arguments or indeed to hold them to account. In the later Republic the ban on public oratory was lifted and the empty promises so familiar to us today abounded, alongside some good old-fashioned bribery which – while theoretically illegal – was widespread. As the practice of electoral campaigning developed, things did begin to change, with the pool of candidates no longer tightly limited to a select group of aristocrats under Senatorial control. In the long term, however, this led to even greater misery for the citizens. They lost what little democracy they had during the Roman revolution, when what should have been a righteous and deserved uprising against the ruling oligarchy ended up turning into something arguably worse. Rome’s first ruling emperor, Augustus Caesar, claimed that voting was corrupt and had been rigged by the Senate for years in order to perpetuate the power of a handful of aristocratic families. His neat solution was to abolish voting altogether. Be careful what you wish for?

Once the early ban on public oratory was lifted, a key component of public campaigning during the Republic was canvassing for votes in the Forum. A candidate would walk to this location surrounded by an entourage of supporters, many of whom were paid, in order to meet another pre-prepared gathering of allies in the central marketplace. Being seen surrounded by a gaggle of admirers was hugely important for a candidate’s public image and was worth paying for. Once in the Forum, the candidate would shake hands with eligible voters aided by his nomenclator, a slave whose job it was to memorise the names of all the voters, so that his candidate could greet them all in person. The man running for office stood out in the crowd by wearing a chalk-whitened toga called the toga candida: it is from this that we get the modern word candidate.

To further attract voters among the ordinary people, candidates gave away free tickets to the gladiatorial games. To pay for such a display a candidate either had to be extremely wealthy, or to secure the sponsorship of wealthy friends. Cases are documented of men ending up in ruinous debt as a result of their electoral campaigning. Several laws were passed attempting to limit candidates’ spending on banquets and games, which evidences the fact that the Senate didn’t like electoral corruption except when they were in charge of it.

Democracy under the Roman Republic was very much controlled by the select few male members of the aristocracy who held seats in the Senate. They essentially held all of the power, having been born into wealthy patriarchal families. The majority of people who inhabited the Roman world were not allowed to vote, including women and slaves. It is striking, not to say infuriating, how many modern sources on Roman voting talk about “citizens” and “people” without seeming to feel any need to clarify that they are talking about male citizens and male people only. We do have evidence that women in the wealthiest families put their money and their energy behind their preferred male candidates, usually because they were members of the same family. Electioneering in the form of visible graffiti in Pompeii evidences women’s support of their husbands, fathers and brothers, but this is all produced by women of considerable means; what the poorest women in society thought and felt about the men who controlled their lives is anybody’s guess.

Cicero Denounces Catiline in the Senate by Cesare Maccari (1840-1919).
Palazzo Madama, Rome

On bugbears and juxtaposition

An old Head of Department from many years ago used to start his Year 7 German course in the same way every year. Every year he would ask students to name any famous Germans they could think of. Every year he hoped to hear names like Michael Schumacher or Boris Becker, or perhaps one of the countless famous German composers from over the centuries. Every year he was given Hitler. It never seemed to occur to this lovely man that perhaps there was a better way of starting off his first German lesson. Something made him do the same thing over and again and I think a bit of him somehow relished the inevitable disappointment. We all have our crosses to bear in our chosen subjects.

For anyone who teaches or touches upon Roman culture, it’s waiting for the inevitable moment when a child will inform us that the Romans used to eat so much at their dinner parties that they would go and make themselves sick so that they could eat more. I’ve even overheard the guides at Pompeii helping to perpetuate this myth by mischievously telling tourists that any random passageway that they can’t account for is a “vomitorium”, where guests would relieve themselves to create space for more gluttony. They know that this is nonsense. The confusion seems to have come from the word vomitorium itself (which actually was used by the Romans to refer to any passageway leading crowds out of a public building) combined with satirical pieces such as Trimalchio’s Feast, sometimes called The Millionaire’s Dinner Party, which describes the imagined excesses of dinner parties held by the nouveaux riches. We also have the disapproving remarks of authors such as Seneca, who wrote of slaves cleaning up the vomit of drunks at banquets and criticised what he saw as the excesses of Rome. It’s a depressingly familiar picture for anyone who has worked in a hotel or similar establishment in modern Britain; wealthy Romans were no more or less gluttonous than the comfortably-off in any society, especially those societies which have alcohol at the heart of their culture.

Eye-roll-inducing as this is, my personal bugbear is a different piece of misinformation, one I simply cannot wait to hear. I tell myself I have to go there to prevent students from getting it wrong in their exams, but in truth there’s a bit of me that cannot resist it for my own torture. When working on the literature, I always ask every GCSE candidate what they think the term juxtaposition means. Almost without exception, students will tell me that the word means “contrast”. On an exceptionally good day, they will tell me that it means “putting things next to each other in order to create a contrast”. In actual fact, it means “putting things next to each other”, and this may be done in order to highlight a contrast.

While I hate to be a massive Latin bore, I’m afraid this is yet another case where a simple knowledge of the Latin roots of words can help. To juxtapose has its origins in the Latin words iuxta (which means “next to”) and iungo (“to join”, also notable in derivatives such as join, conjunction, conjugation, conjugal) alongside the Latin word positus (“place” or “position”). It quite literally means “a placing next to”: there is no mention of the notion of contrast in the original etymological meaning of the word. The frequency with which the technique is used to highlight a contrast means that it is arguably justifiable to include this in the definition, but the etymological roots of the word really must be prioritised. Fundamentally, juxtaposition is placing a word or phrase next to another word or phrase, often but not exclusively to highlight a contrast.

Unfortunately, students (and teachers) Googling the word will find an avalanche of quotations using the word to mean simply and exclusively “contrast”. Just this morning I spotted a horrendous meme quoting American guitarist Dean Ween of all people: “the juxtaposition of fishing and touring couldn’t be greater”. Sigh.

Another part of the problem with this misunderstanding is that English really isn’t very good at doing juxtaposition. Our language requires too many supplementary words to make sense, plus we cannot muck about with word order in the way that Latin can without a serious change in meaning. Word order is sense-critical in the English language: “man bites dog” means the opposite of “dog bites man”. Latin, being an inflected language (i.e. one where the endings of the words dictate their meaning and role), has the advantage that an author can place words next to each other with ease – certainly to highlight a contrast or frankly to do whatever he wishes.

The good news is that once a student realises what juxtaposition means, it becomes much easier to spot in Latin. Once a student understands that it simply means placing words next to each other, they can assume that an author as adept as Virgil has always done so for a reason – it does not have to be limited to the concept of highlighting a contrast. An author may juxtapose a string of sounds, for example, or indeed words with a similar rather than a contrasting meaning. It’s entirely up to him.

Photo taken in Athens by Alexandra on Unsplash

Is it original?

One of my most recent fiction reads is Yellowface by RF Kuang. I was absolutely blown away by this fierce and darkly hilarious examination of the publishing industry and its acolytes.

It is not giving anything away to explain the basic premise, for that is played out right at the start of the novel: contemporary young American authors June Hayward and Athena Liu are both supposed to be rising stars. But while the fabulous Asian-American Athena finds instant fame and recognition, June is a literary nobody and her first novel is a resounding flop. When June happens to be present at Athena’s death as a result of a freak accident, she acts on impulse and steals Athena’s latest novel, an experimental masterpiece exploring the unacknowledged contributions of Chinese workers to World War I. June decides to edit Athena’s novel and “make it her own”, immersing herself so deeply in the process of refining its prose that on some level she becomes convinced that the novel actually is indeed her own. She sends it to her agent as her own work and – at the eager publisher’s suggestion – rebrands herself with the culturally ambiguous author name of Juniper Song. The rest of the novel charts her rise and fall.

Yellowface explores the ethics of plagiarism and forces us to confront the question of originality: if an original work is heavily edited, does it remain the authentic work of the primary author, or can it be considered a collaboration? June/Juniper certainly convinces herself that it can. The novel also explores issues of friendship, race and diversity, painting the protagonist as a jealous and overlooked author with nothing fashionable to say, frustrated by the lack of interest in her “white stories” and then thwarted by an audience that questions her right to explore a history outside of her own cultural milieu. Hilariously, June/Juniper becomes aggressively and eloquently defensive of her right to such authorship, to be a white author writing about a forgotten part of Chinese history, at times seeming to forget completely that she did not – in fact – author the novel in the first place. At other times she is quite literally haunted by Athena and the truth of what she has done. There are heated debates played out in real time at book fairs and accounts of reviews on Goodreads, many of which had me laughing out loud at their accuracy. Yellowface is simply brilliant and one of the many reasons I know it’s brilliant is that it has seriously upset a lot of the chattering reviewers on Goodreads: nobody likes how it feels when a mirror is held up in front of them.

Like any good novel, Yellowface has stayed in my mind and got me thinking about some of the issues it explores. I have written before about the dangers that teachers and private tutors face when seeking to monetise their resources (as we are all encouraged to do), due to what I believe is their naivety when it comes to what truly constitutes original work. I am grateful for my background in academia here, a period during which an extreme fastidiousness about the risk of plagiarism was drummed into me. There have been numerous cases of teachers monetising resources that have turned out to be based on the work of others and – quite unbelievably – this is supported and facilitated by the Times Educational Supplement, which allows people to upload and sell resources on its own website without a single check as to their originality. Only this week I saw someone online who was able to prove categorically that monetised resources available on the site were cut and pasted from his own work.

Such flagrant stealing aside, I honestly believe that a great deal of plagiarism occurs through nescience rather than through deliberate action. The way that teachers traditionally work means that it can be genuinely difficult to remember where your work ends and that of another begins. Teachers are the curators of an ever-evolving bank of resources that many others will have influenced in different ways over the years. Thanks to an academic background and some experience in publishing, I am acutely aware of the fact that pretty much everything I produce as a working resource for my students started its life somewhere else – as a passage in an old textbook, from a bank of files kindly shared by a colleague, on a dim and distant exam paper from days gone by. Virtually nothing that I produce, therefore, can be claimed as fully original and monetised. This is true of most teachers, but I’m not sure how many of them fully understand the implications when it comes to publishing their work.

Every time I read or hear the exhortation from the ever-growing chorus of business coaches that tutors should be monetising their resources to create a passive income, my blood runs cold for those who heed this advice. How sure can such tutors be that their work is 100% and exclusively their own? If they’re sure of it, then they’ve been working in a vacuum, which seems a pretty strange way to go about things: reworking other people’s ideas is how we teachers get by in the job and doing so for our own use is absolutely allowed. But packaging these things up and selling them on as if they are entirely our own work is not. We live in an age where “publishing” is something that everyone can do – I have “published” this blog post myself – no editor, no publisher, no agent. The ease with which it is possible to release our work into the world can cause those inexperienced in the realities of professional publishing to think that they can do whatever they like without consequence. I genuinely worry for them. If you’re still not convinced that there is anything for us to be concerned about, then take a look at what happened on The Classics Library website, where resources being shared entirely for free fell foul of copyright law and had to be taken down when the site was challenged by Cambridge University Press. Published resources using the ideas, the stories, the images or even just the names of the characters contained in the Cambridge Latin Course were deemed an infringement and the CUP demanded that they be taken down. In summary, any resource that uses even just a concept created by others risks breaking copyright law: if you publish an entirely “original” Latin story but that story contains the characters of Caecilius, Metella and Quintus, you’re potentially in trouble. These characters and their images are the intellectual property of the CUP.

Originality was not valued in the ancient world in anything like the way it is now. The modern world is obsessed with originality and authenticity, a tendency which has spilled over into society’s prioritisation of the individual over the community. The ancient Greeks had no interest in original stories; rather, they liked to hear traditional or familiar stories told well. The Greek concept of story-writing arose out of the oral tradition, where stories were shared by word of mouth and were told and re-told a thousand times. Each teller would embellish the story and “make it their own” but none would claim (or indeed even wish to claim) that the story was original to them. For this and other reasons it is sometimes impossible to discern who was the original author of ideas in the ancient world and Homer, the oldest story-teller whose works we have in our possession, is considered by many to be an amalgamation of multiple authors over time, rather than one individual.

The Romans took the art of mimicry to a whole new level and due to the rapid and spectacular expansion of their empire had the opportunity to steal ideas from across much of the globe. They relished doing so. Their own art and literature were a kaleidoscope of colour from the regions they dominated and they certainly didn’t fret about cultural appropriation; quite frankly, they’d have been left with precious little culture without it. Furthermore, the Romans did not have the artistic prissiness we now harbour about owning the “original work”. Copies of Greek originals abounded and to be in possession of a good copy was considered not only acceptable but desirable. And it’s just as well. A multitude of Greek bronze originals are only known to us as a direct result of their Roman marble copies. (Bronzes don’t tend to survive – they get melted down and turned into more useful stuff!)

To return to the novel, I would highly recommend it. Few novels I have read this year have stayed with me as much as this one has and I loved its acerbic swipe at an industry and indeed an audience which can be cruel, unforgiving and hypocritical. I wonder how the agent felt about this when they first picked up the manuscript. Now that I would like to have seen.

Photo by Elisa Photography on Unsplash

Thank you, Doctor

To date, no celebrity’s death has affected me on any level beyond “oh, that’s a shame”. Throughout my life, I have watched with curiosity and at times bewilderment while others claim to be “deeply affected” by the passing of someone they have never met; if I’m honest, I thought I was largely immune to the phenomenon. But during the last week I found myself checking and re-checking online, simply frantic to hear news of Michael Mosley, who went missing on the Greek island of Symi last Wednesday. As the days passed and the chance of him being found safe and well became smaller and smaller, it was nevertheless distressing to finally read the confirmation that his body had been found. My heart goes out to his wife and his four children.

Dr. Michael Mosley was a scientist with an innate likeability that seems to have endeared him to everyone he encountered. His warm, empathetic style gave him an instant rapport with his audience and his passion for his subject was palpable. Mosley made it his mission to make the science of good health and longevity comprehensible to all and he practised what I would describe as comprehensibility without compromise: he never dumbed things down, he simply made them intelligible to the layperson of average intelligence. I have seen some of his TV work but for me, it was his BBC podcast called Just One Thing that made him feel like a part of my life. There is something about the way we listen to podcasts, having someone’s voice deep inside our ears while we go about our daily business of taking a walk or doing the shopping, that makes for a kind of intimacy never achieved through the television. Nodding along to Mosley’s warm-hearted, practical advice had become an important staple for me, so his sudden and untimely passing feels like a genuine loss, and my life will be the lesser for it.

Mosley’s own health journey was, we are told, inspired in part by watching his father deteriorate in old age. Mosley’s father died aged 74 and, according to Mosley, was very inactive in his final years. Both Mosley and his father developed Type 2 diabetes in later life but while his father’s health declined, exacerbated by inactivity, Mosley himself managed to put his condition into long-term remission through diet and exercise, a phenomenon that is well-recognised by medics as possible for many patients. Mosley is perhaps most famous for his advice on diet, but it is not this side of his work that held interest for me. Due to genetic good fortune, I have never struggled with my weight. Furthermore, Mosley’s research took him down the route of recommending diets that include bouts of fasting and no scientist on earth could convince me to give that a go, however much I respected their advice. Fasting is emphatically not for me: it makes me feel truly awful. The last time I tried it was when instructed to fast prior to a blood test. Already feeling ghastly as a result, I was then kept waiting for some considerable time at the surgery. By the time I did get to actually see the Doctor I was the colour of parchment, shaking uncontrollably, covered in a film of cold sweat and dry-retching into a tissue. The somewhat bemused Doctor then of course proceeded to quiz me on my family history of Type 1 diabetes. There isn’t one! This is simply the way that fasting makes me feel and it always has done. I have absolutely no intention of trying it as a lifestyle choice. Sorry, Dr. Mosley.

Yet Mosley’s recommendations went way beyond diet and it was his advice on exercise that had me hooked. He more than anyone first convinced me to try weight and resistance training in later life, a journey which I embarked upon around 6 months ago and first wrote about here. Something about Mosley’s no-nonsense approach combined with the fact that he was not your typical lycra-wearing gym fanatic convinced me to do some further research and reading which – of course, although somewhat to my irritation – proved that he was 100% right about the importance of such work. I finally started down that pathway in November, have never wavered from it and now see resistance training as a permanent, non-negotiable part of life. Mosley was open about the fact that he loathed environments such as the gym and could never see himself going to one, yet he talked enthusiastically about doing push-ups, planks and squats in his 60s, about the enormous importance of developing muscle strength and bone density to mitigate the effects of the ageing process and to promote independence in later life. He talked and I listened.

It says a great deal about the society in which we live that some made much of the fact that Mosley left his mobile phone back at the place where he and his wife were staying before embarking on his ill-fated walk. Yet those of us who have listened to him over the years know that he also advocated for doing exactly this: for leaving your digital attachments to the world behind and striding off alone, to listen to the birds, the waves, the crickets, whatever nature may provide as the soundtrack to your adventure. Mosley’s wife confirmed in her response to his passing that his fierce independence and sense of adventure were part of what defined him, and it speaks volumes about how ridiculously addicted so many people are to their hand-held communication devices that they are puzzled by the very idea that a man could leave his smartphone behind to go striding off into the hills.

I for one shall remember this vibrant yet gentle man with great affection and will continue to take his advice throughout what remains of my life. I am monumentally grateful for the contribution that he has made to our world and to my own health in particular. Whether we make it to a ripe old age or leave this world far too soon like Mosley himself, few of us will make such an impact and be remembered as such a compassionate, unassuming force for good. I shall miss his wisdom greatly.

Image source: BBC

Subliminal messages

Well-behaved women seldom make history.

Laurel Thatcher Ulrich, Professor of Early American history, Harvard

Imagine a world in which children received only good, positive and appropriate messaging about girls and women. Would it be dramatically different from the world we inhabit right now? I like to think that the world within my lifetime has changed for women, but sometimes I wonder.

While I love and agree wholeheartedly with the quotation above, when it comes to Roman history it seems to me that two types of women are preserved by the male elite who wrote about them: the badly behaved indeed, but also the idealised male fantasy – the Roman Stepford wife, if you will. In a culture in which history-writing truly was the sole preserve of men, the only women considered worth talking about were those who stood out in extremis: either those who made a right royal nuisance of themselves, or those who fulfilled the Roman male fantasy of the ideal wife. This meant that examples of the “perfect” Roman woman were recorded (think Lucretia and Arria Paeta) as well as the ones whom history condemned as beyond the pale (think Messalina and Clodia). Hence we are left with a cartoonesque surreality, in which real women’s voices are almost entirely absent.

This week I’ve been thinking about the subliminal messages we send to women and girls. As a child of the late 1970s and the 1980s, I grew up alongside all sorts of messaging that would now be considered unacceptable and problematic for boys and girls alike. I remember cigars advertised on the television and cigarettes in magazines. I remember buying candy sticks that were sold in small packets designed to look exactly like cigarettes, which my friend and I would pretend to “smoke” on a wintry day, enjoying the way our clouds of breath resembled the unquestionably cool clouds exhaled by smokers. I remember Bernard Manning and Jim Davidson. I remember Benny Hill and his entourage of large-breasted, scantily-clad women. I remember the standard comic trope of the male boss chasing a younger female employee around the desk. She would try to avoid his wandering hands, but the messaging clearly implied that she invited it really and enjoyed the whole process immensely; it also implied that the man was utterly helpless and incapable of controlling himself in the face of his desires for the younger women paraded before him. Poor chap.

As a child, I liked playing with Sindy dolls and was distinctly less than impressed by Barbie, mainly because the design of her legs made it impossible to force her onto the back of a horse, which was my dominant obsession at the time. If a girl can’t ride a horse, what exactly is the point of her, I thought? This was particularly disappointing given that the Barbie horse, although a somewhat stylised palomino with a ridiculous flowing mane, had the benefit of articulated legs, which one could adjust into galloping and jumping positions. What a missed opportunity for Barbie to shine as an athlete! My mother supported my apparent lack of enthusiasm for Barbie’s appearance and carriage, helpfully pointing out her risibly manipulated figure as well as the fact that her arms were fixed at right angles. “Probably caused by years of carrying a tray,” she remarked.

Behind the bar in our local family pub there were large cardboard cutouts of topless women, attached to which were overpriced packets of peanuts, which punters were encouraged to purchase by the apparent lure of revealing a little bit more of the nubile lady’s naked form. In that very same pub, my sister was told by one punter that she should be careful to alternate the hand she used to pull the pint-pump, to make sure she didn’t end up with “one bigger than the other”. This was the world I grew up in. None of it was anything I registered as either traumatic or indeed problematic at the time. It was simply the way the world worked. Chin up, love.

Fast-forward to the beginning of the 1990s and, when I hit the 6th form, my all-girls school seemed to expect us to flip from a world in which sex had been barely acknowledged inside its four walls to a world in which teaching us how to win over the opposite sex was pretty much the endgame. We were invited to host dinner parties for handfuls of lads from private boys’ schools and our group scored a real win as we got Harrow: “we’re all snobs at heart, aren’t we girls?” said the teacher who gave us the simply terrific news, by which we were all expected to be suitably delighted. An evening dinner dance at Wellington College followed swiftly, at which one girl was dragged out of the bushes, covered in scratches and bleeding. As a punishment for her behaviour, the girl was locked in the coach with the male coach driver for the remainder of the evening. It was never established what had happened in the bushes.

Sometimes, the past comes back to you in flashes and at times like these I try to remind myself just how much the world has changed. I hope that it is no longer even imaginable that a girl would be treated in this way in a modern school, although occasionally a story emerges that makes me wonder. At least, though, I cling to the fact that the world has surely changed enough that plenty of people would be shocked by such an event and prepared to take action. That’s the difference. Still, the fact that such action might remain necessary is both depressing and exhausting. I’m not sure I believed we’d still be fighting for the rights of women and girls in 2024, but for a myriad of complex and unpredictable reasons it’s where we seem to find ourselves.

This week, I was invited to comment on a resource, excellent in many ways, but which contained stylised cartoons of an apparently flirtatious slave-girl. When I queried the inclusion of an image that seemed to me to be straight out of the 1970s in a course aimed at modern schoolchildren, the author said he would take on board my comments but also that the girls he’d taught “seem to be happy” with them. So what’s the problem? Well, here’s the thing: girls are used to it. Girls are socialised to accept their lot, to remain docile while they are conditioned to believe that the female form is public property. In my foolishness, I honestly thought we’d have got past all this by now, but when I catch the smallest glimpse of the ghastly diet that our girls are being fed on Instagram and similar platforms, when I see the tiniest of tots mimicking the hyper-sexualised poses and pouts of their chosen online influencers, it makes me want to weep. I don’t know what we can do in the face of such an overwhelming tide of subliminal messaging online, but can we at least keep it out of our educational resources?

One of the most depressing conversations I have ever had with a student was one with a member of my Year 10 Form, who showed up for school on a Monday morning with ridiculously long acrylic nails. She knew that the school did not allow them and was clearly trying it on, so the usual negotiations ensued as I attempted to apply the school’s policy and she kicked against it. While we awaited the member of Patrol whom (of course) I ended up summoning due to her refusal to attend the expected acrylic removal session voluntarily, I foolishly attempted to appeal to her, woman to woman. I pointed out the extreme impracticality of the false nails, which rendered her frankly disabled when it came to even the most basic of tasks and certainly at real risk when playing any kind of sport. “You know, women before you have fought incredibly hard so that you don’t have to do this kind of thing any more,” I told her. “You don’t have to polish and preen; you don’t have to enhance your body or make it into a cartoon version of itself in order to please others.” Unfortunately, my words appeared to have had zero impact. She just looked at me like I was insane.

As so often with my weekly musings, I’m not even sure I know where I’m going with this or indeed who needs to hear it. Maybe I’m just howling at the moon. I remain concerned that there is still work to be done before our daughters and granddaughters can truly be themselves, not a caricature of what society says they should be. We can pat ourselves on the back as much as we like about how far we’ve come since Benny Hill was making us laugh on prime-time television; but until we stop the subliminal message being pumped into our children’s brains that women’s bodies are a commodity, then how can we expect them to rise above the mundane and realise their full potential, unencumbered by the expectations of others?

Photo by Kevin Wolf on Unsplash

Snacking

This week I resolved to do more snacking. Not of the doughnut kind (tempting as that is) but a thing I have read about called exercise snacking. It’s rather fun. Instead of resolving that anything other than a full-scale workout is a waste of time, the philosophy of snacking advises working small bursts of activity into your daily routine, whatever that is. I decided to experiment with it. So far this week I have done some calf exercises on the bottom stair while my coffee was brewing, some balancing exercises in the kitchen while cooking (there are probably some health and safety issues with this but I’m a grown adult and doing it at my own risk), plus some squats while finishing a drama on Netflix (far less risky, although the cat was pretty weirded out). None of this snacking replaces my twice-weekly visits to the gymnasium from hell, but these bursts form a picnic hamper of exercise snacks that I can work into my day without making any effortful changes to my everyday lifestyle.

This got me thinking about how the principles of snacking can be applied to studying. As clients will know, I work in half-hour slots and spend a great deal of my time persuading students that short bursts of focused work are far superior to longer periods of dwindling focus. So many students remain convinced that they need huge swathes of time in order to be able to study effectively, when in fact the reverse is true. No matter how much we learn from cognitive science about the limited capacity of our working memory and the shortness of our attention span, most students (and often their parents) remain wedded to the idea that they need a lengthy stretch of time for studying to be worthwhile.

Much of this attitude, of course, stems from good old-fashioned work avoidance. We’ve all done it: pretended to ourselves that we simply don’t have time for something when in fact what we’re doing is manufacturing an excuse to put off whatever it is that we don’t want to do until the mythical day when we will have plenty of time to dedicate to it. You wouldn’t believe how much time I can convince myself is required to clean the bathroom. Part of overcoming this tendency is to call it out: point out to students when they are using a lack of available time simply as an excuse. But there is, I think, also a genuine anxiety amongst many students that they need long stretches of time in order to be able to achieve something. It often surprises them greatly when I inform them not only that much can be achieved in 10, 15 or 20 minutes but that in fact this kind of approach is optimal. It is not a necessary compromise in a busy lifestyle to fit your work into short, focused bursts: it is actually the ideal. The same is true for exercise snacks, for which there is a growing body of evidence that suggests the benefits of these short bursts of exercise can actually outweigh those of longer stretches.

One of the most counter-intuitive findings from cognitive science in recent years has been that regularly switching focus from one area of study to another is actually more effective for learning than spending extended periods of time on one thing. At first, I really struggled with this in the classroom, as all my training had taught me to pick one learning objective and hammer this home throughout the lesson. But up-to-date research-informed teaching advocates for mixing it up, especially in a setting like the school I used to work in where lessons were an hour long. A whole hour on one learning focus is not effective; far better to have one main learning focus plus another completely separate one to reinvigorate the students’ focus and challenge them to recall prior learning on a completely different topic. I do this whenever possible in my half-hour tutoring sessions, which may have one core learning purpose but with a secondary curve-ball which I throw in to challenge students to recall something we covered the previous week or even some time ago. This kind of switching keeps the mind alert and allows for regular retrieval and recall.

Retrieval snacking is also something that friends and family can help with and that students can and should be encouraged to do habitually. If you’re supporting your child with learning their noun endings, why not ask them randomly during the day to reel off the endings of the 1st declension? This kind of random questioning will pay dividends in the long run, as it forces a child’s brain to recall their learning on a regular basis and out of context. Nothing could be more effective at cementing something into their long-term memory, which is the greatest gift any student can give themselves in order to succeed. My grandfather (a trained teacher himself) used to do this with me when I was small and was struggling to learn my times tables. “What are nine sevens?” he would yell out at random points during the day and I had to answer. It worked.

So, let’s hear it for study snacks. Short, random moments when a student challenges themselves to remember something. Adults can help and support them in this process as well as encourage them to develop it as a habit for themselves. Share with them the fact that this works and will help them with long-term recall. Apart from anything else, it sends the message that study – like exercise – should be a part of daily life and woven into the fabric of your routine and habits. You don’t even need a desk to do it.

Photo by Eiliv Aceron on Unsplash

Why isn’t this taught in schools?

This was the cry of Susanna Reid on Good Morning Britain yesterday. In a discussion on the worthy quest by Martin Lewis to improve the teaching of financial literacy in schools (a move which I broadly support), the well-paid presenter explained that one of her own children was surprised, shocked and no doubt disappointed by the news that they would have to pay tax on their own earnings. Reid was incredulous. Yet instead of reflecting on her own parenting and wondering how she had managed to raise someone with such a poor grasp of how the world works, she wailed “why isn’t this taught in schools?!” The entire panel agreed with her, with nobody pointing out that basic financial literacy is, in fact, already taught in schools.

To quote a nauseating political turn of phrase, let me be clear: I support the teaching of financial literacy in schools and I agree with Martin Lewis that it could do with some improvement. I support it because there are a small handful of vulnerable children who will not experience any discussion at home when it comes to financial matters. They may have parents who struggle to understand such things for themselves, who lack the skills and the vocabulary to enlighten their own children in complex matters. All of that said, I cling to the fact that all parents have a responsibility to teach their children about the world and how they fit into it and to the fact that the overwhelming majority of parents are perfectly capable of doing so. It is parents who have a duty to give children a sense that money doesn’t grow on trees and has to be earned, as well as the basic principle that most of the things they see around them have to be paid for and that this money comes from all of us. These are the kinds of things that must be discussed constantly in order for a child to grasp them, not ticked off on a curriculum list.

When we’re talking about a parent as privileged as Reid (you can look up the latest best guess on her salary), I am pretty unimpressed by the apparent fact that she does not consider it her responsibility to discuss such matters with her own children. To give her the benefit of the doubt, some people find talking about money with their own children difficult. Some want to cushion their children against the harsh reality that things have to be bought and paid for. I’ll be honest and say that I have never understood this. I consider myself hugely fortunate to have had parents who laid their cards on the table. Who told me what we could and could not afford. Who pointed to schoolmates with more luxurious lifestyles and punctured the image by speculating about where that money might have come from, what sacrifices may have been made in order to get hold of it. I was told that I was lucky to have a father who came home in the evenings and at weekends, who turned down more lucrative opportunities because he had different values and preferred to be at home with his family. By the same token, my parents got lucky that I happened to observe one or two things that supported their rhetoric. Perhaps the most poignant moment was during a pool party at the house of a particularly wealthy classmate. They had an amazing house and an incredible lifestyle, one which could easily have impressed a child of my age. But the birthday girl’s mother spent the entire proceedings lying on a sun-lounger while we were supervised by the au pair, which I found really weird. (I was too young to work out that the mother was drunk, but realised this in later years.) What I did understand at the time was that the child’s father made a brief appearance at around 4.00pm and she burst into tears: he was wearing a suit, carrying a briefcase and was leaving his daughter on her birthday to go to work. I remember thinking there and then, “if this is what buys you a private pool, you can keep it.”

Of course, the debate about where the responsibility lies for financial literacy forms part of a wider discussion on what schools are and should be used for and to what extent we are now asking them to take on things which really should not be their responsibility. I have written before on Labour’s mind-boggling suggestion that schools should take on teaching children how to brush their teeth, and barely a day goes by without a story of a child sent to primary school incapable of buttoning up their own coat, doing up their own shoelaces or even managing the basics of toilet training. Schools are now the receptacle for every failure in social care and – let us not be afraid to say it – every failure in parenting. It simply is not sustainable.

When I mentioned Reid’s comment on Twitter I received a lot of replies, with plenty of people telling me whether they did or did not recall receiving any teaching about financial literacy when they were in school. As always, everyone thinks their own recollections of school reflect the reality then and now, and everyone labours under the illusion that their own recollections are 100% accurate. If I believed every tutee who claimed they’d “never been taught” something, I’d be declaring a state of emergency in Latin teaching across some of the most prestigious schools in the country. The reality? Well, they have been taught it, they just didn’t take it in at the time and it’s my job to fix that. The teaching of financial literacy in schools does take place and Reid’s children will in all likelihood have been given some basic teaching on taxes. Could the teaching of financial literacy be improved? Certainly. As Lewis pointed out in the discussion on GMB, it is a topic currently divided between Maths and Citizenship in state secondary schools, so it might be a good idea to have someone with overall responsibility for coordinating the curriculum on finances across the whole school. Great idea. I’m all in favour. However, there will still be kids who simply don’t take it on board and I come back again and again to the reality that nothing is so powerful as the messaging a child receives at home.

So, Susanna: if you truly wanted your children to understand about paying taxes, then maybe you should have talked to them about such things on a regular basis to prepare them for the world they will be inhabiting. Your children have grown up in a household with a fair bit more money than the average person, so I hope very much that this was discussed. I hope you told them when times were tight, or explained to them how lucky they were that this was never the case, since mummy does a job that is considered worthy of a salary that most people in equally worthy professions could only dream about. I hope you talked to them about how much prices have gone up in the last couple of years. Do they know why most supermarkets now have a donation point for local food banks? Do they know the answer to the classic question that MPs are so frequently challenged with: do they know the price of a pint of milk these days? Do you? You see, your children’s teachers were not responsible for explaining the basics of how the world works. That job, I’m afraid, was yours.

Photo by micheile henderson on Unsplash

How did it go?

With the first Latin GCSE done and dusted, “how did it go?” is probably a question that every candidate has been asked, and has answered, multiple times. This week, I have found myself wondering to what extent their self-evaluations are accurate.

Curious to discover an answer, I turned to the internet without much hope of finding one, yet came across a psychology study reported by The Learning Scientists, a group of cognitive scientists who focus on research in education. What’s particularly interesting about the study is that it attempts to evaluate students’ success at making what they call “predictions”, which the psychologists define as a student’s projection of their likely performance prior to a test, as well as their “postdictions”, by which they mean a student’s evaluation of their performance afterwards. The study attempted to make an intervention in that process, in other words they tried to improve students’ ability to make both “predictions” and “postdictions” about their own performance. The results are interesting.

The study was performed with a group of undergraduates, and the psychologists made several interventions in an attempt to improve their students’ ability to self-evaluate. They taught them specific techniques for making the most of feedback and they ensured that the students took a practice test one week before each of the three exams that they sat, inviting students to self-score the practice test and reflect on any errors. The undergraduates were then encouraged to examine reasons why their “predictions” and their “postdictions” may have been inaccurate on the first two exams, and to make adjustments. All of this was with the aim of improving their ability to self-evaluate.

The study found that while the undergraduates’ “postdictions” (i.e. their report on their own performance after the test) remained slightly more accurate than their own “predictions” (their projection of their likely performance), the above interventions resulted in no improvement in the accuracy of students’ “postdictions” over time. While the accuracy of some students’ “predictions” did improve somewhat, none of the undergraduates showed any significant improvement in their ability to make “postdictions”. The students’ ability to evaluate their own performance after each test remained as varied as it had been prior to the interventions.

As the authors conclude, “this study demonstrates … that improving the accuracy of students’ self-evaluations is very difficult.” This is genuinely interesting and certainly fits with my own anecdotal experience, both of my ability to assess how I have performed after an examination and of the huge number of students that I have worked with over the years. A student’s own feelings after a test may be affected by a myriad of compounding factors and if I had £1 for every student who felt that an examination had gone dismally and then turned out a perfectly respectable grade, I’d be a wealthy woman. In my experience, some students may overestimate their “predictions” but most students underestimate their “postdictions”. It is interesting that those “postdictions” appear to be elusive when it comes to intervention and that the cognitive scientists have not – as yet – found a method of helping students to assess their own performance more accurately. I suspect that is because it is too emotive.

It is not obvious from the study how high-stakes the tests were – the psychologists do not make clear, for example, whether the test results contributed significantly (or indeed at all) to the assessment of the undergraduates’ own degree. This to me is something of an oversight, as an obvious compounding factor in any student’s ability to assess their own performance has to be their emotional response to it. Low-stakes testing as part of an experiment is a very different ball-game to the high-stakes testing of an examination that counts towards a GCSE, an A level or a degree class.

My conclusion for now, especially for my highest-achieving students, is to remain unconvinced that they know how well they have done. I could name countless students who have been deeply distressed after an examination, only to discover that they achieved a mark well above 90%. Even in the most seemingly disastrous of circumstances this can be the case. I know of students who missed out a whole question or indeed even a whole page of questions and still achieved an excellent grade overall, so solid was their performance on the rest of the paper and the other papers which counted towards their grade.

Much as it remains an important emotional connection to engage with every student about how they feel their exam went, their feelings are not a good barometer for what will be on the slip of paper when they open their envelope in August.

Photo by Siora Photography on Unsplash