Fraud

Some things have happened to me this week that have made me reflect on how we talk to each other online. I mentioned in my last post that I had (accidentally) smashed my iPhone. This is now fixed, although not before I had been through quite the self-reflection on whether it might actually be rather good for me to own a smart phone that was less pleasant to use. In the end, however, I concluded that a broken phone was at risk of malfunctioning and that keeping it was perhaps not the smartest move for someone who is self-employed and relies on business coming in; yesterday, I forked out for a replacement screen.

The smashed phone coincided with some broader reflections that I also mentioned in my last blog post and which have continued to ferment in my mind. Two television programmes have influenced me over the last fortnight, one a drama and one a documentary. A couple of weeks ago I got around to watching the most recent season of Charlie Brooker’s Black Mirror and was moved and disturbed as always. The final episode – without giving too much away – deals with smart phone addiction; it is a thought experiment about where such an addiction might lead in a worst-case scenario, and takes a wry look at how even the creators of the big social media platforms seem to rue their own creation.

This episode of Black Mirror really stuck in my mind and at first I struggled to think why. It wasn’t one of Brooker’s best and it certainly wasn’t one of his most disturbing. (There are other episodes of Black Mirror that I frankly regret watching.) Yet this one needled me, I suspect because I recognised the compulsion and the attachment it explored. I knew that I found my smart phone addictive. So I resolved to do better, and as a part of my quest I decided to watch something else that had been on my list for a while, a Netflix documentary called The Social Dilemma. This production, made only a couple of years ago, interviews a range of ex-techies from Silicon Valley, all of whom have left the companies for which they previously worked: there was the guy who created the “Like” button on Facebook, there were techies from the platform formerly known as Twitter, from Instagram and even from Google. All of them had three things in common. Firstly, they had all struggled personally with addiction to the products that they themselves had helped to create: they were suppliers addicted to their own drug. Secondly, they were now united in opposition to the way that these platforms were built and designed in order to be addictive; many of them were actively campaigning against the platforms that they used to work for, appalled by what they themselves had created. Thirdly, not one of them let any of their kids near a smart phone. Not at all. These are wealthy tech whizzes from Silicon Valley and their own kids do not have smart phones. If that doesn’t make the rest of the world reflect on why they let their kids have access to these devices from such a young age, I don’t know what will.

There is so much to love about the internet. I find it empowering and useful and it enables me to do the work that I do. On the other hand, there is much to be afraid of, most of all the addictive nature of the ever-accessible device in your pocket. Listening to the men and women who created these platforms that we all use and hearing them explain how they are built, designed and programmed to be addictive was a sobering experience. I have found myself looking at those around me – both the people I am close to and people who are strangers to me – and I see the signs of compulsive usage everywhere. I see it in myself. To my regret, I have found myself scrolling through and staring at platforms I actively dislike, somehow unable not to look at them, even in the sure and present knowledge that they bring me no joy. Why do these things have such power over us? The answer is that they were built that way; clever people are paid a lot of money to find ever-improving ways to keep us glued to every platform we sign up to.

In response, and taking the direct advice of the self-confessed ex-drug-pushers from Silicon Valley, I have removed all social media apps from my phone. There are several platforms I viscerally dislike and would happily never use again, but they are undeniably useful for business: Instagram, Facebook and LinkedIn; these I will from now on manage solely through scheduling on my laptop, logging in to do that kind of work once or twice a week. The messaging services on Facebook and Instagram I have set up to deliver an automated message to anyone enquiring after my services, saying hello, explaining that I do not spend time on those platforms and giving other ways to get in touch with me. The responses to this, I can tell you, have been interesting. A couple of very genuine prospective clients have reached out to me, one even thanking me for enabling her to get off the platform, which she also disliked. Another said “good for you”. But two other people – neither of whom was a prospective client or known to me personally – have already expressed their disapproval.

When I logged in to check my Instagram account recently, I found one message from someone purporting to be a business coach. I have no interest in using a coaching service, so I would have ignored this man’s approach anyway, wherever he had made it. He sent me a message stating that he “had a question about my business” and, because it was on Instagram, he received my automated response. His immediate reaction was anger. I blocked him, obviously, but I do find myself wondering just how bad his own addiction must be, if the mere suggestion that someone else was choosing not to hang out on his platform of choice made him furious.

Further to this, it appears that another person approached me initially on Instagram and then followed this up, as instructed, with an email. This, of course, I received. He too said that he had a question, and I asked him what it was. Fortunately, it was not a ruse to send me something inappropriate, but rather an opening for asking me to translate something into Latin for him. Now, you probably don’t realise this, but I get literally dozens of these kinds of requests. I used to respond to all of them. I still do to some. A few months ago, someone got in touch and asked for my help with a favourite quotation for their mother’s funeral; of course I replied to them, indeed I corresponded with them at some length.

Much of the time, however, especially when I am busy, I don’t honestly consider it my honour-bound duty to provide a free translation service for anyone and everyone’s t-shirt, club logo, necklace or tattoo. I am a teacher and a tutor, not a motto-creation service. If someone asks nicely, I may help them out. This man, however, before I had even decided whether and how I was going to respond to his request, followed up his initial email with a second one barely an hour later, wanting to know whether I had received the first and intimating that he was waiting on my response. I didn’t like this, so I decided simply to delete both emails. The consequence of this decision was that he sent me another, one-word message on Instagram. It said “fraud”.

Fraud.

I am sure that this person is a perfectly reasonable and functioning individual in real life. Were I to sit him down face-to-face and explain that this is a busy time of year for me, that I get dozens of these sorts of requests, and that I might indeed have responded to him had he been a little more patient and not harassed me for an answer, I am quite certain that he would react in a rational manner. Yet online, without that human connection, not only did he decide that I am a “fraud”, he felt the need to tell me so. How did he feel after he sent that message, I wonder? Vindicated? Satisfied? Like he’d done a good thing? Somehow I doubt it. It is an empty feeling, shouting into the void and being left to wonder what the reaction at the other end might be.

The truth is that these platforms are not good for us. They make us less honest and they make us less kind. Most of all, it seems to me, they make us lonelier by dividing us further – the very opposite, those recovered tech junkies tell me, of the original Silicon Valley dream. So you will not find me hanging out on LinkedIn, Instagram or Facebook, none of which contains anything that interests me enough to outweigh the excessive demands their addictive design places on my attention. I do gain something from the platform formerly known as Twitter, as so many teachers exchange ideas on there and it remains an outstanding medium for finding links to new ideas and research about good practice in education. If Threads takes over that mantle, so be it. Even so, I have ruthlessly removed these platforms from my phone. I will keep the apps on my iPad, which I do use but nowhere near as much as I use my phone. So the phone will be solely for genuine messages from real people – family, friends and clients. At the moment, as I get used to the situation, I am finding myself picking the phone up and then wondering what on earth I have picked it up for. Numerous times a day. This only goes to prove that my decision was right – clearly, the number of times I have been habitually checking these platforms for no good reason is genuinely scary.

I think what I have decided is that, like all addictive substances, social media must either be avoided altogether or be very strictly managed. Its usage must be balanced against the risks and if it’s not bringing me joy or enriching my life, then I genuinely don’t see the point of it. For some people, I fear, social media really is the same as drugs and alcohol: highly addictive, with the potential to turn them into the very worst version of themselves.

Photo by camilo jimenez on Unsplash

Back to School

It’s been impossible to ignore the start of the school year this September, even for those with no children and no connection to the education system. With the scandal of RAAC concrete rocking the country and all of us reeling once again at what can only be described as years of incompetence and underinvestment by government, whatever your political stripe, the start of the new school year has been on everyone’s mind.

This academic year feels like a milestone for me. This time last year felt truly surreal, as for the first time I did not return to school as I had done for the previous 21 years. The start of last September was very strange and somehow I didn’t quite believe it was happening; I still had the familiar anxiety dreams, so convinced was my subconscious that I would be returning to the chalkface as usual. This year, with some distance in place between myself and the school grounds, I forgot altogether which day my old school was returning (although old colleagues did keep me posted on the usual hilarities of INSET day).

I have enjoyed the summer holiday immensely, working to a different schedule (I only saw clients in the morning) and doing significantly fewer hours than usual. But it also feels great now to be settling back into the routine and I am loving seeing regular clients return for their old slots and restart the academic year. There is also the excitement of starting to work with new students, especially the ones I really feel I can make a difference to; nothing in life is as rewarding as helping a student to turn their performance around.

This year I decided to reflect on what happens in schools at the start of the new academic year and to apply the best and most important aspects of this to my tutoring business. I have refreshed my safeguarding training, a legal requirement for teachers in schools but not something which is (yet) regulated for tutors. I have looked at my results and done some reflection, although one of the joys of one-to-one work is that you do not face the surprises and disappointments that inevitably occur across a year group in a school. I have reflected on my own practice, decided what worked best last year and resolved to apply the most effective techniques to all clients. Over the last couple of weeks I have reshaped my daily timetable and applied some lessons learnt from last year about when I work most effectively as well as where demand is highest. Finally, I have reflected on how I can reduce unnecessary administration and time-wasting, most especially the time spent on social media, which I have reduced to an absolute minimum; I have put systems in place so that I don’t have to engage at all with the platforms which do not bring me joy, namely Facebook and Instagram. That final decision has been rather well-assisted by me smashing up my iPhone (not deliberately, but there is a psychological school of thought that there are no real accidents …); this sparked some further reflection on just how much screen time is truly necessary for running a business like mine and how much of it was mindless, fruitless scrolling in the name of “visibility”, which so many business coaches seem to preach is essential to the success of my business. With a website that performs as well as mine does, I do not find this to be so.

Thus, as I settle into my second year as a full-time, independent, one-to-one tutor, I could not be happier with my role and with the balance I have managed to strike between meaningful employment and a better quality of life. I cannot wait to get on with helping my clients, old and new, and to see what the new academic year will bring.

Photo by Aaron Burden on Unsplash

A study in cultish madness

Since my last post, so many people have sent me messages asking what my research was actually about that I have decided to write an explanation. You only have yourselves to blame.

One of the difficulties one faces when writing a proposal for a PhD is finding a niche in one’s subject where there is work left to be done. I have met academics in my time who have written PhDs on Virgil or Homer, but how they managed to come up with a new angle, never mind how they managed to get a handle on everything that had been written already, is completely beyond me. Personally, I decided that something a little more obscure was the way forward.

I had an interest in ancient philosophy and I was also lucky enough as a part of my degree to do an undergraduate course on the rise of Christianity in the ancient world. These two fields of study collided when I started to learn about Neoplatonism, a branch of thinking in late antiquity which is notoriously difficult to define. In origin and essence, Neoplatonism was everything that was said, thought and written about Plato, Aristotle and other key thinkers in the generations after they lived. Initially, this meant the men studying in the schools in which Plato and Aristotle themselves taught (Aristotle was a pupil of Plato, so the process started with him), but as the centuries rolled by Neoplatonism came to encompass the wildly diverse writings produced generations and even centuries after Plato and Aristotle were writing and teaching. People also wrote intensively about Pythagoras, and some ancient scholars became interested in finding what they believed to be religious and philosophical allegories in the writings of Homer. The study of what these men wrote at the time is thus an entire field in itself – if you like, it’s the study of Platonic, Aristotelian and Pythagorean reception in the ancient world. Its most famous and respected proponent was a man called Plotinus, who lived and wrote in the 3rd century AD and had a strong influence on Christian philosophy; I specialised in the men who came shortly after him.

Despite its noble origins as an intellectual field of study, Neoplatonism took on a life of its own and morphed into something really rather bizarre as the years rolled by. This was partly because it was influenced during this period by the growth of religions that focused on developing a personal relationship with one’s god, but there were other complicating factors too. Suffice to say, by the time you get to the period in which I specialised, Neoplatonism had become something pretty weird and wonderful: an intensely intellectual field of study on the one hand and a downright barking set of pseudo-philosophical cultish ravings on the other. I do not exaggerate – better scholars than I have said as much.

Most of the writings from the period we are talking about were so mystical and incomprehensible that modern scholars had no interest in bothering with them. As a result, many of the texts remained untranslated until a movement arose, led by Richard Sorabji, who was a Professor at King’s College while I was studying and researching. Sorabji oversaw a series of texts and translations, making many of these works available for the first time to undergraduates and indeed to anyone else who was bonkers enough to be interested. He specialised in the commentators on Aristotle, the scores of ancient scholars who had spent their lives poring over Aristotelian texts and writing down their thoughts on them.

So I ended up wading around in this quagmire of growing information in this developing field and, prompted by my Supervisor, took a look at a text nicknamed the De Mysteriis by an author called Iamblichus, a Syrian thinker who was writing in Greek during the late 3rd and early 4th centuries AD. He was particularly keen on Pythagoras, and wrote masses of pseudo-mystical nonsense about him; we have one complete surviving work which has frankly undeniable parallels with the Gospels and presents Pythagoras as what can only be described as a Christ figure. He also wrote various other works including the De Mysteriis, on which I based my research and which is fundamentally about theurgy, or divine magic. Yeah. I told you it was weird.

So. Theurgy. It is pretty difficult to define without presenting my entire thesis, but in essence it was a range of mystical rituals, all with the aim of connecting humans with the divine. You’d recognise some of them from your general knowledge of the ancient world: oracles, for example, through which the gods supposedly spoke to men. Iamblichus believed very firmly that there was a right way and a wrong way of doing these divine rituals, and the De Mysteriis is his authoritative account of what’s what when it comes to doing this stuff. As a result it is – inevitably – absolutely barking. This is not exactly what I said in my thesis, but it’s the honest truth in summary. Indeed, the De Mysteriis is so barking that previous scholars had largely consigned it to obscurity and it had not been translated into English since 1911. So, that’s where I came along. My PhD was a study of the work and through that research I hooked up with another couple of scholars – far older and more prestigious in the field than I was – who had in the previous decade taken on the task of producing a modern edition and translation of this text. They were – to put it mildly – rather regretting doing so. One of them had already had a heart attack, although the jury was out as to whether the De Mysteriis was entirely to blame or only partially. Long story short, they drafted me in as Chief Editor and I finished it for them. My PhD was also published.

As I wrote last week, I did not enjoy the process of academic research and I regretted signing up for it. However, this does not mean that I was uninterested in much of what I was doing. What it did reveal is what I should have been studying, and it wasn’t Classics. During the process of my research I realised that what fascinated me more than anything else in the world was (a) what makes people do, think and believe what they do and (b) how it is possible to persuade even the most intelligent and educated person of something which is provably impossible. In simple terms, why do people believe in miracles? Why did Iamblichus believe that a truly inspired (for which read fully possessed) spokesperson for the gods could be struck on the back of the neck with an axe and not be injured? Did he really believe that the famous oracles of which he spoke were still functioning? (We know for a fact that most of them had been disbanded by his time – one that he writes about fulsomely had become a Christian campsite by the time he was writing.) Following my interests, and whilst I was meant to be working exclusively on Neoplatonism, I ended up going down all sorts of rabbit holes. I read about early 20th century research into “shell shock” (now known as PTSD); I read purported accounts from the 19th century of children possessed by the devil; I read about mass conversion rallies such as those led by Billy Graham; I read about attacks of crowd hysteria, such as fainting fits or hysterical laughing in nunneries and girls’ boarding schools; I read about witch trials; I read about Zaehner’s LSD-fuelled research into what would happen to his mind when, enhanced by hallucinogenic drugs, it was exposed to art or literature. (Not much, as it turns out – he just couldn’t stop laughing.) In short, I read a wildly diverse range of stuff about possession, altered states of the mind and all sorts of jolly interesting weirdness. Long story short, I should have switched to anthropology.

My interest in such things remains to this day and in other guises I have written articles about belief, conversion and religiosity. I even dipped my toe into novel-writing and wrote a dystopian Young Adult novel about a world in which beliefs are controlled and dictated. Much of my spare time these days is spent reading about a variety of cult-like beliefs which are developing rapidly and spreading online. I might even write about it one day.

Sliding Doors

As thousands of students receive their A level, BTEC and T level results this morning, I’ve been thinking about moments in life that I and no doubt many others from my era nickname “sliding doors”: moments that mark a turning point in the course of your life. The 1998 film Sliding Doors explores the idea that the course of one woman’s career as well as her love-life hung upon whether or not she managed to catch a particular tube-train; it follows both scenarios in parallel – one in which she catches the train, one in which she doesn’t.

In real life, of course, without the omnipotence of a film director, one cannot do this. We cannot see the different scenarios played out and choose which one we prefer. We can look back at pivotal moments in life and acknowledge that something shifted in our lives at that moment, but we cannot know what would have happened in an alternative universe. In the context of romantic relationships, this concept is expressed wonderfully in another 1990s classic, one of my favourite songs by Pulp, called Something’s Changed. In this song, Jarvis Cocker explores the chance nature of his meeting a partner and how it might never have happened; it also uses the conceit of Cocker imagining himself writing this particular song on a particular day, a song which then became about that person:

I wrote this song two hours before we met.
I didn’t know your name, or what you looked like yet.
I could have stayed at home and gone to bed.
I could have gone to see a film instead.
You might have changed your mind and seen your friends.
Life would have been very different then.

Later in the song he returns to the conceit, and perhaps my favourite moment (probably upsetting to the more romantically inclined among you) is when he even ponders that, had they never met, he might have met somebody different:

When we woke up that morning we had no way of knowing
That in a matter of hours we’d change the way we were going.
Where would I be now?
Where would I be now if we’d never met?
Would I be singing this song to someone else instead …?

The tone of the whole song is wistful but not melancholy, nor is it overtly gushing – those of you who know Jarvis Cocker will understand that he doesn’t really do gushing. The girlfriend is given a voice, but she uses it to tell Jarvis to “stop asking questions that don’t matter anyway”. The general conclusion is: ah, well.

I spoke to a friend this week – on Zoom because we live 300 miles apart – and she too is at a turning point in life. We spoke about sliding doors moments and I told her about how miserable I was doing my PhD and how eventually I decided not to pursue an academic career because I realised that the lifestyle was making me deeply unhappy. “I finished it,” I said. “But it nearly killed me.” This friend then asked me something that nobody has asked me before. She asked me whether I regretted finishing it.

Lots of people have asked me whether I am glad I finished my research. That tends to be the expected tone of the conversation. It is in my view a marker of how insightful this particular friend is that she worded it differently. Do I regret finishing it? Do I regret putting myself through that process? It got me thinking. Maybe I should.

Looking back, the reasons I finished it were all to do with external pressures. I had received funding from the British Academy and that is very hard to come by – I would have felt guilty that I had taken somebody else’s place and squandered such a privilege. My parents would have been disappointed. My Supervisor would have been disappointed. Finally, and perhaps most foolishly of all, I didn’t like quitting and I still believed that the qualification was important. So, I soldiered on and I finished the thing. I cried multiple times a day. A low point was sitting in my college room and pondering how long it would take somebody to notice if I died in there.

Not only did I finish the PhD, my deep unhappiness and loathing for the life drove me to finish it in record time. Two and a half years. For the last 4 to 6 months when I was officially “writing up”, three copies of the completed thesis sat on my desk, all printed out and ready to be bound. I hid this even from my Supervisor. It was not the done thing to finish in less than the standard three years, plus I had nowhere else to go once my thesis was completed. I had a place to start teacher training in September, but until then I needed to hang on to my college room. So I waited. Eventually, the thesis was bound and sent to the examiners and ultimately my Viva went without a hitch. PhDs are not officially graded, but truthfully there is a hierarchy to what the examiners may say to you at the end of your Viva. Best case scenario is that they mark in pencil a few minor errors or typos and tell you that these do not require correction in order for the thesis to be accepted; worst case scenario, they tell you to tear the thing up and never darken their doors again. Most people fall somewhere in the middle, with corrections advised or sometimes a re-write of some sections recommended. Mine was waved through with the minor pencil errors. The examiners shook my hand at the end of the session and used my brand new title for the first time. Being me, I did correct the minor errors even though I didn’t have to, and submitted it to Senate House.

My PhD has brought me some benefits, not least an exposure to teaching, which ultimately became my career choice and is not, I think, something I would have considered had I not been thrust into it. It has been useful, as a woman, to sign off as “Dr. E Williams” when writing to certain types of people or institutions – until we bring down the patriarchy (work ongoing), it can be handy to let the recipient of your complaint assume that you are a man (which those certain sorts of people or institutions inevitably tend to do when you use an academic title). My PhD is also something I am proud of, solely because I know just what it took me to complete it. Many of my research peers fell by the wayside (ironically, all of them claiming to be loving the process of research, while I was always very vocal about how miserable I was finding it). A few years ago, I was invited back by my university as something of a voice of doom on a panel about postgraduate research: it’s tough, and most people don’t really enjoy it. So be careful what you wish for.

Back to my lovely friend’s question. Do I regret finishing my research? I have always told myself that I don’t, since I made it through and have something to show for it. My field is pretty niche (true of most PhDs) but my contribution was significant and is still cited in other people’s research in this field right across the world. Quitting would have meant that I had a bad experience and had nothing to show for it. Also, when it comes to sliding doors in life I think it’s best not to have regrets: you made the decisions you made or things happened and that’s how it is – there is little point in asking yourself what you could or should have done differently. But the way my friend worded that question really has made me think about that particular decision a little differently.

Sometimes in life, putting yourself through more pain truly isn’t worth it. The more I think about that awful time, the more my decision seems a little crazy. Much as I cannot see the alternatives played out with the clarity of a film, I can make sensible and reasonable predictions about what might have happened. My career would still have worked out: I had already been exposed to teaching (that happened in my very first year of research), so I probably would have chosen to switch to teacher training if I’d had the foresight and courage to jump ship when I could and should have done. I would have started work earlier, bought a house more cheaply than I did, paid into my pension for longer. If I’m honest, I’m not sure that my PhD has benefitted me in gaining work to the extent that I have told myself it has; my first-class Honours degree and Masters at Distinction level were probably plenty and I’m not quite sure why I’ve never considered that before. So in truth, I cannot think of a negative outcome that would have come from quitting, other than that it would have taken immense courage to do so.

It is healthy in life to have no regrets, and I’m certainly not going to beat myself up about a decision I made in my mid-20s. I shall continue to make use of my title, and maintain pride in what turned out to be the toughest achievement I have ever faced in life. Go me. But if I could go back in time and tell myself what to do in 1997, I’d tell myself that the brave decision was not, in fact, to soldier on and do what was expected of me; the brave decision would have been to run for the hills. Sometimes, quitting is the bravest act of all.

Image from the British film Sliding Doors, starring Gwyneth Paltrow and John Hannah.

Mistakes were made: the use of the passive voice

Students are often surprised and puzzled when I point out to them that English is not very comfortable with the passive voice. It’s not our most natural way of speaking, which may go some way towards explaining why students find the passive voice difficult to translate. For example, while Latin slides into the passive voice in the imperfect and future tenses quite simply with its alternative set of endings, English makes one heck of a meal out of this: who on earth really wants to say “he was being carried” or “he will be carried”?

Despite this, and this is another thing I like to point out to my tutees, the passive voice is used for very distinct purposes in the English language. First of all, it is used in scientific writing. When writing up an experiment, students are taught to write that “the powder was placed into the test tube” rather than “I put the powder in the test tube”. In scientific writing, this tradition stems from the principle that we should take the individual out of the process and focus on the process itself, removing any other distractions or influences. This then carries forward as a tradition in all academic writing in all fields, although I note with some dismay that this appears to be changing. O tempora! O mores!

The passive voice in scientific writing is a tradition because the person conducting the experiment is not (or should not be) the focus of the experiment. Another purpose for which the passive voice can therefore be used is to separate an event from its cause. Consider the difference between saying “Emma broke the vase” and “the vase was broken (by Emma)”. Not only is the person who broke the vase separated from the action, they don’t even need to be named for the action to make some kind of sense: the accident just happened.

This kind of passive speaking is used to great effect by public figures – in particular, politicians. The phrase “mistakes were made” even has its own Wikipedia entry, so synonymous is it with political double-speak. The phrase, described by the New York Times as a “classic Washington linguistic construct”, allows a politician to sound as if they or their party are taking responsibility for something without actually doing so and without even articulating what they are taking responsibility for. Despite the fact that journalists have poked fun at this phrase since it was first used as early as the 19th century, politicians continue to roll it out on a regular basis. Listen out for it – you’ll be amazed how often it or something very like it pops up.

There are yet more creative ways in which the passive voice can be used other than merely distancing yourself from responsibility. If you want to create a straw man argument and so ridicule a view that nobody actually holds, how about saying “what we are being asked to believe”? “What we are being asked to believe here is that young people are incapable of making any kind of decision.” Who is actually asking us to believe this? Erm, nobody; we’re just “being asked” to believe it by persons unspecified – straw men, if you will. The passive voice makes this rhetorical trick viable, effective and convincing. (Nice tricolon, I know.) Similarly, if you wish to give more credibility to a position than it truly deserves, then make it sound like the consensus view by using the perfect passive participle – drop in that a position is “long held” or “long agreed-upon”. Long held or long agreed-upon by whom, exactly? Well, nobody knows or seems to care.

One of my hobbies is listening out for ancient rhetorical techniques as employed by modern politicians (or rather their speech writers, as since the Age of Spin I am somewhat cynical about the degree to which any of our modern leaders write their own material). Many of the techniques learned, employed and taught by the greatest speech-writers of the Roman era can still be heard in the House of Commons today. A very basic version of the same skills is indeed taught in schools, as students are still expected to learn how to write persuasively in their English language GCSE. The passive voice is an often-overlooked and thus dangerously insidious technique. Do not let the speakers fool you with it. Or, I should say, do not be fooled (by them).

Cicero Against Catiline by Hans W. Schmidt, 1912. Meibohm Fine Arts.

Let us be clear: what teachers could learn from the aviation industry

We in education could learn a great deal from the aviation industry. In fact, most professions could. While so many other professions have a tendency towards a blame-culture and criticism, aviation is built on the principle of learning from its mistakes and implementing procedures to prevent their recurrence. It is also relentlessly focused on clarity of communication, for lives literally depend on getting this right.

Last night, my husband and I watched one of a string of documentaries aired on the National Geographic channel, which follow accident investigations in aviation. The accident explored in this particular episode occurred all the way back in 1989 and involved a Boeing 707 on an American charter flight from Italy to the Dominican Republic. The flight was making a scheduled stop on the Portuguese island of Santa Maria, where it was due to be refuelled. As a result of a series of minor but significant oversights in procedure made both by the flight crew and by the Air Traffic Controller, Flight 1851 came in too low and struck a mountain range, killing everyone on board.

One of the most important things about the way in which investigations are conducted in aviation is that the culture focuses not on finger-pointing but on identifying the factors which led to an accident, so that the findings can be shared within the industry and lessons truly learned. The investigators may make recommendations which lead to a tightening of regulations around pilots’ working hours, a change in how pilots are trained, an adjustment to the design of an aircraft and/or its instruments, a tweak in recommended flight procedures, or all of the above. The approach is a model of how to react under extreme pressure: it does not seek to apportion blame, it aims rather to improve safety for the future, for the benefit of everyone. The pilots who do make errors (and who pay the ultimate price for them) are treated with infinite respect and the investigators do not simply stop at putting things down to “pilot error” and then washing their hands of the incident – they go on to explore why the pilots may have made a particular error, with the ultimate aim of reducing the risk of similar errors occurring again in the future. Were they overtired? Was the information they were receiving unclear or counter-intuitive? Was their training insufficient in this area? Above all, how could we have prevented them from making this mistake? I find this approach genuinely inspiring and I wish other fields would learn from it.

One of the key findings of this particular investigation was that there was a crucial miscommunication between the Air Traffic Controller and the flight crew. The ATC instructed the crew that they were “cleared to three thousand feet” (in other words, that their initial approach should not go below three thousand feet). For several reasons, the First Officer ended up mishearing the instruction as “cleared two thousand feet” and the aircraft was set to the incorrect height. My husband (a trained pilot, as it happens) informs me that this would be less likely to occur now because the advised vocalisations for this particular information have been revised, in an attempt to prevent such misunderstandings; the instruction would now be “you are cleared to altitude three thousand feet” – the word “altitude” must be used immediately before the given height in order to avoid any confusion between words and numbers, an error which in the case of Flight 1851 led to devastating consequences.

While the documentary explored numerous other reasons why the crash ultimately occurred and indeed made it clear that the Captain of the flight crew undeniably missed several opportunities to prevent the disaster from happening, the underlying cause of the crash was quite simple – the aircraft came in too low. The miscommunication between the ATC and the flight crew, which was not corrected when it could and should have been, set the aircraft on a collision course with the mountainous terrain towards which it was headed.

Now I for one am jolly glad to be working in a field in which my mistakes are unlikely to cause multiple fatalities and even less likely to be the subject of a documentary on National Geographic some 35 years later. Yet this minor slip which led to such devastating consequences for the flight and its passengers did remind me of a misapprehension which I discovered in a tutee this year. On Flight 1851, the First Officer mistook a word for a number – quite simply, he heard “cleared to” as “cleared two”. Similarly, I realised this year that one of my tutees was convinced that the dative case had something to do with numbers. After a couple of minutes of discussing this with him and trying to explore what was going on, I suddenly realised what had happened: his teacher had (quite rightly) taught his class that the dative case was to be translated as “to” or “for”. My tutee, however, had heard “two” or “four”. He heard numbers instead of words. And he had been royally confused ever since.

Whilst teachers are not making minute-by-minute decisions on which hundreds of lives depend, they are laying the foundations for a child’s understanding of their subject. Whilst this is not life-threatening (happily, I can’t think of a single occasion on which a misunderstanding of the dative case has led to multiple fatalities), it is nevertheless important in our line of work, assuming we care about what we do. This particular child’s misunderstanding underlined for me the importance of dual coding, which means using a visual representation of what you are saying as well as a verbalisation: quite simply, if the teacher in question had merely written the words “to” and “for” on the board as they spoke, they would have avoided the misconception that was absorbed and internalised by this particular child.

One sunny day, on which I took the photograph below, I was very privileged to join my husband on a flight during his training and to listen to the impeccably high standard of teaching that he received from his instructor. My advice to all teachers who want to observe a model of verbal clarity is to take every opportunity they can to go and listen to people who teach a practical skill. Go and watch a PE teacher setting up a game; watch a science teacher preparing students for an experiment; take a refresher course from a driving instructor; tune in to your coach at the gym. Above all, in your own teaching, remember that every word you use must be carefully thought through and – in an ideal world – that you should take a note of every misconception which does occur and seek to guard against it next time by improving your verbal explanation. While I am happy and relieved to say that a child’s life will not depend on your words, their success in your subject absolutely does.

I took this photograph from inside the aircraft in which my husband did much of his training.

You’ve had enough

“You’ve had enough,” he said, as he pushed her glass away. “You’ve had enough,” he said, with a tone so dismissive and disapproving that I looked up from my book and glared across the room, judging this man I had never met, in a marriage I had no knowledge of, in a room full of strangers.

“You’ve had enough.” She’d had one glass of champagne.

Every week, I try to make my blog posts an honest representation of what’s on my mind at the time. Mostly that means something which has been sparked during a tutoring session, an observation I have made during several similar sessions or some work that a client seems to have found particularly helpful. This week, following a weekend away in North Yorkshire, a passing encounter with another couple has been playing on my mind ever since.

Maybe his wife had a history of truly outrageous behaviour in public. Maybe she had a history of drinking too much. Maybe she had a health condition that meant more than one tiny glass of champagne was a seriously bad idea, and he was just looking out for her. Maybe. Maybe. But I doubt it.

How many times have I heard a man asserting control over a woman’s behaviour with such decisiveness, such easy self-confidence, such surety that he has the right to impose his will upon her? I’m afraid it’s one gigantic trigger for me and makes me want to grab the bottle and drain it before marching to the bar to order a second: ordered, I might add, on my own account, bought with my own money and poured recklessly down my own neck. I might end up with the hangover from hell, but at least it would be on my own terms and be my own stupid choice.

Mary Beard has written and spoken with brilliant clarity about how the voice of women has been controlled, manipulated and erased by men throughout western culture, tracing its origins right back to the earliest works we have. As she points out, it is in the very first book of Homer’s Odyssey, one of the two oldest works of western literature in our possession, that we are treated to the first ever recorded example of a man telling a woman to shut up. Telemachus, the teenaged son of the absent Odysseus, speaks to his mother (who is ruler in her husband’s absence and managing rather magnificently – for a woman) and he tells her: “go back upstairs and resume your own work – the spinning and weaving; speech is the business of men, all men, and of me most of all; for mine is the power in this household.” Okay. Right you are, son.

Professor Beard cites various texts to illustrate her point that the silencing and dismissing of women’s voices was par for the course in the ancient world (and indeed remains so with depressing frequency in the modern one). She points out that in Aristophanes’ Lysistrata, the satirical joke behind the entire play was that the men of Athens were doing such a God-awful job of running the state that even the women could manage it better. She also examines how women are silenced in Ovid’s Metamorphoses, through transformation into dumb animals incapable of speech. She explores in some depth how oracy was not only considered a man’s domain but indeed defined masculinity itself at its most sophisticated; fine rhetoric was the demonstration, the exercise and the definition of power.

Yet it is not just the oratorical and political silencing of women that has been playing on my mind since I overheard the disapproving Yorkshireman policing his wife’s alcohol intake. Disapproval of this kind so often has its origins in fear – fear of “making a scene” or “a show of oneself” – and I find myself reminded of how frightened men in the ancient world were of their women losing control. Witness the moral panic documented in Roman sources at the supposed rise in popularity of cults in which women allegedly lost control and gave in to their baser desires. The Romans in particular seemed to find excessive emotion or hedonism genuinely horrifying when expressed by women or, indeed, in a manner which they simply considered to be “feminine”. So the Roman state clamped down hard on these cults.

Addressing the manner in which women are portrayed in the ancient world is of crucial importance and something I have never shied away from. I cannot without comment take students through a passage of Latin which mentions the handing over of women as part of a peace deal, the assignment of female prisoners to men as a reward, or the supposed magnanimity of the general who lets a particular woman off the expectation of becoming his personal slave because she is betrothed to somebody important. Students are usually interested to talk about the content of such passages and it is an important opportunity to remind them of the differences between ancient society and our modern expectations.

Yet in that little bar in North Yorkshire, just for a little while, I could not help but find myself wondering just how far we have come after all, when a man can so clearly and so publicly disapprove of his wife maybe enjoying herself just a little bit more than usual.

Missing the mark

This week I’ve been pondering the fact that we teachers don’t always make the best markers. I mentioned this in passing to a Year 11 tutee a couple of days ago and he expressed such incredulity that I decided to unpick my thoughts a little. Why do teachers struggle to mark accurately and dispassionately?

First of all, marking is incredibly difficult. Even shorter-answer questions take an enormous amount of concentration, and classroom teachers are under intolerable time-pressure most of the time. Marking is rarely something that teachers enjoy or prioritise (I’ve met the odd bizarre teacher who claims to “love” marking but, if I’m honest, I always assumed they were pretending). Longer-answer questions require even greater concentration (English teachers, I feel your pain) and they also require training; if a teacher has not acted as a professional marker and/or attended a training course run by the examining body which addresses those questions and the mark scheme in detail, they may be making false assumptions about how those questions will be assessed.

Secondly, teachers develop their marking as a professional tool to aid the teaching process, not as an end goal in itself. When I was training, “assessment for learning” – something which its pioneers, Black and Wiliam, now say they wish they had called “responsive teaching” – was the new focus in education, and to a large extent it still dominates. Responsive teaching (I shall call it by its preferred name) requires teachers to mark in a manner that informs their planning – in other words, teachers should base their next lesson on the information that has arisen out of the last time they looked at their students’ work. From the outset, both Black and Wiliam campaigned for teachers to mark in a manner that reduced their workload – I heard Professor Black deliver a session at The Latymer School, where I used to work, and he was without a doubt the first educationalist to stand up and tell me to spend less time marking. Black and Wiliam’s vision was that teachers should mark in a smarter way that genuinely informed their teaching – all outstanding advice.

What it means, however, is that teachers are trained to use marking as a diagnostic tool. Every time we mark, we are acquiring and encoding information about how that student is doing and – let’s be frank – whether they are following instructions and/or approaching their learning as we have taught them to. This all feeds into our overall impression of how a student is performing and will shape our next approaches. This is of course jolly difficult in the mainstream classroom, where a class of 30 may present a myriad of responses to what they have been taught so far. Happily, schools are learning to adapt more effectively to this, with leading proponents of whole-class feedback such as Daisy Christodoulou, the brains behind the “no more marking” campaign, driving schools towards a more effective way to share feedback with larger groups. Schools that have not fully adapted in this direction (mine was one of them) are overloading teachers with unnecessary work, since all the research points towards whole-class feedback as by far the most effective use of teachers’ time. Asking teachers to write individual, personalised feedback to every student in a large class is insane and remains one of the things that drives people out of the profession.

So let us come back to the original comment which so surprised my tutee, which was the suggestion that teachers don’t always make the best markers. I told him that I worked as part of a group of 6 professional markers who were assigned the A level literature components a few years ago. Most of us were working classroom teachers, but one member of the group was a subject expert but not a teacher. If I’m honest, I was surprised she was there and expected her to struggle with the process. How wrong I was. In fact, she rapidly became the best out of all of us. You see, she was arriving without all the baggage. We teachers look at a script and immediately start thinking about the individual who wrote it: how, if only they had done this or that, their answer would have been better. I found it hard not to feel frustrated by the ones who had clearly not learnt the text – again, a symptom of years at the chalkface. I rejoiced for the ones who had excelled. I ached for the ones who had misunderstood the question. But the non-teaching subject expert had no emotional baggage to bring to the table, no classroom-weary experience of working with a myriad of teenagers, who can be frustrating at the best of times; she approached the process entirely dispassionately. Teachers tend to pick up a script and think “how can I help this student to improve?”, or sometimes – let’s be honest – “what on earth are they doing?!”. Examiners must pick up a script and think nothing other than “where precisely does this response fit in the mark scheme?” That’s actually incredibly difficult to do if your brain is used to marking for the classroom – marking for the purpose of helping students to develop and improve.

One of the things we had to develop as part of the examining process was the ability to judge when an answer had hit the threshold for full marks. The teachers in the group took far longer to understand this than the non-teacher. This – I believe – is because we were so used to looking for reasons and ideas to help the students in front of us. The schools I have worked in were all obsessed with “even better if” comments – what tweaks could even the most outstanding of students make to their answer in order to make it better? Much as I applaud the notion that there is always room for improvement, this was sometimes exhausting and at times felt cruel. Sometimes I blatantly ignored school policy and said “you know what? This was perfect. Whatever you’re doing, keep doing it. Keep up the brilliant work.” Sometimes students need to hear that. But marking for the exam board isn’t about perfection – marking for the exam board will require you to give full marks to an answer that is decidedly less than perfect. The exam board does not require perfection – it requires students to show their knowledge in a way that fits the mark scheme (and yes, it is a somewhat mechanical and artificial process). Giving full marks to an answer that could be improved was something that the teachers in the group – myself included – had to be trained into doing; it still felt weird every time we did it.

Exam boards are struggling more and more to recruit markers, a symptom of the fact that teachers are already under intolerable strain much of the time as well as an indicator of just how appalling the rates of pay are. I have always advocated that working as a professional marker is excellent CPD and that teachers should mark for the board they teach to if they can; however, I completely understand why so many of them simply cannot find the time or the energy to do so.

Photo by Mauro Gigli on Unsplash