Responsive Tutoring

One of the most powerful tools for promoting student progress is what’s called assessment for learning (AfL). When I was first teaching and the phrase was all the rage, you wouldn’t have passed an interview without mentioning it. While the acronym AfL is less often used these days, it still underpins modern teaching.

The thinkers credited with the founding principles behind the use of AfL in the classroom are on record as saying they wish they’d called it something else: not “assessment for learning” but “responsive teaching” – and I can see why. In many ways, AfL is about neither assessment nor learning – at least, not in isolation. AfL, or rather responsive teaching, is about what a teacher does differently in response to where their students are in terms of their understanding.

While summative assessments (such as a GCSE examination) focus on evaluating final outcomes, AfL is embedded in day-to-day teaching in order to gauge students’ progress, clarify misunderstandings and – most crucially – to guide further learning. Effective use in the classroom presents a unique set of challenges for teachers, especially when working with larger groups. The process is infinitely easier in a one-to-one setting, where the dynamic between the tutor and the tutee shapes everything.

Responsive teaching is meant to be a continuous loop in which evidence is gathered and interpreted to shape a teacher’s instructional decisions. AfL can also be used to help students to recognise their own current level of understanding and set goals to improve. It is meant to be an ongoing, dynamic process and requires teachers to have a nuanced understanding of each student’s needs, strengths, and areas for improvement. To be effective, AfL requires not just frequent feedback but feedback that is individualised and actionable. In a one-on-one setting, a tutor can more naturally meet these requirements, while in a classroom with multiple students the process becomes complex, requiring considerable skill and resourcefulness from the teacher.

When implementing AfL in the classroom, teachers encounter several challenges that are unique to managing large groups. In a classroom of 30 students, teachers must balance AfL with the demands of covering the curriculum, managing behaviour and addressing a multitude of diverse learning needs. The time constraints are significant. For each student, providing specific feedback and tailoring instructional adjustments is an ideal that is often close to impossible to achieve in practice. In any single lesson, a teacher may only have a minute or two to focus on each student. This time is rarely enough for comprehensive feedback, making it challenging to provide meaningful guidance on areas for improvement.

In larger classrooms, teachers have to rely on quick, general assessments, such as asking questions to the class or using hand-raising methods, but these approaches can miss individual nuances and only provide superficial insights into each student’s understanding. Real-time feedback is essential for the process to work, but logistical challenges mean that teachers sometimes delay feedback until they can examine students’ work. This delay can diminish the impact of the feedback and may hinder a student’s immediate progress. It also places a significant workload burden on the teacher: even schools that have understood and embraced the principles behind whole-class feedback still expect the classroom teacher to complete a considerable amount of assessment work outside the classroom.

In any classroom, some students may actively participate and show enthusiasm, while others remain quiet or withdrawn. Unless a school has fully embraced and embedded the principles of “no excuses”, teachers will struggle to gauge the understanding of all students. Ensuring equal participation is challenging, and without specific engagement from each student, teachers may only get a partial view of the overall class understanding. Implementing AfL strategies requires significant time and energy, which teachers often need to dedicate to managing classroom behaviour. Students can become disengaged, especially if they don’t immediately understand a lesson or find it challenging. The need for behaviour management can take time away from delivering AfL, reducing the effectiveness of feedback and lesson adaptation.

By contrast, one-to-one tutoring provides an environment where AfL shapes and defines the entire process. In a one-on-one setting, the tutor’s focus is exclusively on a single student and this individual attention means the tutor can tailor questions, feedback, and guidance specifically for that student. Any misconceptions or gaps in knowledge are immediately identified and addressed, without the need for complex assessment. For example, a tutor might notice hesitation in a student’s response and immediately reframe the question to clarify understanding. This kind of personalised, immediate and dynamic intervention is impossible in a classroom.

In tutoring, feedback is instant. If a student misunderstands a concept, the tutor can pause and offer corrective feedback on the spot. There is no need to wait, no need to press ahead with the curriculum. This timely response to a student’s needs helps to solidify learning and build confidence, making AfL truly effective. Tutoring allows for a flexibility in pacing which simply cannot happen in the classroom. A tutor can spend as much time as necessary on a particular concept, adjusting the level of challenge to ensure that a student remains engaged. For example, if a student masters a topic quickly, the tutor can introduce more complex material. Conversely, if a student is struggling, the tutor can slow down, review foundational concepts, or use alternative explanations.

One-to-one tutoring fosters a relationship where the student may feel more comfortable expressing misunderstandings or asking questions. I actively praise my students for interrupting me and asking questions, although I am careful to highlight for them that this is the right environment in which to do so; it is important to me that I support classroom teachers by clarifying to students that they cannot – and should not – demand this level of individual attention and feedback in the mainstream classroom.

Photo by Element5 Digital on Unsplash

An actual Nazi on campus?

It’s been on my mind to write about this for a while, but I was waiting for the right trigger in current events. This week, news has broken that a student at Leeds University has been suspended from her work at the student radio station and investigated by the Students’ Union for, allegedly, “not acting in a duty of care,” putting the “health and safety” of members at risk, not “upholding the values” of Leeds Student Radio and the Student Union, and “bringing the reputation of the University, the [Student Union, or Leeds Student Radio] into disrepute.”

I’d already had some online contact with Connie Shaw, as she seemed to me to be a very impressive young woman who had been treated quite outrageously by her university, and I sent her a message to that effect. Connie was interrogated by the Union about her “gender critical views” (which are protected in law), and it seems pretty clear that the apparent complaints about her “conduct” arise from the fact that she has launched a podcast on which she interviewed Graham Linehan, Andrew Gold and Charlie Bentley-Astor; these are all people who have had personal experiences and/or hold views that do not align with the prevailing narrative on a typical university campus these days, so Connie has found herself in a whole heap of trouble. Unfortunately for Leeds, Connie is not somebody to be pushed around or silenced, and her predicament has now been highlighted in the national press, so many more people are aware of what has happened to her.

I wish my recollections of the situation I am about to contrast this with were clearer, but when I was at university I really was not involved with Union politics. I made sure to vote for representatives, as I have always believed that voting is important. One of the things that has driven me absolutely wild over the many years that I have spent signed up to various Unions is that the average member rarely votes. The number of conversations I have had with people who bemoan the fact that their Union committee is dominated by political zealots while admitting that they don’t bother to vote makes me want to bash my head against the wall. I will point out until the end times that the reason why so many Unions are dominated by representatives with extreme or bizarre views is that people with extreme or bizarre views get off their butts and run for office, and people who support those views get off their butts and vote for them. The problem is rarely the extreme or bizarre views themselves (which are not held by the vast majority of Union members); it is the apathy of the majority which allows them to thrive. So, yes, I always voted. My only other involvement was acting as a volunteer for the Nightline service, a telephone support line manned by students and modelled on the service run by the Samaritans. But that was it. I didn’t go to hustings and I wasn’t involved with the day-to-day drama of Union politics.

Despite my lack of involvement, even I managed to hear about the fact that we had a Nazi on campus in 1992. “Nazi” is an over-used word these days and Connie Shaw has joked about being called “a Nazi” by those who disagree with her. It is beyond doubt that, in the current climate, this ridiculous insult is regularly rolled out by people on the political left when they don’t like what somebody else is saying. But this was university in 1992: there were no websites, no chat rooms, no social media, no hashtags and no mobile phones. We used to leave a note on our doors to tell friends where to find us. These were different times in every sense: I recall hearing another student making an openly homophobic remark about one of our lecturers within earshot of dozens of students (and the lecturer himself), and I was the only one to call him out on it. Even when I did so, nobody else backed me up. And again, when I say “homophobic” I really mean it: “Better keep your backs to the wall, lads” was what he actually said as the poor man walked past. Yeah, I know. This was how the world was in those days and believe me when I say that very, very few people were willing to step in and say something. At 19 years old I was already one of them and I’m proud of that.

So, the concept of labelling anyone who failed to meet the exacting liberal standards of a certain kind of Guardian journalist “a Nazi” had very much not taken off in 1992. Quite the contrary. Yet rumours abounded that we had a genuine, bona fide Nazi on campus and he was causing trouble. I first became aware of the situation when I heard that this self-confessed Nazi had applied to speak publicly at a Union meeting and lots of people were very upset about it. From what I could gather, there was a lobby of students pushing for him to be barred: nobody wanted to hear what he had to say and why should we have to put up with his revolting opinions being platformed and aired in our own Union? I had a considerable amount of sympathy with this view and understood the strength of reaction that his application to speak had sparked. However, after much discussion, everyone accepted that under the rules of the Union – of which this student was a member – Nazi Boy had the right to speak. Lots of people were very unhappy about it, but those were the rules.

On the day after the event, I spoke to one or two people who were present at the meeting when it happened. Apparently, the guy stood up and said his piece. Nobody shouted him down, because the decision had been made that under the rules he was allowed to speak. However, by the same token, nobody was interested in listening. His speech was not good: it was not articulate, it was not rational and it was, of course, offensive. After he sat down, nobody applauded. The meeting moved on. That was the sum total of his impact: zero. Following what turned out to be quite the non-event, the student in question did not last the year on campus: he left after a few months, and was quickly forgotten.

I am agog at how quickly we have shifted from a committee of students in 1992, who reasoned that the right to free speech must prevail above all else – even if that meant sitting on their hands and grinding their teeth while the worst of all views were shared publicly – to so many students in 2024 believing that nobody has the right to say anything that might challenge a prevailing social narrative. Here’s the thing, kiddos – when you let people speak, they reveal the truth about themselves and their views. If those views are insane, offensive or irrelevant, perhaps it is all to the good that they are exposed for what they are. If I’m honest, I’m still not sure whether it truly was the right decision to allow a Nazi to speak in the Union, but I believe that the scenario is worth recalling and I applaud the Union committee of 1992, who believed that the agreed democratic process was what mattered most, despite the pressure that they were under to ban the guy from speaking.

We have moved from a situation in which the youngest of people were capable of grasping the dangers of curbing free speech in even the most challenging of circumstances, to one in which students refuse even to entertain a narrative which may jar with their own. Quite how these young people navigate their way through the world I struggle to understand. What a terrifying and dangerous place it must seem, when you cannot cope even with hearing some politely spoken words you disagree with. It seems to be a frequent occurrence in many universities now, with students either refusing to platform certain speakers or protesting their very presence when they do appear. I defend anyone’s right to protest, but it seems to me that this important right is now exploited by people who simply do not wish to allow others to speak freely. Ask any student who protested the appearance of Kathleen Stock at the Oxford Union what their purpose was and I am quite sure that they will happily tell you that they wanted to drown her out, as they believed that her views were hateful.

Perhaps some students are terrified of any alternative narrative because deep down they are actually afraid that they might be persuaded by it. What if I start to believe what the other side has to say? Yet surely it says very little for the strength of anyone’s convictions if they are genuinely terrified of a conversation. I guess if you lack all moral fibre and courage then it’s easier to scream until you can no longer hear the other speaker. In that way, you also get to drown out the niggling voice inside your head: the voice that says maybe – just maybe – you’re the bad guy.

Photo by Kristina Flour on Unsplash

Call that a PhD?

You could be forgiven for thinking that Gregg Wallace’s video was the most explosive thing to happen on social media this week, but you would be wrong.

Picture the scene: a young, female academic at Cambridge shares a happy picture of herself, smiling and clutching her freshly approved PhD thesis in English literature. Ally Louks, now Dr Ally Louks, probably thought that her message of celebration that she was “PhDone” would be liked by a few and ignored by the majority. Yet at the time of writing her post has been seen by hundreds of thousands of people and Ally has received torrents of abuse, some of which beggars belief. The whole storm has sparked outraged discussion on all sides – most of it thoroughly ignorant – about what a PhD is or should be.

Here’s the thing, for those of you who haven’t been there. A PhD is like going potholing: you wriggle down into some difficult spaces and explore the subterrain. Nobody will ever know those particular underground passages better than you, because nobody else is ever likely to go there or, indeed, even want to go there. The reason you’re awarded the PhD is because you have traversed new terrain and – in the judgement of the potholing community – you are the first to do so, or you have uncovered a sufficient number of nooks and crannies that previous potholers did not comment upon. Most of the time, you don’t find an underground palace, a glistening river of stalactites or a dazzling crystal chamber: you simply wriggle your way back up to the surface and get on with your life. Your thesis will sit on the shelf of whichever institution recognised it and – if you’re lucky – it will be consulted by a tiny handful of niche-hole specialists over the next few decades, a number you could count on one hand.

Personally, I blame Stephen Hawking. During his doctorate, he hit upon a leap of understanding so brilliant that it changed the direction of theoretical physics forever. Most of us don’t manage that. This does not mean that our PhDs are not worthy of the title: it simply means that most of us are – demonstrably – not geniuses like Hawking. There is a reason why Hawking has been laid to rest between Newton and Darwin: he is right up there with those two when it comes to the significance of his contribution to his field. Yet many people seem to assume that Hawking is an example of what is expected of a PhD candidate – a particularly famous example, perhaps, but an example nonetheless. In reality, most research is utterly banal and unimportant: it’s not going to shake up our understanding of the fabric of the universe.

Louks’ PhD sounds – to me – rather fun. Okay, I’m one of those wishy-washy artsy types who got a PhD in Classics, not theoretical physics, but I reckon her thesis “Olfactory ethics: the politics of smell in modern and contemporary prose” sounds like a more stimulating read than a huge number of PhDs that have passed under my nose over the years (pun intended). In response to the unexpected interest in her work, Louks shared her abstract, which made my nostrils twitch all the more. Her thesis explores “how literature registers the importance of olfactory discourse – the language of smell and the olfactory imagination it creates – in structuring our social world.” Her work looks at various authors and explores how smell is used in description to delineate class, social status and other social strata. I mean … fine? No? Quite why a certain type of Science Lad on the internet decided that this was a completely unacceptable thesis baffles me. Apparently, there is a certain type of aggressively practical chap who believes that exploring how things are represented in literature, and how that literature has in turn helped to shape our world, is utterly unworthy. Well, more fool them. They should read some literature. I suggest they start with Perfume by Patrick Suskind, a modern classic that is quite literally a novel about smell.

I’ll confess that the whole thing has left me feeling quite jumpy about my own thesis, which in 1999 was welcomed as an acceptable contribution to my very narrow, very obscure corner of the underground caves. Once I had seen the reaction to Louks’ abstract, I decided to re-read my own. Having done so, I concluded not only that it would sound utterly barking to the rest of the world, but that it sounded utterly barking to me! This was a field in which I was immersed at the time but about which I have read nothing since I walked out of the room in which my viva took place.

The viva itself is something that most people do not really understand and is difficult to explain. It is not an examination. Short for viva voce, which is Latin for “with the living voice”, the viva is there in principle for the PhD candidate to demonstrate that they are the author of their own work. In practice, it is also an opportunity for the examiners to quiz the candidate and explore their hypothesis further. The examiners may have questions and it is common for them to advise corrections and amendments; often, the examiners make the passing of the thesis conditional on these amendments. Best case scenario (and one enjoyed by Ally Louks), the examiners pass your thesis with nothing more than a few pencil annotations, none of which require attention for the thesis to be accepted. Worst case scenario, they say that your thesis is a load of old hooey and that you should not – under any circumstances – re-submit it, corrected or otherwise.

While the worst-case scenario indicates a profound failure on the part of the candidate’s supervisor, who never should have allowed the submission, it does happen on rare occasions. The last time I saw one of my old lecturers from my university days, he reported being fresh from a viva at which he had acted as an external examiner and had failed the thesis. This happens so rarely that I was agog. Having been so long out of the world of academia, it is impossible for me to express in simple terms the intellectual complexities that he explained were the reasons behind his decision, so I shall have to quote him directly: apologies if the language is too academic for you to follow. “Basically, it was b*****ks,” he said. “I mean, don’t get me wrong, it was kind of brilliant b*****ks: but it was b*****ks nevertheless.” That poor candidate. I ached for him. I also found myself recalling the gut-wrenching moment during which Naomi Wolf’s PhD thesis was exposed as fundamentally flawed by Matthew Sweet, live on Radio 3. If you’ve never listened to the relevant part of the interview, I highly recommend it: it is – especially for those of us who have submitted a thesis for judgement in the past – the most toe-curling listen imaginable. Wolf’s entire thesis appears to have been based on a misunderstanding of a legal term, which Sweet discovered simply by looking it up on the Old Bailey’s website. Wolf’s thesis had been passed at Oxford, an institution that would be hard to beat in terms of intellectual clout and reputation, so quite how this happened is mind-boggling and shameful.

The reaction to Louks’ thesis does, I suspect, have a great deal to do with the increasing suspicion with which academia is viewed, and in many ways I am not unsympathetic to people’s disquiet. There is, without question, a good deal of nonsense (or b*****ks, to use the technical term) talked in a lot of fields, particularly in the arts and social sciences. Yet the vitriol with which Louks was criticised has nothing to do with this. Her abstract, to anyone with even a grudging respect for the field of English literature, makes intellectual sense. No, the roasting of Louks and her work betrays a profound and depressing ignorance as well as a nasty dose of good old-fashioned cruelty. Before they pounce and declare an entire field of study unworthy of merit, people should maybe ask themselves whether there is even the tiniest possibility that they don’t know enough about it. One can but hope that these people who value their rationality so much will next time run a more scientific test, rather than dunking the witch to see whether she floats.

Photo by Alex Block on Unsplash

Vocabulary acquisition

An essential challenge faced by students and teachers alike is the acquisition of vocabulary. I have written before on the best methods that students can employ when tackling vocabulary learning, so I do not plan to reiterate those here. What follows are rather some observations and musings about what we’re getting wrong in the Latin classroom when it comes to vocabulary acquisition, especially when compared to our counterparts in modern languages.

In my experience to date, supporting students in the accretion of vocabulary is a responsibility undertaken more effectively and proactively by modern language teachers than by those of us who specialise in Latin. It is possible that Latinists are under more time pressure in the curriculum and thus have no choice but to place the responsibility for vocabulary learning onto our students, but I think it more likely that we are simply less well trained in how to go about it than our colleagues in MFL. Classicists suffer from the fact that our training is somewhat broad – a qualified Classics teacher will necessarily have spread their training time across Ancient History and Classical Civilisation subjects, dramatically reducing the time that they spend focused purely on the teaching of the Latin language. I have little to no recollection of being given any significant guidance on how to help my students to develop their knowledge of vocabulary, so all my knowledge in this area has come later – through experience and through reading.

One of the many differences between the manner in which ancient languages are taught compared to modern ones is in the presentation of vocabulary to students. While modern linguists favour grouping words into themes or topics (e.g. “going to the shops” or “hobbies”), Latin teachers tend to present vocabulary in the following ways:

  1. By chapters in a text book (e.g. Cambridge Latin Course, Suburani, De Romanis or Taylor & Cullen). Sometimes these may have a loose theme, but it’s generally pretty tenuous.
  2. As one long alphabetical list (e.g. OCR GCSE or Eduqas GCSE).
  3. By parts of speech. Some teachers invite students to learn the GCSE list in types of words, e.g. 1st declension nouns, 2nd declension nouns etc.

Each of these approaches has its drawbacks, so let’s consider them one by one. First of all, let us consider the approach of learning vocabulary by text book chapter. If one were to use Taylor & Cullen for this purpose, one would at least be learning the set vocabulary for OCR and thus there is some long-term justification for the approach. The vocabulary also reflects what is being introduced in each chapter and therefore there is some pedagogical justification for students learning it as they go. All of that said, you wouldn’t believe how few schools are actually doing this and to date I’m not sure I have met a single student who is working systematically through the chapters of Taylor & Cullen and learning the vocabulary as they go: some students are being tested on the chapters retrospectively, but I have not worked with any who are using the text book as it was designed. This is most likely because Taylor & Cullen is an ab initio course and thus the early chapters are not suitable for use with Year 10s who have studied Latin in Years 7-9. Why don’t schools use it during those years? Well, I’m assuming that its somewhat sombre presentation and lack of colour pictures puts teachers off the idea of using it as a basis for KS3, when (to be frank) they are under pressure to recruit bums onto seats for KS4 or else find themselves out of a job. This means that no text book explicitly aimed at preparing students for a specific GCSE exam board is in wide use in schools.

None of the text books commonly used in schools at KS3 builds vocabulary that is explicitly and exclusively aimed at a particular GCSE course. While Suburani is supposedly linked to the Eduqas course, it departs from the vocabulary relevant to that course in favour of what suits its own narrative. For example, students of Suburani will be deeply familiar with the word popina as meaning “bar” (not on the GCSE list for either OCR or Eduqas but used widely throughout the first few chapters), yet they are not introduced to the word taberna meaning “tavern” or “shop” (on the GCSE list for both boards) until chapter 12. Similar problems occur in terms of the thematic focus of Suburani: because it focuses on the life of the poor in Rome, students are taught that insula means “block of flats”. While it does mean this, I have never seen it used in this way on a GCSE paper – the word is used exclusively by both boards in a context in which the only sensible translation is “island”. I shall say more about the problem of words with multiple meanings later on.

Presenting words in an alphabetical list seems to be the practice used by most schools when students reach Years 10 and 11 and are embarking on their GCSE studies. Most students I have worked with are told to learn a certain number of words from the alphabetical list and are thus tested on multiple words that have nothing in common, either in terms of their meaning or their grammatical form. One advantage of this is that students are forced to look at words with similar appearance but different meaning. However, multiple and in my opinion worse problems arise from this method. Students learning the vocabulary in alphabetical order give little thought to what type of word they are looking at (e.g. whether it is a noun or a verb) or to its morphology. This means that students do not learn the principal parts of their verbs, nor do they learn the stem changes of nouns and adjectives. This can cause considerable frustration and demotivation when students struggle to recognise the words that they have supposedly learnt when those words appear in different forms. Teachers could mitigate this by testing students on those forms, but most seem reluctant to do so. Do they think it’s too hard?

The method I used was to present the GCSE list in parts of speech and invite students to learn different types of words in groups: all the 1st declension nouns, all the 2nd declension nouns etc. The advantage of this method is that it allows for the opportunity to link the vocabulary to the grammar. For example, the first vocabulary learning task I used to set my Year 10s in September was to learn/revise all the 1st declension nouns (in theory they knew most of them already from KS3) and to revise the endings of the 1st declension. In the test, they were expected to give the meaning of the nouns I selected for testing and also to write out the declension’s endings. I felt (and still feel, on the whole) that this was the best approach, but that does not mean it is without its own disadvantages. Firstly, it made some learning tasks excessively onerous and others too easy: for example, the task of learning the 1st declension nouns was very easy (because most of the words were already familiar and the forms of the nouns are very simple) but the task of learning 3rd conjugation verbs was much harder (fewer of them were previously known and their principal parts are a nightmare). This meant that students were often hit with homework that turned out to be extremely difficult at what might not have been the ideal time for them. A second disadvantage was that it was impossible to give students a translation test, because one could not create sentences out of a set of words which all belong to one category. Thirdly, and related to that point, testing according to parts of speech made it very difficult to link vocabulary learning to classroom teaching in any meaningful way: in class, we might be studying the uses of the subjunctive, and that could not necessarily be linked to the homework task that was next on the list. This is something that I have been thinking about more and more in recent years as a massive problem in Latin teaching – a disconnect between what students are learning in the classroom and the vocabulary they are invited to learn for homework. The more I think about it, the more I believe this is a fundamental problem which requires a complete curriculum re-think.

The difficulty of linking vocabulary learning to explicit classroom teaching is something that modern language teachers would probably find very puzzling. Modern linguists are way ahead when it comes to tying vocabulary learning to what’s happening in their classroom and to the relevant grammar. Given this, imagine my excitement when one of my tutees shared with me that she has been presented with the OCR vocabulary list in themes! I was full of anticipation as to how her school was planning to test their students on those themes. For example, one theme might be “fighting and military language”, within which students learn nouns such as “battle” and “war” alongside verbs such as “fight” and “attack”. Call me daft, but I hoped and expected that she would be tested using some simple sentences, which would afford teachers the opportunity to observe students’ (hopefully) increasing understanding of grammar and morphology alongside the acquisition of the relevant vocabulary. Surely no teacher would have gone to the trouble of dividing up 450 words into a set of themes unless they were going to make use of some innovative testing methodologies? No? Well … actually, no. The school are testing the students on a list of words, with no link made between the meanings of those words and the learning that is going on in the classroom. I have absolutely no idea what the point of this is. Maybe somebody in the department has read somewhere that “themes” are a good way to classify vocabulary – and I am sure they are – but I’d place a hefty bet that there is no tangible pedagogical gain unless that learning is linked to the use of those words in sentence structures, the kind of approach favoured by Gianfranco Conti.

I said that I would come back to the issue of words with multiple meanings, and that is something I have noted with interest from my tutee’s themed list. Words with multiple meanings appear more than once across the different lists, with their meanings edited to suit the theme of each list. This is an interesting idea and I am still pondering whether or not I think it is a good one. Multiple meanings are a real menace, particularly when the most obvious meaning (i.e. the one which is a derivative) is the least essential. For example, on the GCSE list for both boards is the word imperium, which can mean “empire”, and all students immediately plump for that meaning as it is an obvious derivative. However, the word is more commonly used on language papers to mean “command” or “power” – it is therefore those meanings that must be prioritised when a student is learning the word. Similarly, all students need to be drilled on the fact that while imperator does come to mean “emperor” in time, it originally meant “general” and is usually used in that way on exam papers. Even worse is a nightmare word such as peto, which is listed by both boards as meaning anything from “make for” and “head for” to “seek” and “attack”. Students really struggle with learning all of its multiple possible meanings and it is important to show them multiple sentences with the verb being used in lots of different contexts so that they can grasp all of the possibilities.

As so often, I reach the end of my musings having criticised much and resolved little. I am thankful to be working in a one-to-one setting, in which I can support students with vocabulary learning in a proactive and detailed way, one which goes way beyond what is possible in the mainstream classroom and supports their learning in a way that simply cannot be expected of a classroom teacher. I shall continue to ponder what I would do were I in a position to re-shape the curriculum all over again, but I fear that this would entail writing an entire text book from scratch. Many have tried to do this, and even the books that have made it to publication remain flawed: I have no conviction that I could do any better.

Photo by Olena Bohovyk on Unsplash

How did we do?

The modern world is increasingly baffling. If I feel like this at the age of 50, what’s it going to be like in another 30 years’ time, assuming I am granted the good fortune to make it into my twilight years?

Yesterday, I visited eBay, browsing for an item I was (unbelievably) struggling to find on Amazon. I did a couple of searches for the item on their site, before getting bored and closing it down. Within hours, I received an email from the computerised bots managed by the team at eBay headquarters. They were keen, they said, to learn about my experience.

When did the most banal of activities become an experience? It is a remarkably recent phenomenon, but one we have apparently come to accept. I am asked to rate my experience of everything from browsing a website to attending an appointment at the hairdressers, at the optician, at the gym, at the dentist. The dentist! Do they really want my honest opinion regarding lying flat on my back while a man fires a high-frequency jet of freezing water against my gums, then offers me patronising, unsolicited advice about how to brush my own teeth, followed by a bill for over £100?

On some level, I get it. Of course. I am self-employed and there are times when I have to ask clients to throw me a bone by leaving me a review somewhere that can be verified. This all helps me to get seen in a noisy, overcrowded online world, and I would struggle to source clients without a little of that kind of support. Some people are kind enough to offer, without being asked. But I hope to God that I have never asked a client to “rate their experience”, nor pestered them with multiple messages when they promised to leave a review and then never got around to it. Most people don’t realise how much small businesses rely on this kind of thing, and I understand completely that it is not top of anyone’s list of priorities to be rating me on Google. I am enormously grateful for the people who do so and think no less of the majority who don’t.

When I was first setting up my business, watching the final few payments of the regular salary I had taken for granted for over 20 years arrive, I read and listened to a good deal of business advice. Much of it involved the use of social media, which I dabbled in before realising what a right royal waste of time it was, plus email marketing, which I am glad to say I resisted with pride from the very beginning. Nobody – least of all me – wants their inbox jammed full of self-congratulatory “news” from someone they employ, nor do they want constant exhortations to “let us know how we did”. The ruthlessness with which I police the contents of my own inbox and indeed my phone has become somewhat legendary; I recently had to rather shamefacedly get myself back onto the list of an organisation I belong to, which had taken on board the fact that I had unsubscribed from all email communications and taken myself out of their WhatsApp group. I don’t mean to be unfriendly; I simply don’t want the bombardment of standardised self-promotion which inevitably follows. I habitually set up filters for anything unsolicited that arrives to be deleted automatically – it doesn’t even go to my spam folder, which I have to monitor as sometimes a new approach from a fresh contact will end up in there. Nope, with the filters as I have them, it goes straight in the bin.

The irony is, because I am self-employed and I understand how important ratings and reviews can be for small businesses, I am really diligent about doing them. If a local professional of any stripe does a good job for me, I make sure to ask them where they would like me to leave a review and I make sure that I do it: Google, Check-a-Trade, whatever they wish. If they’re really good, I will also send their details directly to any local friends who might benefit from their services. It’s one of the really nice things about living in a community that one can share this kind of thing and help local businesses to benefit from word of mouth. Everybody wins that way: the business is rewarded for doing a good job, the people in the community benefit from reliable and trustworthy service-providers. What’s not to like? But in the hands of big businesses (and – it has to be said – some overly enthusiastic small ones), this simple, organic and entirely benevolent system has somehow morphed into a leviathan, a behemoth with which to hassle their customers with remarkable tenacity. Well, I’m not having it. I shall continue to police my inbox with more rigour than has ever been managed by US border control.

Did you enjoy this blog post? Please do let us know! We love to hear from you! Rate your experience here.

Photo by Thomas Park on Unsplash

Low-level disruption

One of the many joys of tutoring compared to classroom teaching is the minimal amount of disruption. Barring technical difficulties, which do happen on occasion, my sessions with students these days are mostly uninterrupted bliss. Lest you think that my working life is now perfect all the time, I shall start with the few occasions on which I have found my one-to-one sessions rudely interrupted, before I move on to more painful recollections from the classroom.

Technical issues in tutoring usually stem from ropey broadband and much of the time can be alleviated by sharing the screen and/or turning cameras off, so the internet has less to cope with. Some clients seem to think that WiFi is not required; my clients this year are pleasingly home-based, but I have had clients in the past who seemed to believe that online learning can be conducted on the go. I’ve had students in the back of the car on their way somewhere (I think my favourite was one session that was interrupted five minutes in by the father, who announced to the child that they had to get in the car – she had no idea where they were going – and continue the session on the hoof). I have met with one student who was all dressed for riding and actually at the stables, attempting to concentrate on boring old Latin right before she got on her horse. I did point out to her parents that this was quite a big ask for an 11-year-old girl who is quite understandably obsessed with ponies, and they took it on board.

Even when students are at home there can be the odd glitch, and sessions with one client have recently assaulted my ears with such an appalling electronic scroobling noise that I could barely hear the child over the din. It sounded like a cross between a fax machine (remember those?) and the old dial-up connection from the early 2000s (remember that?). The problem seems to be fixed now, thank heavens, but it was excruciating while it lasted. Some families need to have it explained to them that conversations in the background can be heard by me through the microphone – this can be quite remarkably distracting. Less distracting but often more painful are the sounds of cooking, cleaning or loading the dishwasher. Many families plug their children into headphones and seem to think that the problem is thereby solved, forgetting that if they are using an open microphone, I can still hear everything that is happening in the vicinity.

None of this, however, comes even close to the agony of what are laughably called “low-level disruptions” in the classroom. This week I read a discussion on EduTwitter that took me back to those days with such accuracy that I felt positively triggered. It is impossible to explain to those who have not worked in the mainstream classroom how utterly dispiriting the slow drip-drip effect of low-level disruption can be when you experience it multiple times a day, every day of the week. You see, in life it’s the little things that grind you down. If a child’s behaviour is massively challenging, that isn’t fun or easy by any means, but it’s A Big Deal that will lead to inevitable consequences. The situation will undoubtedly disrupt your lesson and those consequences may well cause you a whole pile of work, but consequences there will be. Low-level disruption, on the other hand, is tolerated in all but the most well-run (and – for reasons which baffle me – most controversial) schools. Every single example of disruption that I am going to give you will sound unbelievably petty and trivial on its own – but what you have to imagine is those actions performed by dozens of students multiple times per day, each causing a glitch in learning. You also have to understand that in schools where the culture is that these things are considered acceptable (which are the majority), you get really hard pushback from the students when and if you challenge it. As a result, much of the time, you have no choice but to accept it. And believe you me, learning suffers as a result.

In the discussion, most of the teachers focused on behaviours which cause a small but excruciating noise in the classroom. Several mentioned the clicking of pens. Several also mentioned the crunching of plastic water bottles; indeed, water bottles in general are an indescribably irritating source of disruption, with children crunching them, shaking them, complaining that they’ve spilt them and asking to refill them. How those of us that attended school in the decades before it was decided that all small humans must have minute-to-minute access to liquid in order not to immediately dehydrate ever survived is anybody’s guess. Plastic water bottles are awful but so are those trendy reusable ones, which result in an unholy din when they come crashing to the floor (as they inevitably do). Lest we forget, as a result of all this 24-hour hydration, the number of requests by children to go to the toilet is quite literally insane.

Beyond the realms of noise, we have the next level of physical disruption, which happens most among younger students who seem used to milling about the classroom as if it’s a set of stalls for browsing. I have no idea what goes on in some primary schools, but the most inordinate number of Year 7s seem happily convinced that roaming about the classroom is perfectly acceptable, and some of them doggedly continue with this belief into their later years. A student will suddenly decide that it’s essential for them to put something in the bin, which will of course require sauntering past their mates. Likewise, many students simply cannot resist the urge to turn around, then will argue either that they were not turning round or that they were turning around because somebody asked them an important question or had a simply desperate need to borrow an essential piece of equipment, one which they were supposed to have in the first place. Equipment hassles cause no end of tedium and if I had £1 for every student who has at some point sliced up, flicked across the room or eaten the shards of their rubber, I would be a wealthy woman.

Other behaviours mentioned included tapping, fake coughing/sneezing and general wriggling, in addition to students putting their head on the desk in a last-ditch attempt at silent protest. At least it’s silent, I suppose, but it’s nevertheless still distracting for those around them and does not indicate a great deal of engagement from the student in question.

Of course, those of us capable of teaching like John Keating in Dead Poets Society – with all of the students in raptures, simply hanging on our every word, prepared to stand on their desks and applaud our remarkable ability to inspire them – suffered none of these hassles. It is a demonstrable fact that every child who spent more than a few minutes in my presence was simply gripped by imagination and motivated to do their best from the very moment they opened their books. Every single one of them lived and breathed their desire to grasp the fundamentals of the indirect statement and to rote-learn the endings of the 4th declension. No exceptions for me. I merely write this blog to show my empathy with those who may – at times – have not held the room quite so successfully and so rousingly as I did.

Perhaps the funniest moment ever photographed by the press in a school. A child did a faceplant in frustration (at her own performance!) while being tutored by the then Prime Minister. The various images captured were quickly dubbed “child speaks for nation”.

Vox populi

The Roman intelligentsia never really understood the rise of the demagogues. Those who saw it coming were those who viewed Rome as being in decline, at the mercy of mob rule; they never understood the needs and legitimate frustrations of the ordinary people, who counted for little as far as they were concerned. Those caught by surprise had only a hazy grasp of the basis of their own power.

The authority of the Senate was based on custom and consent rather than upon the rule of law: it gave advice, but could not enforce its rulings. The Senate thus had no legal control over the people or their magistrates. This uneasy rule by consent lasted for a while, but complacency and arrogance ultimately led to the Senate’s authority being dismantled in all but name.

The Optimates were the dominant group in the Senate, those with families dating back to the mythical good old days and the easy confidence that aristocracy brings. They consistently blocked the wishes of others, who were thus forced to seek support for their measures via the tribunes, who led the tribal assembly. These men were called the Populares, or “demagogues”, by their opponents. The Optimates tried to uphold the oligarchy and thus maintain their aristocratic stranglehold on power; the demagogues sought popular support against the dominant aristocracy, sometimes in the interests of the people but also to further their own personal ambitions. To be clear, both groups of men were eye-wateringly wealthy: this was in no way a rise of the working man. The demagogues achieved some success via purely political means, but ultimately the generals who commanded military forces in the provinces (also fabulously wealthy) began to realise that there was an opportunity here. The Roman elite found out the hard way that those who win the hearts and minds of the military are the ones who hold power. And it was still all about money – you didn’t make it in this battle for power without it.

It is difficult, not to say puzzling, to watch the existential crisis being experienced by Democrats and their supporters across the Atlantic. Rarely has a nation been so divided and so unable to listen, although I am experiencing uncomfortable flashbacks to our own country during the fallout after the Brexit vote. I cycle through news channels on an endless loop and find person after person talking, talking, talking. Talking to each other, talking at each other, talking over each other. Nobody’s listening. Although a passionate supporter of people’s right to protest, I am beginning to lose my faith in even this simple kind of expression, for in today’s world it seems always to descend into people screaming at each other, nose to nose across the barricades. But you can scream as hard as you like. It doesn’t matter if nobody’s listening.

I do not claim to know why the American people voted as they did. I have my own pet theories, but these are shaped and coloured by my own peculiar interests and passions, and therefore too biased to be of relevance. I can only say what I see, and what I see is a population that feels betrayed by politics. I see people who are sick of being told what to think or – even worse – being told that they don’t think, that they are incapable of it. That their small-town lives don’t matter, that their values are old-fashioned and need to be consigned to history, that they need to get with the programme, wise up, wake up, listen up, sit up and shut up.

The trouble with democracy is that you ask people what they want and they tell you. It may not be the result you were hoping for. Winston Churchill, having seen his country through the Second World War, no doubt thought that the next election was in the bag. In fact, he and his party were voted out of office. To quote the man himself: “No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government, except for all those other forms that have been tried from time to time.”

Cicero denounces Catiline, fresco by Cesare Maccari, Palazzo Madama in Rome

Roman spooktacles

As the leaves turn and the nights draw in, many cultures prepare to celebrate festivals that explore the boundaries between the living and the dead. Halloween is a festival steeped in history and rich in symbolism, and so this year I am trying to be less grumpy about it. I’ve never liked Halloween and tend to lock myself away indoors, but I have decided to embrace the spirits and attempt to understand why so many people feel drawn to this festival.

To appreciate Halloween’s origins, we must travel back to the Celtic world and also to ancient Rome, where festivals of the dead held significant cultural and spiritual importance. Modern Halloween is heavily influenced by Celtic traditions, particularly the festival of Samhain. The Celts celebrated Samhain at the end of October – marking the transition from harvest to winter – and believed that the boundary between the living and the dead was at its thinnest during this time. Roman practices undoubtedly shaped the evolution of this festival into what we now call Halloween. As the Romans expanded their empire, they encountered various cultures and incorporated aspects of their beliefs and practices. This cultural blending likely contributed to the transformation of Halloween from a purely Celtic observance into a more widespread celebration that includes various elements from different traditions. Many see Halloween as an American export, and in multiple ways it is; but we can thank/blame the Romans for most things, so I don’t see why Halloween should be any different.

One of the most important Roman festivals dedicated to the dead was Feralia, which was itself part of a larger nine-day observance called Parentalia, dedicated entirely to ancestors. During Feralia, families would visit the graves of their loved ones, bringing offerings such as food, wine, and flowers. It was a time to reflect on the lives of those who had passed, and rituals often included prayers and sacrifices. Many modern Halloween traditions across Europe involve remembering loved ones who have died; altars, decorations, and memorials are common in many parts of the world during Halloween, reflecting the human desire to honour those who came before us. A ritualistic response to death is one of the things that defines us as a species, and tentative evidence of burial or funerary caching goes back to the Stone Age; it seems clear that our earliest ancestors began interring their dead, sometimes with personal effects. Some anthropologists argue that such relics are evidence of a belief in some kind of afterlife, in which it was assumed that the deceased individual would require the tools of his trade; others are more cautious, and argue that grave goods are simply evidence of individualisation and respect – religious or not, we like to bury a person’s things with them, as symbolic markers of who they were and the impact that they had on the world.

Another notable Roman festival of the dead was Lemuria. This festival is perhaps closer to Halloween, for it focused on the appeasing of restless spirits, particularly those of deceased family members who had not received proper burial rites. The father of the household would perform a series of rituals, including throwing black beans over his shoulder and chanting incantations to exorcise the spirits. The beans symbolised the offerings made to the dead, while the rituals aimed to ensure peace for both the living and the dead. Echoes of our festival of Halloween are obvious in the theme of dealing with spirits. Many Halloween customs, such as carving pumpkins to ward off evil spirits and dressing in frightening costumes, are deeply rooted in the idea of confronting and appeasing supernatural entities. The shared emphasis on rituals and offerings reflects a universal human desire to connect and to address fears of the unknown.

The Roman festivals of the dead offer us an insight into how ancient cultures grappled with the concept of mortality. In an ever-changing world, the rituals surrounding death and remembrance remain vital to many people. Whether through offering food to the spirits, lighting candles, or sharing stories about loved ones, we find ways of engaging with those we have lost. Halloween serves as a reminder of our connection to those who came before us, a celebration of life, death, and the enduring bond between the living and the dead.

The modern festival of Halloween is characterised rather by a mix of fun and irreverence, and most of my students absolutely love it. Trick-or-treating, ghoulish fancy dress and haunted houses dominate the festivities, and many of these traditions do hail from our friends across the Atlantic. Many people argue that the act of dressing up in costumes can be seen as a way to confront the idea of death and the unknown, much like the Romans did during Lemuria; all of that said, I’m not sure how many of my students see it as anything other than a jolly good excuse to eat vast quantities of sweets and – bonus prize – scare the absolute willies out of the grown-ups.

Photo by freestocks on Unsplash