Poking and fussing

Do you ever wonder whether we’ve somewhat lost our way when it comes to the purpose of education?

When I decided to become a teacher back in 1999, it was made clear to me that my role would be complex. Given the trend at the time for group work and making lessons fun, the role of the teacher had become somewhat synonymous with the purported aims of the BBC: to educate, inform and entertain, not necessarily in that order. Beyond that, it was also made clear to me that I would have numerous responsibilities that blurred the line between education and social work, and none of them were unreasonable. Teachers – particularly primary school teachers – spend a huge amount of time with a large number of individual children every day; as a result, they are without question some of the best-placed adults to notice when there is cause for concern, when a child’s demeanour changes or their health declines. I took my duty of care very seriously and regularly reported safeguarding concerns; the ability to raise such concerns anonymously, with more experienced experts who took me seriously and followed up on them, is something I miss greatly about being in a school.

The overwhelming majority of teachers take their safeguarding responsibilities extremely seriously. Nobody goes into teaching with the belief that they will be nothing but an academic, pouring knowledge into the minds of the young with no thought given to their health, their personality, their family situation or what might be going on inside their head. Teaching is a constant dialogue between adults and the young, and our empathy with and understanding of a wide variety of issues that may be holding a child back in their learning is crucial. But let us remind ourselves that what we are there to do is to impart learning. We are not there to solve all of society’s problems, from knife crime to nutrition.

In the last decade or so, and most particularly during and after the pandemic, schools have been expected to take up the slack for every single failing in society: for the failings of government, for the failings of under-funded health services, for the failings of over-stretched social services and sometimes – let’s not be afraid to say it – for the failings of parents. Parenthood is hard – incredibly hard – and not everybody is acing it; but teachers are not parents to the children in their care and they cannot – nor should they be asked to – replace that role.

I hesitate to make political predictions, as I am notoriously bad at them, and if the last few years have taught us anything it should be to prepare for surprise. That said, it seems likely that we will have a change of government at the next General Election, and it seems likely that the new ruling party will be Labour. This means that what the Labour party said about education at its recent conference becomes potentially more important and relevant than the Conservatives’ blustering about mobile phones (already banned in most decent schools) and maths up to the age of 18 (where they will find the teachers remains to be seen). But the Labour party’s pledge to bring in “supervised tooth brushing” for primary school children aged 3 to 5 caught my attention and got me wondering what they think teachers are for. It also got me wondering whether any of them have ever set foot in a primary school, never mind stayed there for any length of time.

As one primary school teacher on the platform formerly known as Twitter pointed out, teachers already know what it is like to be asked to supervise hygiene routines on a massive scale, from the big push on hand-washing during the pandemic. “I remember getting the children to wash their hands at the sink during covid. It took an hour and they missed learning … My TA had to supervise them instead of support children. And that was a class of Y6 children. I can’t imagine how long it would take to shepherd 4 & 5 year olds through the process. This policy has not been suggested by anyone with experience of primary.” Her comments were in answer to someone who claimed that supervised tooth-brushing “would only take a few minutes”. Several primary school teachers responded, with comments like “30 very young children. Probably only one sink. Cleaning the cup after each child. Making sure each child has their toothbrush. At least 50% won’t like the toothpaste … I could go on and on.” My personal favourite was the one who pointed out the problems that would arise from all the spitting. Covid hygiene? Whatever. All in all, the discussion was (or should have been) an eye-opener for anyone who does not work with large groups of children on a daily basis, especially the little ones. You may (I hope) have supervised your own child’s tooth-brushing at home. This is not the same as trying to do it with a class of 30.

The British Dental Association has stated that it is “encouraged” by Labour’s proposal, but I feel more than a little despair. As one teacher put it: “it’s a sticking plaster for a gaping wound. Babies have teeth. We need NHS dentists, breastfeeding support groups at doctors’ surgeries, 0-4 family centres. Teachers have an educational role but they’re outsourcing it to us because they don’t want to fund the real support needed.” Absolutely. And it has to stop. Given the amount of time that every primary school teacher knows this tooth-brushing regime would realistically take, what would people like those teachers to do less of to make it happen? Less supervised play? Fewer handwriting skills? Ditch basic numeracy? You choose.

For me, the suggestion sums up the tangible lack of respect that politicians have for the teaching profession. Teachers are treated as punching bags by all the major parties, belittled and taken for granted across the board. The profession is haemorrhaging staff at an alarming rate and to date not one political party has taken any kind of frank look at why. Any pledge to “recruit more teachers” falls far short of what’s required when we know that one third of teachers currently quit the profession within five years. It costs a lot of money to train a teacher, so a proper focus on how we retain them – not just recruit them – would save the country a fortune.

Readers around my age may recognise the title of this post as a quotation from Pam Ayres’ I Wish I’d Looked After Me Teeth, a poem which pretty much every child my age was told to learn off by heart at some point during their time in primary school. “Poking and fussing” (or – more accurately – “pokin’ and fussin'”) is how tooth-brushing seemed to Ayres as a young child. For me, it’s a rather good description of the approach taken by politicians towards education.

Photo by Henrik Lagercrantz on Unsplash

Poplars tremble gradually to gold

There is an apocryphal saying that has been shared thousands of times on the internet. It is usually labelled “a Greek proverb” but sadly I cannot find any reliable reference to it that predates the 20th century. Nevertheless, it is a favourite saying of mine and whoever first expressed the sentiment was certainly insightful, even if he didn’t share his thoughts in the agora of 4th century Athens.

The saying is as follows:

“A society grows great when old men plant trees in whose shade they know they will never sit.”

Source unknown

There is so much to like about this statement. First of all, I like the fact that it talks about the responsibilities of the oldest in society. It seems to me that we all spend quite a lot of time wagging fingers at the young, telling them that it’s their responsibility to sort out the problems of the future – we may have caused all the problems, mind you, but we won’t be around to face the consequences and they will. The quoted statement calls this attitude into question and suggests that we all bear a responsibility towards the future that will exist after we are gone. I’m not surprised that people assumed such sentiments came from ancient Athens, which was a patriarchal society in which aristocratic men enjoyed the benefits and bore the responsibilities of government; elderly men were afforded power and respect, and in return they were expected to leave behind a legacy for the good of the generations to come.

In many ways, however, this statement is about the importance of trees. While it uses the tree as a metaphor for the future, to express the importance of the long-term legacy that every human is capable of leaving behind, it also speaks to the visceral understanding that planting a tree is one of the best things that anybody can do in this world. Our love for trees and our trust in their enduring importance has recently been brought into sharp relief by the heinous felling of the beautiful tree at Sycamore Gap, a famous landmark named after the tree that by chance grew in a sharp dip in the hillside next to Hadrian’s Wall in Northumberland. The real horror of this inexplicable act of nihilism has left me and countless others quite bereft; even those of us down in the south know the sense of history and local pride that this awe-inspiring natural feature commanded. I simply cannot believe that somebody could bring themselves to do such a thing.

The Romans valued their trees, not just for ornamentation but also for their practical uses. Trees were planted along roads, around public buildings, and inside the garden rooms of the villas of the wealthy, creating an outside-in effect that still inspires architecture and city planning to this day. Preserved cities like Pompeii and Herculaneum show how the Romans made trees a part of their urban landscape; excavations reveal that these ancient cities were home to a wide variety of trees, strategically planted for shade and selected for both their aesthetics and their utility. The Romans clearly understood how trees could improve urban environments, a concept to which we are now returning, with more and more research suggesting that trees can improve air quality as well as reduce temperatures in modern cities.

I am privileged to live in “leafy Surrey” and it is perhaps poignant that I become most aware of the trees around me in autumn, as we watch the leaves die and start to fall. During October and November, walking along a pavement where I live becomes a joyous experience of swishing through the fallen leaves and crunching upon acorns and horse chestnuts. The title of this blog post is taken from a poem by Gillian Clarke entitled simply October. It explores imagery of death and dying, but highlights the beauty of the colours as leaves start to die and decay. There simply is not a more beautiful and poignant time of year; while it is always tinged with sadness, foreshadowing the depths of winter to come, I value its glory and beauty immensely.

Photo of the now-felled sycamore tree at Sycamore Gap by Toa Heftiba on Unsplash

The problem with homework

Self-directed study remains one of the greatest barriers to success for most young people, and for many of them their first introduction to the process is homework. The very concept of homework in state schools is quite modern and seems to have been an expectation pushed in grammar schools rather than in secondary moderns. According to a survey of male pupils carried out by the Central Advisory Council for Education in 1947, 98% of boys in grammar schools received homework regularly, compared to just 29% of boys in a secondary modern setting. (The fact that this research was carried out into boys only tells you even more about attitudes towards education at the time – liberals were starting to care about what happened to boys from lower-income backgrounds, but girls didn’t matter full stop.) A fascinating booklet published in 1937 by the Board of Education reveals that the government was looking into the issue of homework; it shows firstly that homework was on the increase in state schools and secondly that this was not popular. There is notable evidence that homework was used as a punishment rather than as something designed to support learning.

Homework remains a controversial issue in modern school settings and some teachers eschew it altogether; some educationalists believe that homework advantages those students with greater support at home and puts unnecessary pressure on those without the resources to facilitate it. Certainly, if a child is sharing a room with younger siblings or is in a multiple-occupancy household, and/or if that child has caring responsibilities, committing to any kind of study outside of their compulsory schooling can be a huge challenge. Interestingly, however, I have become increasingly aware of just how unsupported even the most affluent children can be when it comes to self-directed study. They may have all the facilities in the world, but no idea of how to go about their work. Unless the adults at home have a wealth of time and patience to offer, homework and self-directed study can become a real pinch point for families and can severely impair a child’s educational performance.

If I could convince parents of one thing that would make a difference to their child’s educational outcomes it would be this: most people drastically overestimate their child’s maturity when it comes to self-directed study. This includes their child’s ability to self-regulate, their capacity to self-motivate and their fundamental understanding of how to learn. None of this should be surprising given that most adults still have a very poor knowledge and understanding of how humans learn. Many people are still influenced by long-since debunked research on “learning styles” or similar dangerous mutations and edu-myths that simply will not die; they remain wedded to the idea that the way their child learns is somehow unique, and that the child must discover the best ways of doing so for themselves. The reality is that we know more than we ever did about how humans learn, and there is a wealth of advice out there about how to do so; most people simply don’t take it on board. With this in mind, what follows is a summary of the advice that I give on a regular basis to all parents who wish to support their child with learning.

1. Testing:
Even if your child thinks that they don’t know something, the first thing you should do is test them. I know that might seem strange, but the process of testing forces the brain to concentrate. Just staring at a word and its meaning won’t work; to succeed at memorisation, your child needs to engage with the process, and the easiest way to make them do so is to start testing them. This is because memory is the residue of thought (Daniel T. Willingham): in other words, to remember something you have to think about it actively on multiple occasions.

2. Small amounts, little and often:
This is absolutely crucial. If your child’s Latin teacher has set 30 words for them to learn over one week, they will need to tackle the task repeatedly. While for most homeworks they may be able to sit down and tick them off as done after an hour’s blitz, vocabulary learning should be done in short bursts: take 5-10 minutes once or twice a day and spend that time testing. Start with 10 words. Then later that day or on the next day, return to those 10, adding another 5 words on top. Then repeat those 15 words, adding another 5 and so on. By the end of the week they should be confident. Why so much repetition? There is a reason, and here it is …

3. Spaced learning:
When you rote-learn something quickly, you forget it pretty quickly too. But do not despair! The process of well-spaced repetition strengthens the links your brain has made with what it is learning and lengthens the retention. If a child does their vocabulary homework in one sitting, one week later they will have completely wasted their time. Instead, they should do it in short, spaced-out bursts, with “forgetting time” in between; this way they will spend around the same amount of time in total but their recall will be close to perfect. As a child gains confidence, you should extend the length of the spaces and ultimately you should revisit material that has not been covered for quite some time – days, weeks, months later.
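
For anyone who likes to see the pattern laid out, here is a minimal sketch in Python of the “little and often” build-up from point 2 combined with the spacing described here. The word list is a stand-in, and the numbers (start with 10, add 5 per session) are simply the ones suggested above; each session is assumed to be separated by several hours of “forgetting time”.

```python
def sessions(words, first_batch=10, step=5):
    """Yield the cumulative batch of words to test in each short, spaced session."""
    count = first_batch
    while True:
        yield words[:min(count, len(words))]
        if count >= len(words):
            break
        count += step

# Stand-in for the week's 30 Latin words.
vocab = [f"word{i}" for i in range(1, 31)]
for n, batch in enumerate(sessions(vocab), start=1):
    print(f"Session {n}: retest the first {len(batch)} words")
```

Five short sessions cover all 30 words, and every word from an earlier session gets retested along the way – the repetition is built in.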

4. Make intelligent use of flashcards:
Flashcards are an outstanding tool when it comes to vocabulary learning. You can use the traditional method of physical cards or an online version, which has the advantage of speed and efficiency. Personally, I am a huge fan of Quizlet, and your child already has access to my flashcards on there. What do I mean by intelligent use of them? Well …

Firstly, do not let your child spend hours making them look pretty, especially not drawing lovely pictures all over them. The use of images on flashcards actually has close to zero impact on students’ ability to learn vocabulary, and can turn revision into a ridiculous game of “say what you see”. For example, if I showed you the Latin word “femina” with a cartoon picture of a woman next to it, I’ll place a bet you’d be able to tell me that the word means “woman”. But what have you learned? Frankly, nothing. You’ve recognised a picture of a woman, which a two-year-old can do. Much better to discuss the meaning of the word “feminine” with your child and fix the Latin word in their head through the understanding of derivatives (of which more below).

Secondly, make sure that your child is definitely using the flashcards to test themselves (a process called retrieval), not to reassure themselves through recognition. Research shows that one of the most common mistakes students make is to turn the cards over too swiftly; this way, they become convinced that they know the meanings of the words when in fact they are merely recognising the answers. It can be surprisingly difficult for students to discipline themselves out of this habit, which is why you should help them: control the turnover of the cards yourself at least some of the time, and talk to them about the temptation to flip too soon.

Thirdly, another temptation for students is to keep testing themselves on the familiar words (we all like to feel comfortable!). Remember, flashcards are a tool to help someone learn the words they don’t know, so separate out the ones that your child has gained confidence with and spend longer on the ones they are struggling to recognise. This can be done on Quizlet by marking up cards with a yellow star (top right-hand corner of each card). That said, another mistake students make is to overestimate their confidence with words they have recently learned, so revisit the “no problem” pile a couple of times before you decide that the words have really stuck in your child’s long-term memory.

Finally, shuffle the deck. This is hugely important. The brain works by mapping links between the things that it is learning; as a result, it has a strong tendency to remember things in order, so the danger with learning several words at once is that your child will remember them only in that order. You should constantly shuffle the deck to ensure that this isn’t happening, or your child will never recognise the words out of context. On Quizlet this can be done by hitting the “shuffle” button in the bottom right-hand corner of the flashcard deck.
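
To pull the flashcard advice together, here is a rough sketch in Python of the routine described above: shuffle the deck on every pass, demand retrieval before revealing the answer, and keep cycling the missed pile until it is empty. The three cards are stand-ins for a real deck.

```python
import random

deck = [("femina", "woman"), ("puer", "boy"), ("aqua", "water")]

def run_session(cards):
    """Test each card once in shuffled order; return the ones missed."""
    random.shuffle(cards)      # shuffle so that word order gives nothing away
    missed = []
    for latin, english in cards:
        answer = input(f"{latin} = ? ").strip().lower()
        if answer != english:  # retrieval failed: reveal it and re-queue it
            print(f"   -> {english}")
            missed.append((latin, english))
    return missed

pile = list(deck)
while pile:                    # keep going until nothing has been missed
    pile = run_session(pile)
```

As noted above, a real learner should still revisit the “no problem” pile once or twice later on rather than trusting a single correct answer.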

5. Focus on derivatives:
Not only does this help with vocabulary learning, it will develop your child’s knowledge and understanding of their own language and any other language(s) that they are learning. Furthermore, it will consolidate their learning because their brain will be linking its newfound knowledge to prior and future learning – and this all helps with its innate mapping skills.

All of the above requires time, energy and effort from a caring adult. I am acutely aware that this is a lot to ask and that for some people it will simply be too much. However, if you are able to dedicate yourself to the process, your child’s learning journey will be made infinitely easier and they will develop the habits and routines that will set them up for success in their studies later in life.

Photo by Thomas Park on Unsplash

Ever-present history

Adrian Chiles had a bit of a rant in his column in the Guardian this month. Now, I should say from the outset that I sympathise with his obvious desperation; as someone who has to write a blog post every week, I have a small shred of insight into the pressure that paid columnists must be under to come up with something – anything – to write about every week in their column. I find it hard enough, and I don’t have to write to the standard that’s expected for the Guardian (no jokes, please). Years ago I had a paid gig writing for an online magazine once a fortnight, for which the standard of writing was pretty high: I couldn’t keep it up.

Poor Adrian was obviously having a particularly tough week when he decided to write a piece about television documentaries which use the present tense to describe historical events. Apparently, it “makes his blood boil.”

“If something happened centuries ago,” he says (said?), “let’s talk about it as if it happened centuries ago – not as if it was going on right now.” Chiles even quotes (quoted?) Dan Snow as someone who is (was?) apparently “miserable” as a result of the process, forced by his producers to speak in the present tense about historical events. I cannot begin to imagine their pain.

Sarcasm aside, it is interesting to me that Chiles – and, based on the comments I read online, perhaps others – claims to find the process of talking about past events using the present tense patronising; he seems to have decided that producers have come up with this device as a cynical or simplistic tool to bring events to life for a modern audience with a short attention span. Chiles not only believes that this is unnecessary, but cites it as something likely to tip him over the edge.

Personally, I had not noticed that the use of the historic present in historical documentaries was on the increase, but if it is, it is certainly not a modern phenomenon. It has always amused me how incensed English teachers become when a student’s work slides between the tenses. In English classes, students are trained that switching tense is an absolute no-no and will mean that their writing makes no sense. In the ancient world, by contrast, switching between tenses for effect was considered the height of excellent writing: Virgil was a genius at it.

A poet such as Virgil sometimes wrote whole passages in the present tense for effect; he would also write in the past tense and then jump into the present for a particularly striking moment, capitalising on the jarring effect to make it vivid. So a technique practised by men who were and are (past and present) considered some of the greatest literary artists ever to have lived now gets you marked down in GCSE creative writing and certainly gets up the nose of Adrian Chiles.

In truth, I would not advise students to switch constantly between tenses in the way that Virgil does; it is not a technique commonly used in modern writing and can indeed lead to confusion unless used with caution. Apart from anything else, just because a technique is used by a genius doesn’t mean that it’s necessarily a great idea for us lesser mortals. But the use of the present tense to describe historical events is surely an effective way to bring them to life, and I’m a little puzzled as to why anyone would find it so irritating. I guess it’s one of those things, like a dripping tap, that starts to wind a person up inexorably once they have noticed it. My advice for Chiles would be to try some deep-breathing exercises next time he watches anything on BBC Four.

Photo by Hadija on Unsplash

Fraud

Some things have happened to me this week that have made me reflect on how we talk to each other online. I mentioned in my last post that I had (accidentally) smashed my iPhone. This is now fixed, although not before I had been through quite the period of self-reflection on whether it might actually be rather good for me to own a smart phone that was less pleasant to use. In the end, however, I concluded that a broken phone was at risk of malfunctioning and that this was perhaps not the smartest move for someone who is self-employed and relies on business coming in; yesterday, I forked out for a replacement screen.

The smashed phone coincided with some broader reflections that I also mentioned in my last blog post and which have continued to ferment in my mind. Two television programmes have influenced me over the last fortnight, one a drama and one a documentary. A couple of weeks ago I got around to watching the most recent season of Charlie Brooker’s Black Mirror and was moved and disturbed as always. The final episode – without giving too much away – deals with smart phone addiction; it is a thought experiment about where such an addiction might lead in a worst-case scenario, and takes a wry look at how even the creators of the big social media platforms seem to rue their own creation.

This episode of Black Mirror really stuck in my mind and at first I struggled to think why. It wasn’t one of Brooker’s best and it certainly wasn’t one of his most disturbing. (There are other episodes of Black Mirror that I frankly regret watching). Yet this one needled me, I suspect because I recognised the compulsion and the attachment it explored. I knew that I found my smart phone addictive. So I resolved to do better, and as a part of my quest I decided to watch something else that had been on my list for a while, a Netflix documentary called The Social Dilemma. This production, made only a couple of years ago, interviews a range of ex-techies from Silicon Valley, all of whom have left the companies for which they previously worked: there was the guy who created the “Like” button on Facebook, there were techies from the platform formerly known as Twitter, from Instagram and even from Google. All of them had three things in common. Firstly, they had all struggled personally with addiction to the products that they themselves had helped to create: they were suppliers addicted to their own drug. Secondly, they were now united in opposition to the way that these platforms were built and designed in order to be addictive; many of them were actively campaigning against the platforms that they used to work for, appalled by what they themselves had created. Thirdly, not one of them let any of their kids near a smart phone. Not at all. These were wealthy tech whizzes from Silicon Valley and their own kids do not have smart phones. If that doesn’t make the rest of the world reflect on why they let their kids have access to these devices from such a young age, I don’t know what will.

There is so much to love about the internet. I find it empowering and useful and it enables me to do the work that I do. On the other hand, there is much to be afraid of, most of all the addictive nature of the ever-accessible device in your pocket. Listening to the men and women who created these platforms that we all use and hearing them explain how they are built, designed and programmed to be addictive was a sobering experience. I have found myself looking at those around me – both the people I am close to and people who are strangers to me – and I see the signs of compulsive usage everywhere. I see it in myself. To my regret, I have found myself scrolling through and staring at platforms I actively dislike, somehow unable not to look at them, even in the sure and present knowledge that they bring me no joy. Why do these things have such power over us? The answer is that they were built that way; clever people are paid a lot of money to find ever-improving ways to keep us glued to every platform we sign up to.

In response, and taking the direct advice of the self-confessed ex-drug-pushers from Silicon Valley, I have removed all social media apps from my phone. There are several platforms I viscerally dislike and would happily never use again, but they are undeniably useful for business: Instagram, Facebook and LinkedIn; these from now on I will manage solely through scheduling on my laptop, and I will log in to do that kind of work once or twice a week. The messaging services on Facebook and Instagram I have set up to deliver an automated message to anyone enquiring after my services, saying hello, explaining that I do not spend time on those platforms and giving other ways to get in touch with me. The responses to this, I can tell you, have been interesting. A couple of very genuine prospective clients have reached out to me, one even thanking me for enabling them to get off the platform, which she also disliked. Another said “good for you”. But two other people – neither of whom were prospective clients, nor were they known to me personally – have already expressed their disapproval.

When I logged in to check my Instagram account recently, I found one message from someone purporting to be a business coach. I have no interest in using a coaching service, so I would have ignored this man’s approach anyway, wherever he had made it. He sent me a message stating that he “had a question about my business” and, because it was on Instagram, he received my automated response. His immediate reaction was anger. I blocked him, obviously, but I do find myself wondering just how bad his own addiction must be if the mere implication that someone else was choosing not to hang out on his platform of choice made him furious.

Further to this, it appears that another person approached me initially on Instagram and then followed this up, as instructed, with an email. This, of course, I received. He too said that he had a question, and I asked him what it was. Fortunately, it was not a ruse to send me something inappropriate, but it was an inroad into asking me to translate something into Latin for him. Now, you probably don’t realise this, but I get literally dozens of these kinds of requests. I used to respond to all of them. I still do to some. A few months ago, someone got in touch and asked for my help with a favourite quotation for their mother’s funeral and of course I replied to them, indeed I corresponded with them at some length.

Much of the time, however, especially when I am busy, I don’t honestly consider it my honour-bound duty to provide a free translation service for anyone and everyone’s t-shirt, club logo, necklace or tattoo. I am a teacher and a tutor, I’m not a motto-creation service. If someone asks nicely, I may help them out. This man, however, before I had even decided whether and how I was going to respond to his request, followed up his initial email with a second one barely an hour or so later, wanting to know whether I had received the first email and intimating that he was waiting on my response. I didn’t like this, so I decided simply to delete both the emails. The consequence of this decision was that he sent me another, one-word message on Instagram. It said “fraud”.

Fraud.

I am sure that this person is a perfectly reasonable and functioning individual in real life. Were I to sit him down face-to-face and explain that this is a busy time of year for me, that I get dozens of these sorts of requests, that I might indeed have responded to him had he been a little more patient and not harassed me for an answer, I am quite certain that he would have reacted in a rational manner. Yet online, without that human connection, not only did he decide that I am a “fraud”, he felt the need to tell me so. How did he feel after he sent that message, I wonder? Vindicated? Satisfied? Like he’d done a good thing? Somehow I doubt it. It is an empty feeling, shouting into the void and being left to wonder what the reaction at the other end might be.

The truth is that these platforms are not good for us. They make us less honest and they make us less kind. Most of all, it seems to me, they make us lonelier by dividing us further – the very opposite, those recovered tech junkies tell me, of the original Silicon Valley dream. So you will not find me hanging out on LinkedIn, Instagram or Facebook, none of which contain anything that interests me enough to outweigh the excessive demands they place on my attention through the addictive nature of their construction. I do gain something from the platform formerly known as Twitter, as so many teachers exchange ideas on there and it remains an outstanding medium for finding links to new ideas and research about good practice in education. If Threads takes over that mantle, so be it. Even so, I have ruthlessly removed these platforms from my phone. I will keep them on my iPad, which I do use but nowhere near as much as I use my phone. So the phone will be solely for genuine messages from real people – family, friends and clients. At the moment, as I get used to the situation, I find myself picking the phone up and then wondering what on earth I picked it up for. Numerous times a day. This only goes to prove that my decision was right – clearly, the number of times I have been habitually checking these platforms for no good reason is genuinely scary.

I think what I have decided is that, like all addictive substances, social media must either be avoided altogether or be very strictly managed. Its usage must be balanced against the risks and if it’s not bringing me joy or enriching my life, then I genuinely don’t see the point of it. For some people, I fear, social media really is the same as drugs and alcohol: highly addictive, with the potential to turn them into the very worst version of themselves.

Photo by camilo jimenez on Unsplash

Back to School

It’s been impossible to ignore the start of the school year this September, even for those with no children and no connection to the education system. With the scandal of RAAC (reinforced autoclaved aerated concrete) rocking the country and all of us reeling once again at what can only be described as years of incompetence and underinvestment by government – whatever your political stripe – the start of the new school term and the new school year has been on everyone’s mind.

This academic year feels like a milestone for me. This time last year felt truly surreal, as for the first time I did not return to school as I had done for the previous 21 years. The start of last September was very strange and somehow I didn’t quite believe it was happening; I still had the familiar anxiety dreams, so convinced was my subconscious that I would be returning to the chalkface as usual. This year, with some distance between myself and the school grounds, I forgot altogether which day my old school was returning (although old colleagues did keep me posted on the usual hilarities of INSET day).

I have enjoyed the summer holiday immensely, working to a different schedule (I only saw clients in the morning) and doing significantly fewer hours than usual. But it also feels great now to be settling back into the routine, and I am loving seeing regular clients return to their old slots to restart the academic year. There is also the excitement of starting to work with new students, especially the ones I really feel I can make a difference to; nothing in life is as rewarding as helping a student to turn their performance around.

This year I decided to reflect on what happens in schools at the start of the new academic year and to apply the best and most important aspects of it to my tutoring business. I have refreshed my safeguarding training, a legal requirement for teachers in schools but not something which is (yet) regulated for tutors. I have looked at my results and done some reflection, although one of the joys of one-to-one work is that you do not face the surprises and disappointments that inevitably occur across a year group in a school. I have reflected on my own practice, decided what worked best last year and resolved to apply the most effective techniques to all clients. Over the last couple of weeks I have reshaped my daily timetable and applied some lessons learnt from last year about when I work most effectively as well as where demand is highest. Finally, I have reflected on how I can reduce unnecessary administration and time-wasting, most especially the time spent on social media, which I have cut to an absolute minimum; I have put systems in place so that I don’t have to engage at all with the platforms which do not bring me joy, namely Facebook and Instagram. That final decision was rather well-assisted by my smashing up my iPhone (not deliberately, but there is a psychological school of thought that there are no real accidents …); this sparked some further reflection on just how much screen time is truly necessary for running a business like mine and how much of it was mindless, fruitless scrolling in the name of “visibility”, which so many business coaches seem to preach is essential to success. With a website that performs as well as mine does, I do not find this to be so.

Thus, as I settle into my second year as a full-time, independent, one-to-one tutor, I could not be happier with my role and with the balance I have managed to strike between meaningful employment and a better quality of life. I cannot wait to get on with helping my clients, old and new, and to see what the new academic year will bring.

Photo by Aaron Burden on Unsplash

Why all teachers should tutor

Many trained teachers try their hand at tutoring: demand is high and the money is useful. I tutored consistently throughout my first few years in teaching, then returned to it when my husband gave up work to re-train. As time went on, however, I found myself bound to it by more than just financial necessity; I came to realise that private tutoring was having a profoundly positive impact on my work as a classroom teacher.

It may sound absurd, but it’s easy to lose sight of what you’re paid to do in the frenetic world of mainstream education; marking and administrative tasks – not to mention the ever-shifting sands of expectations – can overwhelm you to the point where you lose perspective on what’s actually important. Tutoring reignited my passion for teaching on a fundamental level; not only did it take me back to some essential skills, it made me question the value of some other things that were taking up too much of my time. It made me better at saying “no” to things that impacted upon my ability to perform my teaching role to the best of my ability and – as a direct result – I stepped aside from roles and responsibilities that were in danger of doing so.

Tutoring exposed me to a wider range of specifications and teaching methodologies than I had previously experienced. Habits inevitably become entrenched when you teach the same subject in the same system to the same age-group for a number of years: tutoring forced me to think again. When I started tutoring face-to-face in my area, local demand was highest for Common Entrance coaching, so – despite the fact that I was a secondary school teacher – this became a specialism. Finding out what some 10-year-olds were being exposed to and could cope with made me question where I was setting the bar in secondary school; it also made me ask myself some fundamental questions about what, when and why I was teaching the core principles to older students. All of this came at what could not have been a more useful time: a few years prior to Ofsted’s new framework and the huge shift towards a focus on curriculum coherence. When all other departments were running around in a panic, asking themselves why they were teaching what they were teaching and in what order, I had already been through that process and had totally refreshed my curriculum from top to bottom.

Perhaps the biggest impact that tutoring had on me while I was still teaching was a powerful shift in mind-set that is hard to quantify. When I started working with some local prep school students, I took several of them from the bottom of their class to the top. What this felt like is hard to convey, but suffice to say it was emphatically empowering. This positivity then filtered into my classroom practice and somehow made me feel as if anything were possible. This is not to say that I was naïve about the fundamental differences between what can be achieved through one-to-one tutoring and what can be realised in the mainstream classroom; but experiencing the irreplaceable value of one-to-one attention forced me to think of ways in which I could provide more of that magic in the classroom, particularly for the school’s Pupil Premium students (those who are defined by the government as coming from disadvantaged backgrounds). Blessed with an excellent trainee teacher most years, I began to take every opportunity to act as an expert Teaching Assistant to our Pupil Premium students in the trainee’s classes, coaching and guiding them to make more progress than they otherwise could.

Tutoring also opened my eyes to the phenomenal value of spaced learning and retrieval practice, as well as to the stark truth about just how much information children will forget once they have been taught it – a topic I have written on many times. That harsh reality fed through into my classroom teaching and fundamentally changed my approach to the basics of whole-class tuition. I introduced some of the exercises that I had created for the one-to-one setting and incorporated them into my classroom practice; I never took for granted that the students would have remembered what I had taught them the day, the week or the month before – I tested them repeatedly on basic knowledge. Once again, this all happened shortly before there was an explosion of this kind of practice in schools. I feel hugely grateful that tutoring gave me a bit of a heads-up.

As a full-time tutor now, with my own business, it seems obvious to say that tutoring has been a major influence in my life. But I would recommend it to any classroom teacher, not necessarily as a potential career shift but as a way of gaining access to new ideas, new experiences and new ways of informing your current classroom practice. If my experience is anything to go by, your performance in the classroom will benefit enormously.

Photo by Element5 Digital on Unsplash

A study in cultish madness

Since my last post, so many people have sent me messages asking what my research was actually about that I have decided to write an explanation. You only have yourselves to blame.

One of the difficulties one faces when writing a proposal for a PhD is to find a niche in one’s subject where there is work left to be done. I have met academics in my time who have written PhDs on Virgil or Homer, but how they managed to come up with a new angle, never mind how they managed to get a handle on everything that had been written already, is completely beyond me. Personally, I decided that something a little more obscure was the way forward.

I had an interest in ancient philosophy and I was also lucky enough, as part of my degree, to take an undergraduate course on the rise of Christianity in the ancient world. These two fields of study collided when I started to learn about Neoplatonism, a branch of thinking in late antiquity which is notoriously difficult to define. In origin and essence, Neoplatonism was everything that was said, thought and written about Plato, Aristotle and other key thinkers in the generations after they lived. Initially this meant the men studying in the schools in which Plato and Aristotle themselves had taught (Aristotle was a pupil of Plato, so the process started with him), but as the centuries rolled by Neoplatonism became the wildly diverse writings produced generations and even centuries after Plato and Aristotle were writing and teaching. People also wrote intensively about Pythagoras, and some ancient scholars became interested in finding what they believed to be religious and philosophical allegories in the writings of Homer. The study of what these men wrote is thus an entire field in itself – if you like, it is the study of Platonic, Aristotelian and Pythagorean reception in the ancient world. Its most famous and respected proponent was a man called Plotinus, who lived and wrote in the 3rd century AD and had a strong influence on Christian philosophy; I specialised in the men who came shortly after him.

Despite its noble origins as an intellectual field of study, Neoplatonism took on a life of its own and morphed into something really rather bizarre as the years rolled by. This was partly because it was influenced during this period by the growth of religions that focused on developing a personal relationship with one’s god, but there were other complicating factors too. Suffice to say, by the time you get to the period in which I specialised, Neoplatonism had become something pretty weird and wonderful: an intensely intellectual field of study on the one hand and a downright barking set of pseudo-philosophical cultish ravings on the other. I do not exaggerate – better scholars than I have said as much.

Most of the writings from the period we are talking about were so mystical and incomprehensible that modern scholars had no interest in bothering with them. As a result, many of the texts remained untranslated until a movement led by Richard Sorabji, who was a Professor at King’s College while I was studying and researching. Sorabji oversaw a series of texts and translations, making many of these works available for the first time to undergraduates and indeed to anyone else who was bonkers enough to be interested. He specialised in the commentators on Aristotle, the scores of ancient scholars who had spent their lives poring over Aristotelian texts and writing down their thoughts on them.

So I ended up wading around in this quagmire of growing information and, prompted by my supervisor, took a look at a text nicknamed the De Mysteriis by an author called Iamblichus, a Syrian thinker who was writing in Greek during the late 3rd and early 4th centuries AD. He was particularly keen on Pythagoras, and wrote masses of pseudo-mystical nonsense about him; we have one complete surviving work which has frankly undeniable parallels with the Gospels and presents Pythagoras as what can only be described as a Christ figure. He also wrote various other works including the De Mysteriis, on which I based my research and which is fundamentally about theurgy, or divine magic. Yeah. I told you it was weird.

So. Theurgy. It is pretty difficult to define without presenting my entire thesis, but in essence it was a range of mystical rituals, all with the aim of connecting humans with the divine. You’d recognise some of them from your general knowledge of the ancient world: oracles, for example, through which the gods supposedly spoke to men. Iamblichus believed very firmly that there was a right way and a wrong way of doing these divine rituals, and the De Mysteriis is his authoritative account of what’s what when it comes to doing this stuff. As a result it is – inevitably – absolutely barking. This is not exactly what I said in my thesis, but it’s the honest truth in summary. Indeed, the De Mysteriis is so barking that previous scholars had largely consigned it to obscurity and it had not been translated into English since 1911. So, that’s where I came along. My PhD was a study of the work, and through that research I hooked up with another couple of scholars – far older and more prestigious in the field than I was – who had in the previous decade taken on the task of producing a modern edition and translation of this text. They were – to put it mildly – rather regretting doing so. One of them had already had a heart attack, although the jury was out as to whether the De Mysteriis was entirely to blame or only partially. Long story short, they drafted me in as Chief Editor and I finished it for them. My PhD was also published.

As I wrote last week, I did not enjoy the process of academic research and I regretted signing up for it. However, this does not mean that I was uninterested in much of what I was doing. What it did reveal is what I should have been studying, and it wasn’t Classics. During the process of my research I realised that what fascinated me more than anything else in the world was (a) what makes people do, think and believe what they do and (b) how it is possible to persuade even the most intelligent and educated person of something which is provably impossible. In simple terms, why do people believe in miracles? Why did Iamblichus believe that a truly inspired (for which read fully possessed) spokesperson for the gods could be struck on the back of the neck with an axe and not be injured? Did he really believe that the famous oracles of which he spoke were still functioning? (We know for a fact that most of them had been disbanded by his time – one that he writes about fulsomely had become a Christian campsite by the time he was writing.) Following my interests, and whilst I was meant to be working exclusively on Neoplatonism, I ended up going down all sorts of rabbit holes. I read about early 20th-century research into “shell shock” (now known as PTSD); I read purported accounts from the 19th century of children possessed by the devil; I read about mass conversion rallies such as those led by Billy Graham; I read about attacks of crowd hysteria, such as fainting fits or hysterical laughing in nunneries and girls’ boarding schools; I read about witch trials; I read about Zaehner’s LSD-fuelled research into what would happen to his mind when, enhanced by hallucinogenic drugs, it was exposed to art or literature. (Not much, as it turns out – he just couldn’t stop laughing.) In short, I read a wildly diverse range of material about possession, altered states of mind and all sorts of jolly interesting weirdness. Long story short, I should have switched to anthropology.

My interest in such things remains to this day and in other guises I have written articles about belief, conversion and religiosity. I even dipped my toe into novel-writing and wrote a dystopian Young Adult novel about a world in which beliefs are controlled and dictated. Much of my spare time these days is spent reading about a variety of cult-like beliefs which are developing rapidly and spreading online. I might even write about it one day.