Congratulations and confabulations

I asked HAL 9000 to write this blog post for me. Damn thing couldn’t even get my name right.

I get irrationally annoyed by dictionary ‘word of the year’ announcements. Something that is obviously of no importance makes me shout at the radio – not because of the choices (although Oxford’s ‘youthquake’ in 2017 was discreditable) but because of the concept. It’s PR gimmickry disguised as lexicography.

(Also, the announcements tend to start in mid November, which suggests an unconventional definition of ‘year’.)

But I shouldn’t get too worked up. Dictionaries are good things, and good things cost money to run, and with the shift from print to online you have to grab eyeballs any way you can. This whole word of the year thing seems to work for them. (Here I am, taking the bait…)

Anyway. Cambridge Dictionary’s Word of the First Ten-and-a-Half Months of the Year of 2023 is ‘hallucinate’. Specifically, it’s a new sense of the word. (Update 13/12: Dictionary.com has made the same choice.) The meaning we’re used to is this:

to seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug

And the new one that Cambridge has added is this:

When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information

The idea is that ChatGPT and other large language models (LLMs) aren’t really writing explanations of whatever their users are asking them to explain: they’re generating things that look like explanations but might be hopelessly wrong.

This is undeniably topical, although I note that the new definition doesn’t really read like a definition in the way that the first one does. And, if you read down Cambridge’s article announcing the choice, they soon pivot to explaining why a human-compiled dictionary is superior to one generated by LLMs. What was I saying about PR disguised as lexicography…?

Anyway.

I’m not at all sure how widely ‘hallucinate’ is used in this sense at the moment, but it strikes me that there’s already an established word that does the desired job: ‘confabulate’. Cambridge defines this as:

to invent experiences or events that did not really happen

It’s often associated with brain damage or neurological disorders that affect memory, but it can happen in more mundane circumstances too. An example:

Once, about 20 years ago, I spent a weekend with a friend. On Saturday night we had a lot to drink and very little to eat. On Sunday morning we got up and walked a couple of miles to where a friend of his was laying on breakfast for a bunch of people. When we got there the food wasn’t yet ready, and so we stood around for a while. I felt hungover, very tired and extremely hungry. Having sat down on the floor for a rest, I started to stand up again, but the people suddenly around me said ‘No, stay still’ and ‘Sit back down’. I had briefly fainted. On coming to I had no memory of fainting, and so my brain quickly squirted out the first plausible explanation it could grab of why I would be sitting on the floor.

That’s confabulation. And I think it’s a better fit for what LLMs do. (In fact, Cambridge does list the new sense of ‘hallucinate’ as a synonym of ‘confabulate’.)

To hallucinate in the established sense is to falsely sense something that isn’t there. For LLMs to be hallucinating in this way, they’d have to think they were thinking when actually they weren’t. But Descartes might have something to say about that. If LLMs don’t really think, then they can’t think that they think.

The ones doing the hallucinating are us, as we read an LLM-generated text and imagine that it’s the result of intelligence rather than just pattern-recognition algorithms designed to game the Turing test. So ‘simulated intelligence’ might be a better term than ‘artificial intelligence’ (this was the gist of an article I read earlier in the year but now can’t find). (Update: still don’t know which article this was, but I think it must have been an interview with/profile of Geoffrey Hinton, who has been saying for a while that ‘confabulate’ is a better word in this context than ‘hallucinate’.)

Anyway.

I appreciate that ‘confabulate’, however apt it might be in this case, is not a well-known word and so it may not catch on as well as a less apt twist to ‘hallucinate’. We’ll see.

In conclusion: Dictionaries are good things. Use them. The more traffic they get online from genuine uses, the less they might need the gimmicks.

Leaving everything or leaving nothing? A saying of two halves

Alex Greenwood, who hurt her head pondering a point of semantics.

The greatest controversy arising from England’s footballing defeat to Spain yesterday wasn’t about aggressive tackles, debatable penalties or inconsistent reffing. It was a turn of phrase used by the British prime minister.

Rishi Sunak tweeted:

You left absolutely nothing out there @Lionesses. It wasn’t to be, but you’ve already secured your legacy as game changers. We are all incredibly proud of you.

Plenty of us were puzzled by “You left absolutely nothing out there” – a rough equivalent of “you gave it your all”. I’m not a huge sporting fan, so my knowledge of these idioms is limited, but I’d thought the correct phrase was “You left everything out there” or similar.

The first time I heard it was in a 2005 West Wing episode. Leo, talking to Bartlet (both in dubious health) about how little time the administration had left, said: “For both of us, sir, this is our last game. Let’s leave it all out on the field.”

Various people mocked Sunak for getting the phrase the wrong way round, in some cases implying that he was being phoney and/or that he’s out of touch and/or that he’s a terrible human being who’s unfit to lead the nation.

Danny Finkelstein, a Conservative peer and a far keener football fan than me, offered a defence of Sunak’s version, which he found to be more natural. He dug out numerous examples of it being used that way – by the likes of Roy Keane, Declan Rice, Andy Murray and others.

Checking the VAR

Let’s get some data. The News on the Web corpus, managed by Mark Davies, is a text archive of more than 30 million English-language news articles from around the world, from 2010 onwards. I searched for occurrences of both versions of the phrase, using eight variants of specific wording for each (you can search for up to a five-word string).

For the “everything” version:

  • leave/left everything out on the
  • leave/left everything out there
  • leave/left it all out on the
  • leave/left it all out there

For the “nothing” version:

  • leave/left nothing out on the
  • leave/left nothing out there
  • leave/left anything out on the
  • leave/left anything out there (these last two pairs are used after negations like “didn’t” or “never”)

The results: the “everything” version is about 11 times as common as the “nothing” version. This ratio hasn’t changed much over the period 2010-23. For British news sites only, the ratio is just 3.6 to one.
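(For the mechanically minded: the ratio is just a matter of adding up the hit counts for each family of search strings and dividing one total by the other. Here's a minimal Python sketch of that tally – the counts below are hypothetical placeholders chosen to illustrate an 11-to-one split, not the real corpus figures.)

```python
# Minimal sketch of the tally behind the "everything" vs "nothing" comparison.
# Hit counts are hypothetical placeholders, NOT the actual NOW-corpus figures.

everything_hits = {
    "leave/left everything out on the": 900,
    "leave/left everything out there": 450,
    "leave/left it all out on the": 1800,
    "leave/left it all out there": 1200,
}

nothing_hits = {
    "leave/left nothing out on the": 120,
    "leave/left nothing out there": 150,
    "leave/left anything out on the": 60,  # counted only after negations
    "leave/left anything out there": 65,   # e.g. "didn't", "never"
}

everything_total = sum(everything_hits.values())
nothing_total = sum(nothing_hits.values())

print(f"'everything' version: {everything_total} hits")
print(f"'nothing' version:    {nothing_total} hits")
print(f"ratio: {everything_total / nothing_total:.1f} to one")
```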

As far as I can see, none of the major dictionaries yet include either version of the phrase. But the “everything” version is listed in a couple of the less academically rigorous online ones: Wiktionary defines “leave it all out there” as “to strive to the limits of one’s capacity; to give one’s all when playing a game”. The Free Dictionary defines “leave it all on the field” as “to expend the utmost of one’s energy and effort, typically while playing a sport”. I can’t find the “nothing” version listed anywhere comparable.

From a rummage around Google Books, the earliest use of the “everything” version I can find is in a 1996 issue of Runner’s World: “I left everything out on the course”. The earliest “nothing” version is in a speech from Bill Clinton in 2000: “work like crazy and don’t leave anything out there on the floor on election day”.

Of course both versions will be older, but probably not by much. My guess is that the “everything” version came first, maybe in the 80s, and then the “nothing” version followed a few years later, emerging among people who had heard the “everything” version but found “nothing” more intuitive.

Explaining the offside rule

So why are there two different interpretations of how to get the desired meaning? And is one of them provably wrong?

I have to say I don’t quite understand how the “nothing” version of the phrase is supposed to work. The “everything” version makes sense to me: you go out onto the pitch and give it everything you’ve got, you keep nothing in reserve, and at the end when you come off the pitch you’ve got nothing remaining. You’ve left it all out there. Figurative, of course, but it feels logical.

Finkelstein explained the “nothing” version like this:

You don’t leave things on a pitch. You exhaust yourself until there is nothing left because you’ve given everything.

I’m afraid I still don’t get it. Of course players don’t leave things on pitches, but that’s true whether they’ve made a superhuman effort or jogged around half-heartedly.

The metaphysics of events (bear with me) is such that, while they are happening, they can be located. A football match takes place on the pitch, as does the players’ expending of effort. But once the event is over, where is it? Where is all that effort afterwards? That question is bizarre if taken literally. Figuratively, though, you could say that the resting place of the effort is in the place where it was made – on the pitch.

When a drained player departs the pitch, she has nothing left because she has given everything. And everything that she has given is – well, technically it’s nowhere, but poetically she has left it out on the pitch.

The key to the different interpretations may be in the triple meaning of “left”: departed, deposited, remaining. I understand the phrase to be about putting (“leaving”) everything you have on the pitch and then departing with nothing remaining in your possession. Others seem to think the phrase is about using up everything you have while on the pitch so that there’s nothing remaining (“left”) when you depart. Or perhaps the idea of nothing being omitted (“left out”) is coming into play too. Or even an echo of leaving no stone unturned.

But I just can’t get the “nothing” version to work logically in my head.

When phrases change ends

Still, I might just be overthinking this. Idioms aren’t always as well fleshed out and nailed down as the more formal parts of Standard English. They shift around, wrapped in half-forgotten imagery, and they often don’t make literal sense. Sometimes they even get reversed.

Take the phrase “all mouth and no trousers” – meaning someone who’s all talk but no action. Or is it “all mouth and trousers”? Either may be regarded as a mistake, depending on who you ask. “All mouth and trousers” is the original version: the OED and Google Books both make it about 25 years older than “all mouth and no trousers”, which is now more popular.

Here’s David Marsh trying to make sense of the two versions – it seems that “trousers” means either another aspect of action-free presentation or the container of the body’s sexyparts which would produce the implied action. Whichever one you prefer, you can’t deny that this is a pretty odd phrase.

Or think about “you can’t have your cake and eat it”, which began life as “you can’t eat your cake and have it”. The original version, now nearly extinct, is more chronologically sensible (have, then eat – no problem; eat, then have – impossible). The newer version is semantically peculiar, but has a bit more rhetorical oomph at the end which I guess is why it caught on. To use the original nowadays would be deeply peculiar.

Likewise “head over heels in love”. If you have any training in anatomy, you’ll know that the normal place for the head is indeed some distance over the heels. So the phrase doesn’t say anything remarkable. Originally, though, it was “heels over head”, which far more clearly conveys the idea of being bowled over by emotion. But “head over heels” just trips off the tongue so much more snappily, doesn’t it? And that’s probably why it’s now standard.

Less figuratively, there’s the phrase “cannot be overestimated”. This means that something is so important (or whatever) that however big a deal you might guess that it is, you won’t be able to overestimate it. But many people instead say “cannot be underestimated”, which to those of us who are pedantically inclined means the exact opposite. If something can’t be underestimated, then it’s minuscule and trivial beyond our imagining.

If you squint, though, the alternative version looks a bit less daft. Remember that “cannot” in practice often means “should not”: it’s a moral or practical restriction rather than an absolute physical or logical one. So to say that something can’t be underestimated means that it’s so obviously big and important that only an irresponsible fool would underestimate it.

Hmm. I’m not convinced about that one, but no doubt it is in common use both ways.

The final score

Where do we stand with leaving everything or nothing out on sporting arenas? It seems that the “everything” version is predominant, but “nothing” is common enough to be noteworthy and to persist. Still, either way this is a newish phrase, not so thoroughly or formally established that it couldn’t be upended in time. I think that at the moment we may be somewhere between over/underestimated and mouth/trousers territory.

Rishi Sunak used the much less common version, and the version that (I think) makes less sense when you spend long enough prodding it. But he didn’t just witlessly pull it out of the air, and nor did Bill Clinton 23 years ago. I don’t think it’s grounds for impeachment.

I once heard about a man who left his heart in San Francisco. No doubt the medical journals rushed to examine him, only to discover he simply meant that Frisco was the place he loved more than any other and he didn’t feel complete anywhere else. In a different way, the England team left their hearts – and everything else they had – out in that stadium in Sydney. I hope that before too long they’ll recover their loss.

Colonial English?

‘A new and accurat map of the world’ by John Speed, 1626. We regret to inform you it is neither new nor accurat.

Oxfam drew some political flak earlier in the year when it published a new version of its inclusive language style guide, to advise staff on their choices of words. The guide, among other things, described English as a colonial language – an idea that offended the sensibilities of some on the right. Criticisms included “totally bizarre”, “virtue-signalling”, “ridiculous” and “politically correct woke drivel”.

The style guide’s introduction says:

We further recognize that this guide has its origin in English, the language of a colonizing nation. We acknowledge the Anglo-supremacy of the sector as part of its coloniality. This guide aims to support people who have to work and communicate in the English language as part of this colonial legacy. However, we recognize that the dominance of English is one of the key issues that must be addressed in order to decolonise our ways of working and shift power.

Well, I find the prose style mildly offensive, and some parts of the advice in the guide seem a bit ill-thought-through, but on this point there’s no room for dispute. It’s a simple historical fact that English is a colonial language, steeped in conquest and violence.

When the settlers crossed the sea all those years ago, they brought words and weapons alike. They seized land and resources, and set up new political and social structures. They killed plenty of the indigenous people, drove many more away, and absorbed the rest – to some degree – into their own imported culture. Vanishingly few traces of the indigenous languages survived in the speech of the new society.

Then the Vikings turned up, with much the same idea.

For a long time they were a major force, dominating much of the nascent England and shedding plenty of blood, but they couldn’t do to the Anglo-Saxons what had previously been done to the Celts. Ultimately the Scandinavian invaders and their descendants assimilated, adding some of their words to the stock and helping to reshape Old English grammar as well.

Then, as these changes were working their way through the language, the Normans turned up.

Their brand of colonialism was different: they didn’t want to wipe out the population, just the elite, so that they could become the new rulers. Once that was done, they weren’t too interested in the peasantry, and they weren’t too interested in English, either. They conducted their affairs in French and Latin.

But a lot of their language seeped into that of the common folk, who adjusted their ways as necessary to get along under the new political order. And after three centuries or so, the Middle English that we can still just about understand rose to become England’s dominant tongue – a product of Angle, Saxon, Jute, Norse and Norman linguistic colonialism.

I think some other stuff may have happened later, too, but that’s not really my period.

Boffin-biffing buffoons baffled by beef, before befriending

The Institute of Physics has launched a campaign against using the word “boffin” to refer to scientists and other researchers. They argue that the word conjures up an unhelpful, outdated stereotype that could put young people off science. According to their survey: “When asked to describe what a boffin looks like in three words, respondents painted a clear picture: glasses, geeky, nerdy, male, white coat, serious, bald and posh.”

“Boffin” is one of those odd, slightly dated slang words that don’t much exist outside of UK newspapers. (See Rob Hutton’s book Romps, Tots and Boffins: The Strange Language of News for an insider’s guide to the lexicon of the Great British press.) So the Institute of Physics is in particular asking the big tabloid newspapers to stop using it.

This plea is obviously a red rag to a bull, and the Daily Star has duly (and quite magnificently) charged:

Daily Star headline reading “Boffins: don’t call us boffins”

According to the Star, and I think they mean it, they use “boffin” not just cheekily but affectionately. “We bow to no one in respect of our boffins. But the berks have buggered it up with this Bin the Boffin befuddlement.”

But one argument that might carry a bit more weight than avoiding stereotypes is that the word is – perhaps surprisingly – unclear. The Institute of Physics found:

Over a third of all adults and young people surveyed had never heard of the term before. For those who had heard the word before, there was confusion as to what boffin meant. Suggestions that were put forward of what boffin means included a kind of bird, a type of biscuit, or even a fancy coffin.

So maybe this is a word (for which I confess I have a tongue-half-in-cheek fondness) whose limited niche is going to contract as the generations change.

Its first known use, according to the boffins – sorry, lexicographers – at the OED, was during World War II. It started off meaning an older officer, but it soon shifted to mean “a person engaged in ‘back-room’ scientific or technical research” – perhaps because older officers were more associated with such roles. It was applied in the RAF to scientists working on radar:

Their ages are as youthful as air crews. Thirty-two is considered the maximum… In H.M.S. Wasps’ Nest, anyone aged thirty-two is officially a ‘boffin’. There is even a song about them… ‘He glares at us hard and he scowls, For we’re the Flotilla Boffins.’ (C Graves, 1941)

A band of scientific men who performed their wartime wonders at Malvern and apparently called themselves ‘the boffins’. (Times, 1945)

‘What’s a boffin?’ ‘The man from Farnborough. Everybody calls them boffins. Didn’t you know?’.. ‘Why are they called that?’.. ‘Because they behave like boffins, I suppose.’ (N Shute, 1948)

The origin is, to paraphrase the OED, anyone’s guess. Etymonline suggests that it may have been a reference to a fictional character, perhaps Mr Boffin in Dickens’s Our Mutual Friend. I haven’t read it, but I gather he’s a genial but bumbling figure, an illiterate former dustman who unexpectedly inherits money and hires a personal reader to broaden his education. Not quite sure how that would translate into the WWII usage, though. It feels like a private joke among a small group of friends which then caught on, changing as it did.

So what’s the alternative? The physicists suggest “scientist”, or being specific about the relevant specialism. The only problem there is that “scientist” is harder to fit in a headline than “boffin”. The only relevant word I can think of that’s comparably short is “expert”. But that, in UK politics, comes with its own cultural baggage.

Further research is needed! Send for the bo— [gunshot]

When words don’t mean what they’re meant to mean

Plato, the inventor of the plate.

David Bentley Hart has written a witty, insightful, elegant and provocative piece on ‘How to write English prose’. Given the topic, it’s hard to judge the style and the substance separately.

On the whole, I enjoyed his writing. Savour this passage, on why cultures develop great prose far later than great poetry:

Poetry entered the world almost as early as words did; it is the first flowering of language’s intrinsic magic—its powers of invocation and apostrophe, of making the absent present and the present mysterious, of opening one mind to another. It comes most naturally to languages in their first dawn, when something elemental—something somehow pre-linguistic and not quite conscious—is still audible in them. Prose, however, evolves only when that force has been subdued by centuries upon centuries of refinement, after unconscious enchantment has been largely mastered by conscious artistry, and when the language has acquired a vocabulary of sufficient richness and a syntax of sufficient subtlety, and has fully discovered its native cadences.

That’s a gem.

I also relished his scorn for the unjustly famous writing advice of Strunk and White: “In fact, if you own a copy of The Elements of Style, just destroy the damned thing. It is a pestilential presence in your library. Most of the rules of style it contains are vacuous, arbitrary, or impossible to obey, and you are better off without them in your life.” He does Orwell too.

But Hart is a bit too fond of obscure words: within the space of a 200-word passage near the start, he introduced me to anfractuous, volutes, modillions and quadrature. I don’t mind being sent to the dictionary now and then – it’s good to learn new words – but each occasion acts as a dam that interrupts the flow of the piece. Too many, and the whole thing can dry up.

A proud indifference to the reader’s vocabulary, though, is part of his argument: great prose blends the simple and the complex, whereas nowadays too many writers hew to a bland, formulaic conception of plain English – “denuded of nuance, elegance, intricacy, and originality”.

I should point out that Hart means his recommendations to apply to literary prose – fiction, essays and the like – rather than more functional writing such as public health information leaflets. At least, for the sake of public health, I hope he does.

*

Time for some backstory.

I first encountered Hart’s writing a little over a decade ago, when he wrote a pair of pieces complaining about the myriad failings of common usage. He passed judgement on the correct meanings of words including infer, hopefully, fortuitous, intrigue, momentarily, presently, refute, restive, transpire, reticent, aggravate, enormity and fundament.

I’d say that on a majority of these words, his advice is sound – or at least that he raises a fair concern. But my way of thinking about these questions differs from his. I come at them looking at what aligns with current usage, while he seeks authority in the traditions of literature: “a word’s proper meaning must often be distinguished from its common use”.

On transpire, he’s particularly stern: “I am as inflexible as adamant, as constant as the coursing stars: it does not mean ‘occur,’ no matter how many persons use it that way.” Even allowing for theatrical exaggeration, the rejection of actual usage is unjustified dogmatism.

For me, the main factor is how readers will understand a word; for him, it’s how they ought to understand it. And this connects perfectly with his view on rarefied words.

In his new article, he offers a set of rules for writers. The very first one is:

1. Always use the word that most exactly means what you wish to say, in utter indifference to how common or familiar that word happens to be. A writer should never fret over what his or her readers may or may not know, and should worry only about underestimating them.

In a similar vein, his third rule is:

3. When the occasion presents itself for using an outlandishly obscure but absolutely precise and appropriate word, use it.

His merry indifference to the obscurity of a word, when it carries the precise meaning he wants, raises the same question as his prescriptivism: to whom does this word carry that precise meaning? In the esoteric cases of farraginous, purling, banausic and other selections he makes, the answer must be: not many people. Even, I suspect, among the readership of literary magazines.

You can view the Great Writers of yesteryear as the best guides to a word’s “proper meaning” if you want, but for the purposes of communication a word’s actual meaning is what – if anything – it means to the person reading it.

In this, I believe I’m in agreement with at least the spirit of Hart’s second rule:

2. Always use the word you judge most suitable for the effect you want to produce, in terms both of imagery and sound, as well as of the range of connotations and associations you want to evoke.

An effect produced is produced only in a reader’s mind; connotations and associations are always for someone; meaning is not independent of the community of language users and their understanding of words, an understanding that may vary from person to person, generation to generation.

*

Whether on obscure words or disputed words, Hart almost – almost – seems to believe that meanings are perfect Platonic forms, abstracted from the grubby, flawed, mundane business of human communication and existing in some transcendent realm to which mere mortals have only limited access.

I knew that Hart is a theologian, so I did a bit of googling (look, if you want proper research then you’re going to have to start paying me), and it turns out that, yes, he is a Platonist.

Here he is, some years back, on the idea of “truths deeply hidden in language”:

Consider, for instance, the wonderful ambiguity one finds in the word invention when one considers its derivation. The Latin invenire means principally “to find,” “to encounter,” or (literally) “to come upon.” Only secondarily does it mean “to create” or “to originate.” Even in English, where the secondary sense has now entirely displaced the primary, the word retained this dual connotation right through the seventeenth century. This pleases me for two reasons. The first is that, as an instinctive Platonist, I naturally believe that every genuine act of human creativity is simultaneously an innovation and a discovery, a marriage of poetic craft and contemplative vision that captures traces of eternity’s radiance in fugitive splendors here below by translating our tacit knowledge of the eternal forms into finite objects of reflection…

And in another essay:

A god… whose works are then unnecessary but perfectly expressive signs of this delight, fashioned for his pleasure and for the gracious sharing of his joy with creatures for whom he had no need, is a God of beauty in the fullest imaginable sense. In such a God, beauty and the infinite coincide; the very life of God is one of, so to speak, infinite form; and when he creates, the difference between worldly beauty and the divine beauty it reflects subsists… in the analogy between the determinate particularities of the world and that always greater, supereminent determinacy in which they participate.

For me to pontificate on Christian theology would be ultracrepidarian, but the analogy between Hart’s view of divine creation and his prescriptivism about meaning does seem striking.

I doubt he believes that every English word (and every word in French, Japanese, Yoruba, Cherokee, Farsi, etc, etc) really has a Platonic true meaning independent of usage. I think it’s more likely that when he writes, he strives to emulate in some small way the spirit of creative, self-expressive joy that he describes above. He then shares his joy with readers – although we’re readers for whom he has no need. If we like what he writes, that’s great, but we’re not really the point. Hence his disregard for semantic understanding other than his own.

That’s a defensible (if self-indulgent) position for a literary stylist, but as a general philosophy of usage and meaning its authoritarianism literally defies comprehension.

*

Clearly Hart has a magnificent vocabulary, and revelling in language is a fine thing to do. And yet… of the seven words in his article that I didn’t know (not counting passages quoted from elsewhere), four appeared in the first 5% of the text, and two more in the following 15%. The remaining 80% contained only one. Maybe that’s just chance, or maybe he got bored of searching for those exact meanings as the task of writing wore on – or maybe he deliberately front-loaded the piece with a display of exotica to dazzle the reader before settling down to the business of arguing more intelligibly for linguistic complexity.

Who can say?

But consider this: in his article he offers as examples of great prose ten passages from other writers, totalling nearly 1,200 words – only one word of which caused me trouble (cunctation, from Thomas Browne more than 350 years ago). His star witnesses prove that brilliance can dazzle without blinding.

And so does he. Let’s end on a positive note, with another passage of Hart’s that I loved:

Language is magic. It is invocation and conjuration. With words, we summon the seas and the forests, the stars and distant galaxies, the past and the future and the fabulous, the real and the unreal, the possible and the impossible. With words, we create worlds—in imagination, in the realm of ideas, in the arena of history. With words, we disclose things otherwise hidden, including even our inward selves. And so on. When you write, attempt to weave a spell. If this is not your intention, do not write.

So maybe the effect on the reader does matter after all. Why not see what effect the piece has on you?

Writing skills and grammar teaching: the misinterpreted study of Englicious

A teacher running an interactive grammar exercise (still from Englicious in the Classroom video)

A recent study, as you may or may not have heard, has found that teaching grammar to Year 2 children (age 6-7) does not improve their writing. But that’s not what it found.

The study, by researchers at UCL and the University of York, did not compare grammar teaching with no grammar teaching. It compared one particular programme of grammar teaching, called Englicious, with the grammar teaching that schools were doing already, and it found that Englicious produced results that were essentially no better than the other grammar teaching.

The research paper makes this very clear, although the conclusions section seems to stray a little into over-generalisation. And the news articles published by the two universities both lead with the generalised claim:

The teaching of grammar in primary schools in England (a key feature of England’s national curriculum) does not appear to help children’s narrative writing, although it may help them generate sentences, according to new UCL-led research.

UCL

Lessons on grammar are a key feature of the national curriculum taught in England’s primary schools, but they don’t appear to help children to learn to write, new research reveals.

York

This is the angle the media coverage has largely followed. (Note to journalists: always read the PDF.)

The background

Grammar teaching, which has become a bigger part of England’s national curriculum since 2014, is contentious. Some see it as providing the essential building blocks of literacy and good communication, while others think it a bewildering morass of jargon that has little relevance to how people really speak and write.

I broadly support grammar teaching in principle, although I don’t have strong views on how much to teach at particular ages and I can’t claim any expertise on teaching methods. But I have liked the idea of Englicious since I first heard about it.

Englicious is a set of resources to help teachers run classes on grammar in line with the curriculum. It aims to address two of the big worries about grammar teaching: that it’s dull and off-putting, and that the theoretical knowledge of grammar is disconnected from children’s actual writing skills. Its approach is strong on interactive exercises, trying to incorporate some fun, and it links grammatical concepts with practical writing work. This sounds to me like a great idea.

The study

To test it, the researchers recruited primary-school teachers and put them into two groups: one group used Englicious with their Year 2 classes, after an introduction to the resources and training in how to use them; and the control group taught grammar using the various approaches they were already using (this is the crucial detail that most of the coverage has obscured).

The research paper highlights one distinctive feature of the Englicious approach:

One key difference between intervention and control schools… was that the Englicious lessons consistently included an opportunity for pupils to apply their learning through an independent writing activity that was part of the Englicious lesson. It appeared that this was not a typical approach in every lesson observed in the control schools. In the control schools a wide range of teaching strategies was seen being used to support learning about grammar, for example general approaches to grammar teaching that included using a text to contextualise teaching of grammatical terms and their properties; teacher-led strategies including deliberate inclusion of errors when presenting texts; whole-class activities including discussions while pupils were sitting on the carpet; and use of mini whiteboards for pupils to write sentences…

Because Englicious was designed to link grammar to writing, the main way the researchers assessed its effect was through a narrative writing test, in which pupils had to create a narrative based on a prompt. They also used a sentence generation test, in which pupils were given two words as a prompt and had to generate sentences using them both – a task more focused on grammatical understanding.

The findings

Looking at the test scores from before and after a ten-week period of teaching, the study found that Englicious had no effect on the pupils’ narrative writing scores relative to the control group, and that it may possibly have improved sentence generation scores a little, but this difference was not statistically significant (p=0.25).
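(If “not statistically significant” is unfamiliar territory, here’s an illustrative Python sketch of the general idea – comparing pre-to-post score gains between an intervention group and a control group. The gain scores are invented for the example; this is not the study’s data or its actual statistical model.)

```python
# Illustrative only: invented gain scores, not the study's data or analysis.
# Shows how a small difference in mean gains can fail to reach significance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical gain scores (post-test minus pre-test) for each pupil.
control_gains = rng.normal(loc=2.0, scale=3.0, size=60)
englicious_gains = rng.normal(loc=2.4, scale=3.0, size=60)  # slightly higher mean

t_stat, p_value = stats.ttest_ind(englicious_gains, control_gains, equal_var=False)
print(f"mean gain (control):    {control_gains.mean():.2f}")
print(f"mean gain (Englicious): {englicious_gains.mean():.2f}")
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.2f}")
# A p-value well above 0.05 (like the 0.25 reported for sentence generation)
# means the observed difference could easily be down to chance.
```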

The researchers gamely describe the second finding as “encouraging”, although I think “disappointing” would be a fairer assessment. Englicious may have been slightly helpful with actual grammar teaching (more research is needed), but it failed in its primary objective.

This doesn’t mean the approach is worthless: it seems to be at least as good as other current methods; it may have other benefits that weren’t covered by the two tests; and it might help pupils’ learning to last for longer than this ten-week study. And the teachers who used Englicious gave largely positive feedback on it in questionnaires, saying that the lessons were a positive experience for both them and their pupils. That counts for something. They also made some suggestions that could improve Englicious.

The confusion

But there’s no escaping the fact that the study didn’t find the main desired effect. And here, perhaps, is where the widespread misunderstanding of the findings was born. The paper says:

The lack of effect on narrative writing is the main outcome of our research, and is consistent with previously published studies on grammar and writing at primary education level.

These older studies, though, didn’t look at Englicious or the post-2014 English curriculum. The general thrust of the previous research, as this paper summarises it, is that grammar teaching, on the whole, shows little if any sign of improving pupils’ writing. To further support that conclusion, you would have to compare grammar teaching (of some kind) with no grammar teaching, not compare different kinds of grammar teaching.

One sentence on the paper’s methodology acknowledges this problem:

The context of England’s national curriculum requirements meant that it was not feasible to have a control group that did not have any grammar teaching, a control that some would regard as a better comparison.

That last bit seems quite an understatement.

In light of this problem, I struggle to see how the following conclusion is a fair reflection of a study that compared different methods of teaching grammar:

The research found that seven-year-old pupils’ narrative writing was not improved as a result of grammar teaching.

I suppose the researchers could say “We gave it a really good shot, but all we’ve got is another way that doesn’t work; this suggests that the whole idea of connecting writing skills to grammar teaching is doomed to failure.” But it’s clear that they think the Englicious approach has room for improvement.

The upshot

I wish the Englicious team the best in further developing their work. Their approach has merit – it’s just not yet clear how much. And I agree that there are broad grounds for concern about how grammar teaching can improve writing skills.

This was a valuable and well-conducted study – the first randomised controlled trial in the world to examine this topic. But I think it has been over-interpreted: in parts of the conclusion and the university press releases, and in media reports that don’t look too closely at the detail.

Grammar teaching is politically fraught. For many people, it taps into notions of authority, discipline and tradition versus liberalism, diversity and modernity. Debates on it can get pretty heated, with people often falling back on ideological preconceptions. So if we want to improve how grammar is taught, we all need to be clear about what the relevant evidence does and doesn’t show.

Bad grammar, bugbears and dæmons

Dafne Keen as Lyra with Pantalaimon, from the BBC adaptation of Philip Pullman’s His Dark Materials.

Susan McDonald, an experienced subeditor at the Guardian, has written an article that appears to be about grammar and usage but is really about everyone’s favourite topic: how annoying other people are.

McDonald doubtless knows more than a thing or two about whipping ungainly sentences into shape. Her daily work involves tweaking punctuation, replacing clichés, shepherding stray verbs towards their subjects, and making all sorts of other small changes that smooth the path from the writer’s brain to the reader’s.

But she says she doesn’t nitpick for the sake of it, instead using common sense to decide when rules can be bent. I agree with that as a broad principle, but the thing about common sense is that it’s never as common as you think. What strikes one person as sensible flexibility will strike another as sloppy inconsistency; one person’s high standards will be another’s restrictive dogmatism.

McDonald gives some examples of things that definitely do matter (to her):

Some of my personal bugbears come up more regularly than others.

“Compared to” is different from “compared with”. Use the first if you want to liken one thing to another, the second to contrast.

And that reminds me: in my book “different from” is correct, “different to” and “different than” are not.

“Who” and “whom” are both needed but not interchangeable. The same goes for less/fewer, like/such as and imply/infer.

As a copyeditor, I think I would be absolutist about only one of these six. For moderately formal pieces, I’d probably apply three or four more of them across the board and the other one or two depending on context.

But I would also usually avoid using ‘regularly’ to mean ‘frequently’, as McDonald does here – so on that point I’m more of a stickler than her. And there are people who would scorn both of us for beginning sentences with ‘and’.

There’s no objective way of telling which ‘rules’ are the ‘correct’ ones. Any of us can talk about what’s right ‘in my book’ – but each of our mental rulebooks is different.

Some people respond to these differences by saying that the safest thing to do is always follow all the rules – that way, you won’t annoy anyone. But a lot of these (alleged) rules are, to put it politely, stupid. Picking every conceivable potential nit would be enormously time-consuming and make prose awkward, stiff – even annoying. McDonald rightly points out that, for instance, split infinitives and sentence-ending prepositions often produce better results.

A lot of these decisions are judgement calls. You have to think about audience, medium and desired effect. You have to keep abreast of how people are using the language and how they expect it to be used. You have to estimate which fine distinctions are too fine to be reliably clear, and you have to have a nose for what might be likely to cause a stink.

You also have to remember that the people who complain about ‘broken rules’ are far louder than those whose reading is eased by a certain breakage – but that doesn’t mean the loud ones are anything like a majority.

Sometimes there’s no right answer. Language isn’t like mathematics; it’s like life.

McDonald describes her linguistic gripes as bugbears, and many people talk semi-fondly of having pet peeves, but really these attitudes are more like Philip Pullman’s dæmons – they’re aspects of ourselves. They are changeable during childhood but become fixed as we grow up. They might cause us annoyance, but they are a dear, cherished part of who we are, and any attempt to separate them from us causes terrible pain.

The last line of McDonald’s piece is:

Language reflects – and can even define – who we are. So a little respect, please, for its rules.

It’s not just language but also our attitudes to language that are part of our identities. But the ‘rules’ we get the most righteously angry about don’t belong to language in itself. They belong to our personal conception of it. And when we meet someone whose internal rules are frustratingly different, we have two options: banish their dæmon or pacify our own.

Neither is easy.

Why can’t the English learn to speak about English?

Rex Harrison and Audrey Hepburn in My Fair Lady

Clare Foges has written a passionate column in the Times about the dangers posed by linguistic prejudice.

She highlights evidence of discrimination against people, especially young people, whose speech doesn’t fit in. Civil servants who lack the “right accent” are less likely to get promoted, even if they do good work; in industries from law to accountancy, those who don’t sound middle-class enough are less likely to be hired, even if they have good grades; and many professionals give lower ratings to answers delivered in certain varieties of English, even when the content of those answers is good.

As Foges says, people who speak dialects other than standard English “are surrounded by invisible barriers to success, yet we as a nation are too squeamish to say anything about it”. I commend her for raising this injustice, and for her directness in saying that “We do young people no favours by pretending that the way we speak doesn’t matter any more, because it does.” Collectively, we do need to get better at talking about our language and the role it plays in society.

The current situation isn’t just bad for those who are looked down on because of the way they talk. It’s bad for all of us. It means that the legal system is needlessly denying itself valuable talent. It means that the accountancy firms that businesses rely on are carelessly tossing out some of the best recruits available. And it means that we, the taxpayers, are getting poor value for money when mediocrities get ahead in the civil service while more capable public servants are overlooked.

How, then, can we combat this economically and socially ruinous linguistic prejudice? Foges has a simple solution: elocution lessons.

Wait, what?

Yes, she wants to give young people “speech coaching”, to endow them with the diction of the middle classes and the grammar of standard English. The best way to deal with prejudice, it seems, is to eliminate its object.

She briefly considers an alternative proposal – that linguistically prejudiced people should “challenge their biases” – but she isn’t interested. Because, you see, she agrees with them that standard English is better than other dialects. Here’s her argument:

Standard English is best because (the clue is in the name) it is the standard, with rules the vast majority understand. It is the medium through which writers and speakers of the language can achieve maximum clarity and minimum confusion. This is why deviating from it can grate. If people speak sloppily, mangling their grammar and failing to enunciate their words properly, language turns from a window between souls into a wall between them — and swiftly, subconsciously, we label the speaker.

In the spirit of diplomacy, I will try to meet Foges partway on this.

I agree it is important for children to learn standard English – but not because it’s better than other dialects or more precise or more expressive. It’s useful to know because it’s widely used in the public sphere, in business, in academia… in the kind of professions traditionally dominated by people who grew up in well-to-do families that speak standard English.

But standard English didn’t achieve its high social status in some merit-based competition, beating other dialects because of its intrinsically superior grammatical conventions and vocabulary. It got where it is because of William the Conqueror.

Before 1066, the capital of England was Winchester, which had previously been part of the Kingdom of Wessex. Back then, regional differences in English were much bigger than they are today, but the West Saxon variety – spoken in Winchester, the seat of power – looked set to become top dog. After the Norman Conquest, though, the capital was moved to London. The locally spoken Mercian became the high-status dialect, and out of it grew what we now call standard English.

While I bear our friends and neighbours in France no ill will for that ancient act of aggression, I cannot fathom why we in England should let a long-dead Frenchman determine which variety of our own language is best.

Standard English is only standard because of an accident of history. And different dialects are not “deviations” from it, they’re not “sloppy” or “mangled” attempts to speak it: they’re just different. What they lack is not clarity but prestige.

Yes, kids should learn standard English, but they should also learn that English is a family of dialects, related to region, class, and more recently ethnicity. All of these dialects change, including standard English; they influence each other and their borders overlap. (Foges complains about the adoption of Multicultural London English by “teenagers in country towns who desperately want to appear cool”, but changing the way you talk to make yourself come across a certain way is exactly the policy she recommends.)

But if we want to reduce the ill-effects of linguistic prejudice, there’s another side of the coin. Those of us who are perfectly at home with standard English should be careful how much importance we attach to whether others speak like us, especially if we’re in positions where we can help others to advance.

I am an editor. It is literally my job to improve other people’s writing – pretty much always in standard English. I fix typos, I substitute words, I tweak grammar, I rejig paragraphs, trying to help my colleagues come across as clearly and effectively as possible. What I absolutely do not do is judge those of them who are less comfortable than me with the niceties of standard English.

I’ve worked with a lot of people who are good at all sorts of things – and some of them not so good – but I’ve never noticed that their skills and commitment have any correlation with their dialect, accent and enunciation.

So this attitude described by Foges is a lamentable mistake:

When someone says “could of” instead of “could have”, or “pacifically” instead of “specifically”, or “froo” instead of “through”, they are labelled. The interviewer labels them “not sharp enough”. The colleague labels them “not up to it”.

It’s not quite clear whether she shares this attitude herself, but it’s damn clear that she’s not going to do anything to oppose it.

Foges ends by quoting Henry Higgins talking to Eliza Doolittle:

Think what you’re dealing with. The majesty and grandeur of the English language, the greatest possession we have. The noblest thoughts that ever flowed through the hearts of men are contained in its extraordinary, imaginative, and musical mixtures of sounds.

But Eliza’s English is no less English than Henry’s. If all our speech were standardised, those mixtures of sounds would be so much less extraordinary, imaginative, and musical.

Fronted adverbials: what the hell is going on with English grammar?

Every so often a kerfuffle erupts about the teaching of grammar in English schools, and the focal point these days is often the ugly term “fronted adverbials”. What on earth are these obscure things, people wonder, and why are young children being forced to learn about them?

Two daunting words, one simple concept

Let’s start with “adverbial”. It looks like an adjective, doesn’t it? “Of or relating to adverbs”, that sort of thing. But, unhelpfully, it’s actually a noun. An adverbial does the same job as an adverb (modifying a verb or a clause, typically to express manner or time), but it can consist of more than one word.

And “fronted” means it’s at the start of a sentence or clause. That’s not too tricky to grasp, but we hardly use “fronted” to mean that in other contexts, so the phrase feels kind of strange.

Here are some fronted adverbials:

  • Cheerfully, I bit into the apple.
  • Yesterday evening, they went out.
  • Further along the road, a cat was sitting on top of a car.
  • When the going gets tough, the tough get going.

It’s not nearly as hard as it sounds. Whichever linguist came up with the term needs to be docked a week’s pay, but a fronted adverbial is a pretty basic grammatical device that we all use every day.

(The term dates back to the 1960s, but it’s only recently escaped from academia to bother the public at large.)

Well, we never needed to be taught that in my day

Most of us have been confused by why this new term is being taught in primary schools when we as adults have managed perfectly well without it.

I suspect that kids may not find this kind of novel jargon nearly as discombobulating as we adults do. When you’re at school, you expect to learn new things, to be taught new words for new ideas, all the time. “Fronted adverbials”, for all its unintuitive awkwardness, is just another to add to the list. But as adults, we like to think we already know what’s what.

It can be uncomfortable to find your children learning things that you don’t know, but that doesn’t necessarily mean that the new teaching is needless or too advanced, or that you are stupid. It just means that teaching has changed – in this case, because it’s become informed by a better understanding of English grammar.

(One of my nieces, aged five, impressed me by introducing me to the word “digraph”. It means a pair of letters that combine to make a single sound. So in the word “shoot”, “sh” and “oo” are digraphs. Simple. But the term itself is baffling if you don’t already know it.)

I’m neither a parent nor a teacher, and I don’t have a view about how old children should be when they meet adverbials and their fronting. I would say, though, that this concept should be taught not just as a term to memorise, with bland exercises that require kids to identify or create example sentences on demand. Fronted adverbials are a way of shifting emphasis, of re-ordering information, of changing the rhythm of a sentence. Children should be guided to try out using them, or not, in passages of writing to see the different effects that result. That’s how a knowledge of grammar really helps to improve your writing ability.

And yes, you can achieve that without knowing that there’s this specific term, but having a term can help you identify and think about the concept more tangibly than just “that bit at the end could go at the front”.

The theory can help with the practice. You can walk and run and jump without knowing anything about the muscles and bones in your legs, but if you want to train to get really good, or to deal with the risk of injuries, it’ll help to know a bit about what’s going on inside.

Fronted adverbials are only part of it, though. There’s a lot more grammar in the curriculum than there was in my day, and the grammar taught nowadays is quite different from the grammar that was taught in my parents’ day.

Let’s have a bit of a history lesson.

The decline and fall of the Roman linguistic empire

Social climbers, in their yearning for prestige, often adopt the language of the upper classes – often unconvincingly. The same thing happened to English grammar itself.

A rough, common tongue that grew up on a rainy island on the outskirts of Europe eventually came of age, and it realised that despite its growing literary prowess it still lacked respectability. So it modelled itself on the noblest and most distinguished language of all: Latin.

Sadly, some of the concepts and categories and distinctions used in Latin grammar didn’t fit English very well, but enough of the literati and educational establishment still insisted on adopting them. For this act of vanity, generations of schoolchildren have suffered.

Roman rule in Britain ended before the (Old) English language had even taken hold. But, a millennium later, Rome’s dead language colonised our understanding of our own. It ruled, ineptly but firmly, for a few centuries. Then, after a string of revolts, Latinised English grammar was finally overthrown in the 1960s, and – after a few decades of anarchy – a more authentic understanding of English grammar is now being applied in classrooms.

The new regime has had no shortage of teething problems, and sometimes its proponents implement their concepts with heavy-handed revolutionary zeal. It also has to contend with nostalgics, counter-revolutionaries, and people who just don’t want to have to think about this stuff.

(For a more thorough and less metaphorical telling of this history, try David Crystal’s book Making Sense: The Glamorous Story of English Grammar.)

The aversion to “fronted adverbials” is part of this. It’s a sound concept but it comes across badly. While the theory of grammar is in a much better place than it used to be, it will take a while to figure out how best to present the ideas in schools.

But what’s the point of talking about fronted adverbials?

This morning Michael Rosen, who abominates the current state of grammar teaching, wrote:

Just last week I was privileged to record a radio programme about writing with one of the great modern writers, Hilary Mantel. We talked about the sound and rhythm of sentences, the struggle to find the right word, the shaping of a paragraph so that it sets a scene before introducing a character, and much more. We talked for nearly an hour and we did not mention a fronted adverbial once.

The shaping of a paragraph so that it sets a scene before introducing a character. That’s exactly what Rosen does here, and he uses a fronted adverbial – “just last week” – to do it.

They’re worth knowing about. I just wish they had a better name.

How do you cope when everyone’s usage is wrong?

The remarkable thing about language change is that it only started happening when I started noticing it. For centuries, English was constant and true, but as soon as I was old enough to have an appreciation of good standards of usage, people around me started falling short. Since then, there has been an alarming, unprecedented surge in rule-breaking.

Neither I nor anyone else really believes any such thing, of course, but some of us sometimes talk as if we do. One such person is Lionel Shriver.

In an article in Harper’s, she wages war on what she calls “semantic drift”. Using the rhetorical style that’s obligatory for such pieces – mock-theatrical (and therefore deniable) moral horror – she rails against “decay”, “degeneration”, “blight”, “barbarism”, “mob rule” and the replacement of “civilised” with “contaminated” English at the hands of “animals”. Shriver’s a fantastic writer, but this kind of thing is just tiring.

The substance of this linguistic apocalypse is, as she sees it, the ignorant modern misuse of words such as literally, nonplussed, notorious, performative and enervated, and the blurring of distinctions such as less/fewer, as/like, who/whom and that/which.

On some of these, I think she has a point. While it’s unlikely anyone will be genuinely confused by “My head literally exploded”, the near-opposite meanings that nonplussed now has make it hard to use reliably. And it’s handy, even if only for formal occasions, to know how to whom. The that/which distinction, on the other hand, is needless. Most Brits (and a good many Americans) are indifferent to it, with no ill effects.

But Shriver’s examples of “semantic drift” also include grammar (flat adverbs and pronoun case) and punctuation (comma splices and indiscriminate dashes), so I guess the word semantic is drifting quite a bit too. She also makes it drift to include pronunciation, claiming that “‘flaccid’ is actually pronounced ‘flak-sid’”. In light of usage, which she accepts is almost entirely ‘flassid’, the meaning of actually must have drifted as well.

OK, that was cheap snark. But it gets us to the heart of the matter: what determines the actual rules of English?

There’s a view that the rules are wholly independent of the usage of English speakers, that the theory is what’s real and true while the practice is at best an approximation and more often a travesty. On this view, usage is evidence of nothing other than failure and corruption.

Nobody really believes any such thing, of course, but some of us sometimes talk as if we do.

The Good Book or the guidebook?

Shriver was raised as a language stickler, and the pedantry she inherited from her parents she reinforced at school. But for her, the ultimate authority, the guide to “official” English, is The Dictionary. She knows that she is fallible – her parents too – and is willing to take corrections when appropriately justified:

Hence when the copy editor on my first novel claimed that there was no such word as “jerry-rig,” I was incensed. Determined to prove her wrong, I went to my trusty, dusty-blue Webster’s Seventh (based on the august Webster’s Third), only to find she was right: “jerry-rig” wasn’t listed. Apparently I’d grown up with a garbled portmanteau of “gerrymander,” “jerry-build,” and the word I really wanted: “jury-rig.” The scales fell from my eyes.

A convert, I explained to my mother her lifelong mistake, but she was having none of it. “Oh, no,” she said gravely. “‘Jury-rig’ refers to rigging a jury, which is very serious.” Explaining the allusion to a “jury mast,” a makeshift sail, with no etymological relationship to a judicial “jury,” got me nowhere. It’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.

But there’s a twist: nowadays, dictionaries list the “incorrect” spelling as standard. “The mob – and my mother – have won.” Shriver, though, isn’t going to budge. Even though recent dictionaries now align with the way most people spell it – and the way Shriver herself long did – she has found her truth and she’s sticking to it, with the zeal of a convert whose prophet has snuck off to the pub.

For Shriver, a dictionary should be a rulebook of almost scriptural immutability. She wants usage to adhere to the rules that she spent time and effort internalising; any deviation, whether by the ignorant masses, by trendy literati or by dictionaries themselves, is to be fought.

The better way to view a dictionary is as a guidebook. It describes the features of the language as you’re likely to encounter it, and it thereby helps you find your way around. To do this, a dictionary needs to record differences in usage and it needs to be able to change.

Don’t just take my word for it, though.

Shriver’s “trusty” Webster’s Seventh New Collegiate Dictionary is a 1963 abridgement based on the “august” full-length Webster’s Third New International Dictionary, which came out in 1961. The Third was not seen as august at the time. In fact, it outraged many contemporary sticklers, who were appalled by its permissive, descriptivist approach. In the preface (the bit that nobody reads), its editor, Philip Gove, wrote that “a definition, to be adequate, must be written only after an analysis of usage”. He concluded:

This new Merriam-Webster unabridged is the record of this language as it is written and spoken. It is offered with confidence that it will supply in full measure that information on the general language which is required for accurate, clear, and comprehensive understanding of the vocabulary of today’s society.

Today’s society. As a new dictionary, it paid no heed to the aggrieved traditions of yesterday’s sticklers. And Gove knew that his work – his guidebook – would have a shelf-life. He knew that some of the language his team mapped would change in years to come. He wouldn’t have wanted the book to be treated as scripture almost six decades later.

But that scripture is what Shriver grew up with. That book formed part of the fundamental order of the world as she was honing her command of English, so it’s understandable that departures from it seem like creeping anarchy, like the destruction of something precious – like a “bereavement”, even.

Each generation thinks it invented language change

Maybe I can offer a scrap of consolation. Despite Shriver’s fears, language change definitely isn’t her fault.

Noting that she is more liberal than her father on some matters, such as the meaning of decimate, she says: “my own generation probably instigated this decline in the first place”.

Not guilty. Decimate slipped the bounds of “reduce by one-tenth” to start meaning “destroy a large part of” as early as 1663.

And some of the recent changes that make up her bugbears are not that recent:

  • Notorious, Shriver says, doesn’t just mean “well-known”. But the word dates back to the 15th century, when it originally meant exactly that. Over the years it acquired negative connotations, and for a long time it has mostly been used negatively – but only mostly.
  • She deplores the modern use of quicker as an adverb. But here’s Tennyson in 1865: “Nature… on thy heart a finger lays, Saying ‘Beat quicker’.” Adverbial quick has been in constant use since 1300 – informal, but hardly disreputable.
  • Performative is a term in linguistics, relating to utterances that enact what they state: “I promise”, “I warn you”, “I apologise”. Nowadays most people use it to mean “relating to performance”, but the correct word for that, she says, is performatory. In fact both words have a patchy history. JL Austin coined the technical sense of performative in 1955, but for several years before that he had been using performatory that way. For the performance-related meaning, performative goes back half a century earlier and is the norm today. Despite Shriver’s pessimism, the word’s linguistic meaning is alive and well too – among linguists. Many words comfortably carry more than one meaning, depending on context. We don’t need performatory and we shouldn’t mourn it.

Change didn’t begin with the baby boomers. It’s always been happening (and people have always been complaining about it). The rules Shriver grew up with were simply the customs of the day – some ancient, some much newer. Most are still in place, but the changes stand out. And even when the changes aren’t changes, the realisation that many or most people don’t follow your preferred conventions can be disconcerting.

Our language is part of our culture, our identity. We like things to be done our way, and we like to think that our way has some objective, enduring superiority. So yes, it’s fascinating how ferociously people will cling to their abiding linguistic assumptions, however wrongheaded.