It’s the etymology, silly

I was in Düsseldorf airport last week and I overheard two American travelers sharing their experience of learning German in Germany. Eventually they landed on the language’s case and gender system. Morphology always makes a good conversation topic.

“I would understand if there was any logic to it. But it makes no sense”, said one of them.

He was referring to why some German nouns are masculine, some feminine, yet others neuter. This is a big source of confusion for learners, whose expectation of patterns is constantly disrupted by exceptions to the rule, to the point that the rules lose the force they ought to have in memorisation and retention.

But the sad truth is that it does make sense, and that there is logic to it. It is not the logic of the patterning textbooks propose to us, but the logic of origin and evolution.

German gender is but a drop in the ocean of linguistic defeat-the-ruleism. Take English. Why should the word ‘lamb’ have a silent ‘b’, or ‘salmon’ a silent ‘l’? And why should Grosvenor Square be pronounced ‘Grovener’?

The answer is etymology. Words have a history; they evolve like societies, like cultures, like kin. They are born and they change, growing into other forms. But somehow they also stay true to their source. They reflect their parentage. And it makes sense if we care to check. To probe, to look more closely.

Words, like people, come from somewhere. The English word ‘lamb’ comes from the same Proto-Germanic word which produced the Dutch ‘lam’ and the German ‘Lamm’, both devoid of the b-sound. Salmon comes, of course, from British waters, but as a word it is of Old French origin, ‘samoun’, where the l-sound of its Latin ancestor ‘salmo’ had already fallen off. Grosvenor is also French, from ‘le gros veneur’ (the ‘s’ being silent in both Old and Modern French), the fat huntsman, an epithet bestowed on Hugh Lupus, one of William the Conqueror’s aides, who also bore the title ‘Le Grand Veneur’, the master huntsman.

The roots often have the key. But we don’t always play in that key.

‘Teste David cum Sibylla’

The Delphic Sibyl on the ceiling of Michelangelo’s Sistine Chapel

Western culture is one of the most hybrid of all modern cultures. At its root, a mix of ancient Greek tragedy, Roman pragmatism, Jewish narrativism, Christian messianism, Germanic heroism, Celtic supernaturalism, Northern European puritanism, English constitutionalism, German idealism, French revolutionism, Italian non-me-ne-fottoism. The list goes on. The hundred-headed monster whose children we have become.

The Catholic Requiem, or Mass for the Dead, contains something so synthetic that it often goes unnoticed.

The Dies Irae:

Dies iræ, dies illa,
Solvet sæclum in favilla:
Teste David cum Sibylla.

Literally translated:

The day of wrath, that day,
will dissolve the world in ashes:
this is the testimony of David and of the Sibyl.

Made famous by Mozart and Verdi, the sequence was composed in the 13th century and has been variously attributed to two Italian clergymen.

The European 13th century was a time of effervescence and cultural synthesis. A time of weaving all the available threads into the cultural DNA helix which the early modern world inherited.

Teste David cum Sibylla. This is the testimony of David and of the Sibyl. David of the Old Testament and the prophetess of the ancient Greco-Roman world. It is no coincidence that the author of the Dies Irae juxtaposed two of the most prominent figures of medieval culture, the archetypical king and the mysterious soothsayer. Male and female.

For most of the Divine Comedy, Dante also juxtaposes Christian and pagan figures in his encounters with the souls in Inferno, Purgatorio and Paradiso. Virgil vs Beatrice; Cato, Statius, Ulysses and the Emperor Trajan vs Adam, Thomas Aquinas, St Dominic and Bernard of Clairvaux.

The two threads run side by side in medieval culture, which makes it hard for us, reading the sources today, to gauge the extent to which the European medieval mind could see tension as well as convergence in the sources of the worldview it was crafting.

Patterns of trivialisation

D R Thomson, Lost in the wasteland

Our post-industrial cultures have been shaping and exploring various patterns of trivialisation.

Trivialisation of friendship.

Trivialisation of worth.

Trivialisation of sacredness.

Of heroism.

Of freedom and liberation.

Of authority.

Of government and citizenship.

Of living and dreaming.

At the heart of these patterns, there is a strong urge for simulacra and impersonation, metaphor gone wild, words without meaning, spaces of mental vacuum, disembodiments of various sorts, uncharming paradoxa, the beautification of the ugly, the dissolution of semantic coherence, the road leading nowhere seen from nowhere above.

Instead of trust, widespread suspicion, war declared on everything and everyone, claims of hope in the name of nihilism, the proposal of uprooted roots, and pervasive anxiety over a world seen as just not good enough, not constitutive enough for our giant dreams of denial.

More purity codes. The ongoing re-tabooisation of human landscapes, wastelands airbrushed green, the refusal to enter into dialogue for the cultivation of wisdom. The winner takes all.

The abyss has long ceased to stare back at us. It metastasizes across the land, as we retreat into pure subjectivity. On the other side, a locked door. It doesn’t occur to anyone to knock. There’s no-one inside anyway.

There’s plenty to lament, but even more to be grateful for. Plenty to despair over, but more to embrace.

The body of text, the corpse of meaning

Barrie Tullet’s Valley of Disease (Inferno Canto 29 of The Typographic Dante)

No matter how much you exercise, how healthily you eat and how carefully you plan, death comes to us all. Verrà la morte e avrà i tuoi occhi (death will come and it will have your eyes).

It’s the same with words. Language evolved every which way before the advent of Academies, and it has continued to do so since, despite the attempts to set it in stone. Words rise and fall like empires, no matter how well enshrined in use and status. Meaning shifts despite all our efforts to either arrest or channel it. Languages don’t have a soul, but they certainly look as though they had a mind of their own.

Language is adaptive, it helps negotiate the agent-arena relationship so language users are best fitted to their environment. Describe, analyse, communicate. Words are crafted, but we are in turn crafted by them.

The Latin alphabet has barely changed in over 2,000 years, but the languages written in it have never ceased to change. Even when the words didn’t change, their meaning continued to shift. Morphological bodies standing triumphant over semantic corpses.

The excitement one experiences in the face of etymological insight, the wow moment as a word discloses its roots and its history, is also the uncomfortable realisation that form outlives content. Language may facilitate our being in the world, but taken historically, it may hinder it. Perhaps that is why all languages and meanings decline: so that we may be reminded that nothing really lasts, and that the only hope is to recognise this and strive towards a constant sense of epistemological renewal.

Leading out and leading in

To think that as one moves across the board nothing gets lost is quite unrealistic. Something always gets lost; pieces get captured. The question is: what’s the trade-off? Has the strategy worked? Has the outcome been achieved?

There is always a cost. Sometimes the cost can be computed, sometimes it can’t.

The cost of education goes beyond what the pupil or the state pays for it. The cost of education is the total cost of the cultural acquisition that led to what our education is supposed to stand for and achieve: the pursuit and cultivation of wisdom, the achievement of financial independence and support for kin and community, the responsible championing of the civilizational project of our species.

The ancient Romans collected all of these insights into the word they bestowed on us, one which has already been used four times in this post: education.

At the head of the Roman troops was the dux, the leader, a name, an office and a title which was derived from the verb ducere, to lead. The English word ‘duke’ which evolved out of it doesn’t even begin to do justice to the Latin root.

Leading was so pivotal to Roman culture, and the military played such an important role in the construction of Ancient Roman state identity that leadership was seen as the mainstay of society. Role-modeling and the celebration of memorable styles and acts of leadership shaped the way in which the Romans understood their history and addressed the challenges of their time.

But leadership was even more foundational than the military. The way to bring up a child in Roman society was to educere, to educate them, from the verb ducere, to lead, and the separative prefix ‘ex’, meaning ‘out of’. So to lead the child out of the wilderness of culturelessness into the life of a citizen. To bring them out of ignorance into the light of knowledge, skill and culture. Acculturation was understood as guiding the child out of whatever lies outside civilization, the barbaritas of those who can’t use the psycho-technologies of the wisdom tradition.

And the cost of ex-ducere? The schools of ancient Greece and Rome understood very well the value of education in the great insurance project of Mediterranean civilization. Grounded in role-modelling and centred around the spoken and written word – the words of grammar, rhetoric and philosophy, the gateway into effective and persuasive communication and access to the cultural capital of the past.

The limp confessions of a late bloomer

I was not an early reader nor, once reading became a feature of my mental landscape, an avid one.

As a child, I loved being read to, and I dazzled the adults around me with my capacious memory and ability to recite thousands of lines of poetry. But by the time I turned 14, I had forgotten all of them and I began to be a source of disappointment to the literary-minded grownups around my parents (and secretly to my parents as well).

Despite my early ‘prowess’, I didn’t discover the pleasure of reading until I turned 18 or thereabouts. But then, like a Pentecostal tongue of fire, it settled over me and hasn’t left me since. To this day, I feel mild anxiety whenever a few days have passed without me turning pages and caressing the book covers like the body of a beautiful woman.

But, to go back to my literate beginnings, nothing would have suggested my slow, very slow, progress in opening up to the world of books. I learned how to read and write quite effortlessly. I don’t remember the day I could say, yes, I can read, but I do recall the ease with which I came to master not just writing letters, but writing beautiful letters during what at the time was still known as the ‘calligraphy class’, one of the mandatory modules of primary school tuition.

I also recall teachers being quite fond of my reading diction, as I was often asked to read passages to my peers, something I did with panache. I enjoyed performing without being in any way literate minded.

When other pupils looked forward to their summer holidays, I dreaded them because of the daunting ‘summer reading list’, a list which in hindsight seems quite overambitious, of books the school asked us to read and summarise over the roughly three months of summer break. That, and not maths problems or dealing with bullies at school, taught me the value of strategising and innovative problem-solving, as I looked for ever more sophisticated ways of showing to my teacher and my parents (who were effectively the teacher’s minions at home) that I had read the books without ever reading them. I shared some of my stratagems with my mates, but most of the time I kept things to myself. Before CliffsNotes, Reddit, Blinkist or GPTs, this level of deception required serious brainpower. Looking back, I regret not using my powers for more lucrative projects later in life.

A friend of mine, whose name I shall not mention though this might bring him unequalled fame, could speak for hours quite convincingly about books he’d never read. He never taught me his trade, which meant that I had to find proprietary ways to deceive my tutors.

But all of this to say that despite all the resources poured into this kind of subterfuge, I was not drawn to the books whose synopses and digests I pieced together. I didn’t find any pleasure in abandoning myself to a universe that extended beyond my very present concerns, pleasures and sources of enjoyment. I didn’t care for drifting into make-believe, suspending disbelief and letting myself be guided through unsuspected worlds which might have a thing or two to teach or dazzle me.

That came much later, and with a vengeance.

The radical self

Past performance is no guarantee of future outcomes. An increase in quantity won’t lead to a surge in quality. More is not necessarily better, and good yesterday doesn’t mean better today. Yet our mind seems to be wired to believe that the arrow keeps flying and that time, rather than agency, is the real engine of historical change.

Moore’s Law and Raymond Kurzweil’s predictions are both the daughters of the myth of progress. Given more time, any challenge will be overcome. We may not have enough time, some cry out in the wilderness of eco-apocalypse, but otherwise time will solve it.

Every major historical and cultural innovation the West has produced – innovation relative to the rest of the world’s cultures – has tended to morph into its radical self. The ‘discovery’ of the human person endowed with rights and the sacrosanctity of human life, which the medieval West started and the Enlightenment completed, the hopefulness of the future tense as a break away from the laws and claws of historical necessity and inexorable cyclicity, the elevation and transfiguration of the marginalised and downtrodden which flies in the face of human evolution — these are all accomplishments which, as the West relentlessly pursued them, led to formidable ideas which came to dominate our culture. Chief among these, the myth of progress, which is really why we always talk about the things we haven’t got but fear/hope one day we might, whether it’s artificial intelligence, transhumanism, energy self-reliance, or something else.

The beginning of the end

Fellini’s Satyricon (1969)

It’s not a coincidence that the first European novel is also one of the weirdest pieces of literature.

Petronius’ Satyricon, a work of the 1st century AD, features gay threesomes, werewolves, twisted dinner parties, leather sex toys and cannibalism. It is a monument built on and around irony, but it is also a labour of pleasure, Petronius having certainly enjoyed writing it as much as we enjoy reading it. It is also a work of deeply distressing decadence. Its title, rooted in the genre of satire and no doubt advocating it, is completely powerless to conceal the sensibilities of an ancient Roman culture which felt itself, 2,000 years ago, already too old, in many ways obsolescent and decrepit. A kind of ancient postmodernism, the cynicism of the end of historical meaning, or of the meaninglessness of history, taking over those who had spent too much time around the centre of power and had seen everything culture had to offer.

We read the Satyricon with so much excitement that it’s hard to slow down and maybe arrest the narrative orgy to which the reader is invited and stare at a declining culture, a fading civilization whose only pastime worthy of investment is to ‘pastichize’ the values which had given Rome its strength and ensured its solid, though limited, success: honour, hospitality, curiosity, pietas, frugality and a disavowal of oriental affectation.

Mental breakthroughs

The calendar page for May in an 11th century manuscript from the Abbey of St Gall, Switzerland (St. Gallen, Stiftsbibliothek, Cod. Sang. 394). The medieval calendar integrated the ancient Roman day-week-month system with the Christian liturgical year.

There is a time to innovate and a time to fall back on the certainties of past achievements.

As many of us move fast and break things, while others wall off golden ages to be recovered and restored, it’s quite useful to look back at an age, perhaps the only age, that tried to embody the ancient Roman symbol of Janus, the god of beginnings and transitions whose double-headed figure looked both backwards and forwards in time.

The medieval period was many things, too many things, which is why most people, who didn’t sign up to a lifetime devoted to understanding the most misunderstood part of European history, are so prone to misrepresent it.

The medieval mind was not unlike our modern mind, but in at least one way it was very different. It consumed itself with a project of convergence and reconciliation. It was eaten up by a desire to harmonize the Greco-Roman ideas and ideals with the Christian values it found itself often incapable of synthesizing and incorporating into the culture from which Europe emerged.

On the one hand, there was Aristotle. On the other, there was Scripture. On the one hand, there was Stoicism, on the other, there was St Paul. On the one hand, there was ancient authority, on the other, there were the original ideas that scholars worked out. Few believed these ideas were reconcilable, but the project was pushed through, despite the odds, despite the apprehension.

For the medievals, a new idea was synonymous with an undesirable idea. The mind stayed faithful to its evolutionary adaptive patterns: what hadn’t been trialed was regarded with suspicion and alarm. What is the point of challenging a tool which had proved to be successful in the past? Unless the problems have been reframed, there is little use for novel ways. We see this approach in full swing among traditional and traditionalist societies and cultures which have little or no patience for changing their ancestral ways. The most modern of us find this framework unacceptable, but the most modern of us have themselves little patience for looking through the eyes of those whose understanding of their fittedness to their environment depends on reproduction rather than innovation.

The medieval mind tried really hard to break away from the evolutionary strictures of its own constitution. Nothing is harder than to make someone see things in a different way. So, by setting out on a convergent path of bringing the past and the present together, as for example the work of countless medieval scholars tried to achieve in the area of developing an accurate and culture-viable calendar (the conquest of time), the medieval mind forced out new ways of seeing, or simply the recognition that a new way of looking at things isn’t the threatening prospect it thought it would be. And this breakthrough, which took centuries to mature, broke out into a myriad other breakthroughs, which created the mental space we inhabit today.

‘You can’t sign an ebook’

I attended a book launch yesterday that my company hosted. The author introduced the book and then took questions from the audience. He had brought a few copies to sign and give away. There were not enough books for everyone who wanted one, at which point the author said, ‘Don’t worry, I have a QR code here; you can use it to download the book for free’.

‘Yes, but you can’t sign an ebook’.

There are many things you can do with an ebook that you can’t with a paperback. An ebook is more practical, can be read more easily, it’s cheaper and can reach more people. Under the laws of the market, an ebook should have displaced the hardcover for ever.

But this is not what we’ve seen. I know almost no one who would prefer a Kindle to a paperback. And not merely because they can’t get the Kindle signed: they won’t make the switch because the book is more than the text it carries.

Many worry that a next-generation AI will write the books so that humans won’t have to. The internet is buzzing with the creative capability of the likes of GPT-4, and we marvel at the heights already attained by an ultra-literate machine. But a book is more than the text it carries, and writing a book is more than the narrative it produces. It is an experience which transforms both the writer and the reader in ways that put both on a journey of self-discovery. Not the destination, but the road – and that is precisely what is missing from common understandings of creativity and machine learning and output.

We care so much about the result that we forget the process, which is what has afforded and enabled the insights that took us where we are today. Like in so many other areas, we dig a ditch around our remarkable achievements as a culture and move next door, thinking that the work is done. But the work is never done, not even close.
