A URL to respond with when your boss says "But ChatGPT Said "


Process World, Object-Oriented Mind


A couple of weeks ago I shared an attempt at visualizing framings and models, the basic tools of all human reasoning. I ended up with a kind of diagram that looks very much like good ole' UML.

Left: A "framing diagram" about a mental ontology (by me). Right: an example of a class diagram in UML.
Left: A "framing diagram" about a mental ontology (by me). Right: an example of a class diagram in UML.

UML stands for Unified Modeling Language, and it is a set of standardized diagram types designed to prototype and communicate the most common software concepts, from architectural structures to process sequences. There are many kinds of diagrams in UML, but one of the most popular is what's called "class diagrams", of which the right side of the image above is an example.

[Image: a tree of UML diagram types. 'Diagram' branches into Behaviour Diagrams (Activity, State Machine, Interaction — with Communication, Interaction Overview, Sequence, and Timing subtypes — and Use Case) and Structure Diagrams (Class, Component, Object, Composite Structure, Deployment, Package, and Profile).]
Genealogy of UML diagrams... as a UML diagram.

Class diagrams are based on the idea that you can divide the world up into things called "classes", which are basically general categories of things, like Person and Job. These classes have their own properties and behaviors, and can interact with each other. For example, a diagram might show that every object in the Person class can have at most one link to an object in the Job class. Define more classes and more interactions, and you can model basically anything you want with your code.
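
As a small aside, that Person-and-Job relationship translates almost directly into code. Here is a rough Python sketch (the names and fields are invented for illustration, not taken from any particular diagram):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Job:
        title: str
        salary: float

    @dataclass
    class Person:
        name: str
        job: Optional[Job] = None  # at most one link to a Job object

    # A Person can exist without a Job, or be linked to exactly one.
    alice = Person("Alice", job=Job("Gardener", 40_000))
    bob = Person("Bob")  # no Job: the optional link is simply absent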

This sounds exactly like what I've been calling framings and models here on Aether Mug! Except I wasn't talking about software but about the human mind.

When I first noticed this parallel, I thought I might have simply defaulted to a familiar thinking paradigm without even realizing it. A sort of unconscious borrowing. That is probably true of the inspiration and of many details but, the more I think about it, the more I believe the parallel runs much deeper than that.

The interesting question is why engineers trying to build better software programs would discover an ideal way to represent the human mind.

The Rise of the Object-Oriented

From the early days of computer programming onwards, code usually took the form of lists of instructions, like this:

  1. Start with a total of 0.
  2. For each price in a given list of prices, add that price to the total.
  3. Once you've gone through the whole list, return the final total number, which is the tally of all prices.
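
In actual code, that recipe might look something like this minimal Python sketch (the list of prices is invented for the example):

    def total_price(prices):
        """Procedural style: walk the list step by step, updating a running total."""
        total = 0
        for price in prices:
            total += price
        return total

    print(total_price([2.50, 4.00, 1.25]))  # prints 7.75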

This so-called procedural approach worked perfectly fine for small programs like that, but it ran into several issues for larger and more complex applications. The more moving parts you have, the more your procedural code becomes a spaghetti tangle of variables being passed around and hard-to-track operations. Often you wanted to do a very similar operation again, and you had to duplicate the same code with small variations, bloating the program. You had a high risk of losing track of the details and introducing more bugs.

What emerged as a solution to those problems was the object-oriented programming paradigm, or OOP. This was a remarkably different way to think about the code.

Instead of dry data being manipulated a step at a time, with OOP you identify the key entities (or classes), what their structure is, and how they behave. These classes usually have real-world names like Person and Job, and they are precisely what those UML diagrams (which historically came later) are meant to represent. A basic program of this kind would work like this:

  1. Define a class called ShoppingCart.
    1. The ShoppingCart starts with an empty list of items with prices.
    2. The ShoppingCart has the ability to add items with prices to it.
    3. The ShoppingCart has the ability to compute the total sum of the prices it contains.
  2. Create an object of the class ShoppingCart.
  3. Add three items to it with its predefined ability to add items.
  4. Return the total price of the ShoppingCart using its predefined ability to tally up its prices.
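
Translated into a minimal Python sketch (again with invented item names and prices), those steps might look like this:

    class ShoppingCart:
        """Object-oriented style: the cart owns both its data and its behavior."""

        def __init__(self):
            self.items = []  # starts with an empty list of (name, price) pairs

        def add_item(self, name, price):
            self.items.append((name, price))

        def total(self):
            return sum(price for _, price in self.items)

    cart = ShoppingCart()
    cart.add_item("apples", 2.50)
    cart.add_item("bread", 4.00)
    cart.add_item("milk", 1.25)
    print(cart.total())  # prints 7.75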

I think you will agree that this feels less abstract and dry than the procedural example, even though it is doing essentially the same thing. (The object-oriented program is still procedural at its core, because it indicates what needs to be done at each step, but it clumps things together differently.)

You're not just "adding X to Y" any more, you're using the "abilities" of objects to do things. Of course there is much more to the object-oriented paradigm than dividing things into classes, but it is this real-world intuitiveness that, I think, led to its taking over the software world.

The theoretical roots of OOP date back to the 1950s and 1960s, but it really began to spread in the late 1970s. The 1990s, when the technology supporting it had been refined enough, saw a tidal wave of hype and excitement about the object-oriented way of coding. Countless books were written about it, more and more languages supported it, and it was clear to most that it was the future of software, and that the purely procedural way of doing things was going extinct.

Why this popularity? Apparently, OO code felt easier and more powerful. There is a surprising amount of psychological and educational research from that era showing its strengths. Among other things, it makes people much faster at sketching out the initial structure of a program and imagining how it might work, and it gives experienced programmers more time to test and evaluate various approaches in practice.

In other words, OOP seems to tap into some of the pre-existing wiring of our minds better than the older approaches did. This, of course, was not an accident. The scientific notion that people tend to think in terms of discrete objects and agents with behaviors and interactions is routinely cited as one of the main reasons for the paradigm's success and one of its key strengths.

That's the theory.

Objects Disappoint

Today, the object-oriented paradigm is not considered to be "the future" anymore. Don't get me wrong, OOP is still wildly popular, being at the root of major programming languages like Java, C++, and C#. It isn't going anywhere, but it is certainly considered to be declining in popularity. Many of the top languages today support other paradigms either exclusively or as an option. New languages are appearing that return to the old procedural approach, or to a very different approach called functional programming (more on this below). What is causing this loss of popularity? I find this question to be very revealing about human nature.

As far as I can tell from the research on this question, there are two major categories of problems with the object-oriented view of code.

First, some aspects of OOP feel counterintuitive. A key feature of classes in OOP is the ability to create abstractions, that is, to define a "genealogy" of inheritance from more generic to more specific classes. For example, in a videogame where players battle monsters, the class Person may be defined as the "child" of a more abstract class called Being, which is also the parent of the Monster class. Being would be used to define all of the traits in common between all children, like the management of their hit points, their ability to move, etc. The sibling concrete classes Person and Monster, on the other hand, would each define characteristics specific to themselves only—the handling of user control for Person objects, for example, and autonomous AI behavior in the Monster objects.
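
A bare-bones Python sketch of that hierarchy (with invented method names, purely for illustration) might look like this:

    class Being:
        """Abstract parent: traits shared by every kind of Being."""

        def __init__(self, hit_points):
            self.hit_points = hit_points

        def take_damage(self, amount):
            self.hit_points -= amount

        def move(self, dx, dy):
            ...  # movement logic shared by all Beings would live here

    class Person(Being):
        def handle_input(self, command):
            ...  # user control, specific to Person

    class Monster(Being):
        def decide_next_action(self):
            ...  # autonomous AI behavior, specific to Monster

    hero = Person(hit_points=20)
    goblin = Monster(hit_points=8)
    goblin.take_damage(3)  # behavior inherited from Being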

In simple use cases with few moving parts, people have no trouble with such inheritance patterns. But when the problems get more complex, and the solution requires deep hierarchies and many kinds of interactions, people find it increasingly difficult to figure out the best way to define classes and their family trees. Indeed, they find it easier to think in the supposedly-outdated procedural style—despite its spaghettification risk!—than in terms of classes.

The second kind of problem people encounter with OOP is that it simply isn't always the best way to think about a problem. Some tasks are inherently about processes happening more than they are about entities doing things. For those tasks, research has shown that people actually perform better with the procedural way of programming.

These two obstacles often lead to more bugs, harder-to-debug bugs, and headaches with certain kinds of problems, for example those involving parallel and distributed computation. We'll come back to these observations later.

From the perspective of many software engineers today, OOP has failed to deliver the amazing promises it made forty years ago. I don't think many of those engineers are against it 100% of the time, but there is a general understanding that it is simply a tool among others, and that there are many important cases in which it is not the right one. Some of them may tell you that it very rarely is the right one.

Wait a minute. Wasn't OOP successful because it is grounded in the way we humans think—what I call framings, models, and their black boxes? Is there something even better taking its place?

A Humble Functional Answer

Another paradigm has recently been emerging in contrast to object-oriented programming, with a radically different approach. It is called functional programming and, as the name implies, it deals with functions, not objects.

A functional programming (FP) language would implement the shopping cart example as follows:

  1. Define a function that takes a list of numbers and recursively adds them up one at a time until it has gone through the whole list.
  2. Input a list of prices into the function.

Unlike OOP, functional programming lacks concrete objects that "exist" and "do", and instead works with pipelines through which data is transformed.

(The simple example above may look similar to the procedural case, but it has some key differences that avoid many of its usual pitfalls. Instead of giving a list of instructions like "do this, then do this, then that", with FP you define what transformations are possible, then you see what happens to the data that you put in. The functional mindset is like building an automated assembly line, while the procedural mindset is like training a team of (very fast) assembly-line workers with the steps they have to execute.)
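
For comparison, here is a minimal functional-style sketch in Python (a dedicated FP language would express this more idiomatically):

    from functools import reduce

    # A pure function: no objects, no mutation, just data in and data out.
    def total_price(prices):
        return reduce(lambda subtotal, price: subtotal + price, prices, 0)

    # The same idea written as an explicitly recursive definition.
    def total_price_recursive(prices):
        if not prices:
            return 0
        return prices[0] + total_price_recursive(prices[1:])

    print(total_price([2.50, 4.00, 1.25]))            # prints 7.75
    print(total_price_recursive([2.50, 4.00, 1.25]))  # prints 7.75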

In general, FP leads to fewer bugs, and it works better with distributed systems and parallel processing compared to the other paradigms. And I don't think this is a coincidence.

The real world is a continuous network of interactions, and the currency being transferred and transformed through it is differences (what some call "information"). Boundaries are in the eye of the beholder: the things we tend to see as separate objects aren't really that separate after all. From this perspective, it seems to me that the functional paradigm is a much better way to represent the real world around us: everything is transformation, everything is "pipeline", nothing persists. What we call the "state" of something is just a snapshot of an ever-changing process, even when it is changing very slowly.

Like most of these approaches, FP has a long history, but it only recently started going mainstream. Many big programming languages, like Python, JavaScript, and Ruby, provide the tooling to code in the FP way. Even FP-focused languages like Clojure and Scala have relatively small but passionate communities of users. It may seem that the functional way is the way of the future.

Sadly—because I love FP more than any other paradigm—I don't think FP is the future. Things are more nuanced than that.

Problem is, FP is hard! It is much less intuitive to design with than OOP, and arguably less so than even the plain procedural style. Thinking in terms of functions is not what we usually do in our heads. You don't generally look at a shopping cart and say, "that is a useful product-aggregation process right there, son." You look at the cart and say, "oh, a cart, the thing you put products in."

Our reason is fundamentally concept-based: black boxes with boundaries drawn as and when needed to make specific predictions. The virtual physics in our minds needs discrete building blocks to think about anything.

Even if I'm right—even if FP best reflects the world's incessant and borderless transformation of differences—functions are never going to be the only paradigm for software, and perhaps they will never even be the major one either. Our minds won't let them. Even saying "the function transforms the data" is describing a functional truth in an object-oriented language—English—where "functions" and "data" are black boxes. The same goes for "the world is a network." We just can't help it.

[Image: an n8n workflow of four connected nodes in a linear pipeline: 'Start: Every Hour', 'FTP: List Files', 'FTP: Download File', and 'Google Drive: Upload'.]
Low-code tools like n8n use the visual metaphor of a pipeline. They are functional in nature, transforming data at each step, but notice how the UI designers chose to show it as a sequence of 'node' objects, making the computation easier to grok for a human.

Living with the Tension

I love software for how it regularly leads to questions much deeper and more interesting than "how do I automate the checkout of my e-commerce website". On the surface it is "just" engineering, but it is really philosophy, psychology, and more rolled into one.

Remember that the first category of problems attributed to OOP had to do with abstractions and inheritance. This seems to be the part of the OOP method that least reflects our natural way of framing the world. While we do think "in objects", we always use them mentally in concrete, goal-oriented tasks, where the horizontal relationships between objects are what matters (e.g. "the person fights the monster").

To think about inheritance, on the other hand, you need to think "vertically", about the levels of abstraction of the categories involved ("a person is a being, and traits X and Y are shared by all beings"). We can do this—it is an important and useful effort—but it's not what our minds are best at.

The second kind of cognitive issue with OOP had to do with cases in which objects are just a bad fit for the problems. Here is where procedural and, even better, functional methods shine: not because they don't apply just as nicely in other cases, but because these are the circumstances in which the object-oriented fictions in our minds break down in the face of reality.

In short:

  • Object-orientation is declining in popularity and people see it as flawed not (mainly) because it doesn't reflect how our minds work, but because our minds don't work very well.
  • Functional code is not more popular, and perhaps will never be, not because it is a worse model of reality, but because it is alien to how our minds tend to work.

What we are left with is an awkward mixture of both paradigms, both in software and in life.

When programming, I'm afraid we can't hope for a single neat paradigm to rule them all, a way of designing code that is both intuitive and world-accurate. We (programmers) have to find the right tool for each case, learn to discern in which situations the cognitive benefits of thinking in objects outweigh the technical downsides of less-functional code, and vice versa. Even better, we can use both at the same time and trade elegance for pragmatism.

When reasoning in general, outside the narrow world of software, I think the same wisdom applies. Knowing the limits of your framings and the failure modes of your models can only be good for you.

It may be worth the effort to borrow ideas from visualization tools like UML to represent what goes on in our minds—especially what goes wrong. Make this intuitive process of framing everything into something more deliberate when necessary. Design your understanding of the world like an engineer designs his programs. ●


Aggressive bots ruined my weekend


On the 25th of October Bear had its first major outage. Specifically, the reverse proxy which handles custom domains went down, causing custom domains to time out.

Unfortunately my monitoring tool failed to notify me, and it being a Saturday, I didn't notice the outage for longer than is reasonable. I apologise to everyone who was affected by it.

First, I want to dissect the root cause, exactly what went wrong, and then provide the steps I've taken to mitigate this in the future.

I wrote about The Great Scrape at the beginning of this year. The vast majority of web traffic is now bots, and it is becoming increasingly hostile to have publicly available resources on the internet.

There are 3 major kinds of bots currently flooding the internet: AI scrapers, malicious scrapers, and unchecked automations/scrapers.

The first has been discussed at length. Data is worth something now that it is used as fodder to train LLMs, and there is a financial incentive to scrape, so scrape they will. They've depleted all human-created writing on the internet, and are becoming increasingly ravenous for new wells of content. I've seen this compared to the search for low-background-radiation steel, which is, itself, very interesting.

These scrapers, however, are the easiest to deal with since they tend to identify themselves as ChatGPT, Anthropic, XAI, et cetera. They also tend to specify whether they are from user-initiated searches (think of all the sites that get scraped when you make a request with ChatGPT), or data mining (data used to train models). On Bear Blog I allow the first kind, but block the second, since bloggers want discoverability, but usually don't want their writing used to train the next big model.
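
To give a sense of the idea, an allow/block decision of this kind can be as simple as matching user-agent strings. Here is a toy sketch (the listed strings are examples only and change over time; the real rules are more involved):

    # Toy classifier: allow user-initiated AI fetchers, block training crawlers.
    TRAINING_CRAWLERS = ("GPTBot", "ClaudeBot", "CCBot")  # data-mining bots
    USER_INITIATED = ("ChatGPT-User",)                     # per-request fetches

    def classify_ai_request(user_agent: str) -> str:
        if any(token in user_agent for token in USER_INITIATED):
            return "allow"  # bloggers want discoverability
        if any(token in user_agent for token in TRAINING_CRAWLERS):
            return "block"  # don't feed the next big model
        return "allow"      # not a known AI crawler; other rules apply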

The next two kinds of scraper are more insidious. The malicious scrapers are bots that systematically scrape and re-scrape websites, sometimes every few minutes, looking for vulnerabilities such as misconfigured WordPress instances, or .env and .aws files accidentally left lying around, among other things.

It's more dangerous than ever to self-host, since simple mistakes in configurations will likely be found and exploited. In the last 24 hours I've blocked close to 2 million malicious requests across several hundred blogs.

What's wild is that these scrapers rotate through thousands of IP addresses during their scrapes, which leads me to suspect that the requests are being tunnelled through apps on mobile devices, since the ASNs tend to be cellular networks. I'm still speculating here, but I think app developers have found another way to monetise their apps by offering them for free, and selling tunnel access to scrapers.

Now, on to the unchecked automations. Vibe coding has made web-scraping easier than ever. Any script-kiddie can easily build a functional scraper in a single prompt and have it run all day from their home computer, and if the dramatic rise in scraping is anything to go by, many do. Tens of thousands of new scrapers have cropped up over the past few months, accidentally DDoSing website after website in their wake. The average consumer-grade computer is significantly more powerful than a VPS, so these machines can easily cause a lot of damage without their operators ever noticing.

I've managed to keep all these scrapers at bay using a combination of web application firewall (WAF) rules and rate limiting provided by Cloudflare, as well as some custom code which finds and quarantines bad bots based on their activity.
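
To give a flavor of the activity-based part, a toy version of such quarantine logic is a sliding window over recent requests, something like this (the threshold, window, and keying are invented for illustration, not production values):

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    MAX_REQUESTS = 300
    recent_requests = defaultdict(deque)  # client key -> request timestamps
    quarantined = set()

    def should_block(client_key: str) -> bool:
        now = time.monotonic()
        if client_key in quarantined:
            return True
        timestamps = recent_requests[client_key]
        timestamps.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()
        if len(timestamps) > MAX_REQUESTS:
            quarantined.add(client_key)  # flag the client for blocking/review
            return True
        return False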

I've played around with serving Zip Bombs, which was quite satisfying, but I stopped for fear of accidentally bombing a legitimate user. Another thing I've played around with is Proof of Work validation, making it expensive for bots to scrape, as well as serving endless junk data to keep the bots busy. Both of these are interesting, but ultimately are just as effective as simply blocking those requests, without the increased complexity.

With that context, here's exactly what went wrong on Saturday.

Previously, the bottleneck for page requests was the web-server itself, since it does the heavy lifting. It automatically scales horizontally by up to a factor of 10, if necessary, but bot requests can scale by significantly more than that, so having strong bot detection and mitigation, as well as serving highly-requested endpoints via a CDN is necessary. This is a solved problem, as outlined in my Great Scrape post, but worth restating.

On Saturday morning a few hundred blogs were DDoSed, with tens of thousands of pages requested per minute (from the logs it's hard to say whether they were malicious, or just very aggressive scrapers). The above-mentioned mitigations worked as expected; however, the reverse proxy—which sits upstream of most of these mitigations—became saturated with requests and decided it needed to take a little nap.

[Chart: page requests over time.]

The big blue spike is what toppled the server. It's so big it makes the rest of the graph look flat.

This server had been running with zero downtime for 5 years up until this point.

Unfortunately my uptime monitor failed to alert me via the push notifications I'd set up, even though it's the only app I have with notifications enabled (see my post on notifications), and it even has critical alerts enabled, so it will wake me up in the middle of the night if necessary. I still have no idea why this alert didn't come through, and I have ruled out misconfiguration through various tests.

This brings me to how I will prevent this from happening in the future.

  1. Redundancy in monitoring. I now have a second monitoring service running alongside my uptime monitor which will give me a phone call, email, and text message in the event of any downtime.
  2. More aggressive rate-limiting and bot mitigation on the reverse proxy. This already reduces the server load by about half.
  3. I've bumped up the size of the reverse proxy, which can now handle about 5 times the load. This is overkill, but compute is cheap, and certainly worth the stress-mitigation. I'm already bald. I don't need to go balder.
  4. Auto-restart the reverse-proxy if bandwidth usage drops to zero for more than 2 minutes (a minimal watchdog sketch follows this list).
  5. Added a status page, available at https://status.bearblog.dev for better visibility and transparency. Hopefully those bars stay solid green forever.
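
For item 4, the watchdog can be something as simple as the following sketch (it assumes a Linux host, a systemd-managed proxy under a placeholder unit name, and /proc/net/dev as the bandwidth source; the actual implementation differs):

    import subprocess
    import time

    INTERFACE = "eth0"      # placeholder: the proxy's public network interface
    CHECK_INTERVAL = 30     # seconds between samples
    ZERO_LIMIT = 4          # 4 zero-delta samples in a row ~= 2 minutes

    def bytes_transferred(interface: str) -> int:
        """Total rx + tx bytes for an interface, read from /proc/net/dev."""
        with open("/proc/net/dev") as f:
            for line in f:
                name, _, rest = line.strip().partition(":")
                if name == interface:
                    fields = rest.split()
                    return int(fields[0]) + int(fields[8])  # rx_bytes + tx_bytes
        raise RuntimeError(f"interface {interface} not found")

    def main() -> None:
        last = bytes_transferred(INTERFACE)
        zero_streak = 0
        while True:
            time.sleep(CHECK_INTERVAL)
            current = bytes_transferred(INTERFACE)
            zero_streak = zero_streak + 1 if current == last else 0
            last = current
            if zero_streak >= ZERO_LIMIT:
                # Placeholder unit name; restart whatever runs the reverse proxy.
                subprocess.run(["systemctl", "restart", "reverse-proxy.service"])
                zero_streak = 0

    if __name__ == "__main__":
        main()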

This should be enough to keep everything healthy. If you have any suggestions, or need help with your own bot issues, send me an email.

The public internet is mostly bots, many of whom are bad netizens. It's the most hostile it's ever been, and it is because of this that I feel it's more important than ever to take good care of the spaces that make the internet worth visiting.

The arms race continues...


What is a Hallow, anyway?

All hallowe’en (1895), John Collier

Halloween. If you’re from North America, you likely grew up with this spooky “holiday” lurking at the end of October, ever-present in your childhood, like some ancestral curse which flares up yearly along with the changing of the leaves to a sickly hue of yellow.

If you’re from elsewhere in the world, your experience of Halloween in recent years may have resembled that of a witness to a strange and outlandish invasion, whose garish black and orange plastic paraphernalia have spread like a stain, consuming more and more aisles in your local shop year after year.

Or perhaps you live in one of the lands whose ancient customs were stitched together to give life to the monster that we call Halloween today.

The origins and gradual creeping spread of this holiday, and of its strange customs, are a matter of interest for folklorists and anthropologists. For the linguist, however, what captivates our attention more than anything is the name itself, whether you spell it with or without an apostrophe:

Halloween.

You may have heard that this name is some sort of unholy shortening of the phrase All Hallows’ Eve, which is true, but it only pushes the mystery back further into the mists of the linguistic past, tempting us to ask the question: what, pray tell, is a Hallow?


You’re reading The Dead Language Society. I’m Colin Gorrie, linguist, ancient language teacher, and your guide through the history of the English language and its relatives.

Subscribe for a free issue every Wednesday, or upgrade to support my mission of bringing historical linguistics out of the ivory tower and receive two extra Saturday deep-dives per month.

If you upgrade, you’ll be able to join our ongoing Beowulf Book Club. You can also catch up by watching our discussion of the first 1962 lines (part 1, part 2, part 3, part 4) right away.



The undeathly hallows

The Cemetery, Caspar David Friedrich (1774–1840)

Fear not: a hallow is no cause for alarm! Quite the opposite, actually. A hallow is nothing but a saint.

In Old English, the word was hālga, literally ‘holy one.’ It’s a definite form of the adjective hālig, which gives us Modern English holy.

But it’s hard to recognize the connection, because the two words holy and hallow sound so different in Modern English. This difference is all because of the little vowel -i-, which was there in the middle of hāliġ, but which dropped out in the form hālga. In Old English, hālga is a definite form, used in phrases meaning ‘the holy (whatever)’ rather than ‘a holy (whatever).’

The loss of that i had ripple effects, in the first place because the Old English g-sound sounded different in different contexts: at the end of a word after an i, it sounded like a y. This is notated by kind editors of Old English texts by putting a little dot above the g, like so: hāliġ.

But in a word like hālga, where the g comes between a consonant like l and a vowel like a, the g made a gh-sound, similar to the sound made at the end of the exclamation ugh. Here’s what the two words sounded like in Old English:



The y-sound of the suffix -iġ eventually dropped out, leaving just the vowel i, which we ended up spelling -y, giving us the modern form holy.

But in hālga, things took a different course. The gh-sound made by the g in hālga stuck around long enough to be spelled with a special letter in the later Middle Ages, the yogh.

So much for the difference at the end of the words holy and hallow. But what about the difference in the vowel in holy and hallow?

Both words had the same vowel in Old English, a long ā vowel. Of the two Modern English words, holy actually has the vowel we’d expect it to have. The Great Vowel Shift turned the long ā vowel of Old and Middle English into an o vowel in Modern English, just like stān became stone and hām became home.

It’s actually the word hallow that has the unexpected vowel a (as in hat). This is, in fact, exactly the vowel we would expect to come from an Old English short — rather than long — a, as happened in words like Old English mann ‘person,’ which became Modern English man.

The reason the vowel in hālga was eventually shortened is wrapped up in a more general phenomenon in the history of English, where long vowels were shortened before certain clusters of two consonants: l + g was one of them.

And there were many others, which is why we have differences in the vowel sounds of certain related words, such as five (from a long ī) alongside fifth (from a short i); or heal (from a long vowel spelled ǣ) alongside health (from its short counterpart, æ).

The word hallow meaning ‘saint’ fell out of use after 1500, and was preserved only in the phrase All Hallows’, referring to the festival celebrated by the Christian Church on November 1, also known as All Saints Day.

This is also where the verb hallow ‘to make holy’ comes from, heard in phrases like hallowed ground or hallowed be thy name, from the Lord’s Prayer.

And, because I know some of you have been waiting for this: the word hallow was also used in the plural, as hallows, to refer to the relics of saints or to the shrines where those relics are held. This is the meaning that J. K. Rowling made use of in the title of the final Harry Potter book, Harry Potter and the Deathly Hallows — incidentally, a title composed entirely of elements found in Old English!1


Pease porridge in the pot nine days old: now that’s a horror story

Evening in the Limburg Kempen, Joseph Théodore Coosemans (1828–1904)

The -een component in Halloween comes from the word even, which sounds like a short form of evening. In fact, evening is a long form of even: in Old English, the word for ‘evening’ was ǣfen.

If you’re wondering where eve (as in All Hallows’ Eve or Christmas Eve) comes from, it’s part of a general loss of the -en ending which occurred to several nouns in Middle English: in some cases this created doublets, that is, pairs of words which exist with and without the -en ending: eve and even(ing), morn and morrow, maiden and maid. In some nouns, we only retain the -en-less form: for example, the word game descends from gamen in Old English.

This probably happened because -en was a formerly common plural ending (which we still see in oxen), and the words whose singular forms ended in -en may have started to sound a bit plural, so the -en got chopped off.

This same thing happened later with the words pease (as in pease porridge hot) and cherise, both of which got borrowed into English from French with plural-sounding singular words (ending in -se), so the -se got chopped off to create pea and cherry. This phenomenon is called morphological reanalysis.

The word morn was originally morgen, where the g in between the r and the -en ending made that same gh-sound we saw earlier in hālga. But, over time, morgen shortened to be pronounced as a single syllable, and this gh-sound disappeared in the process, as it was strangled by the two consonants on either side of it.

But in the form where the -en dropped off, the gh-sound survived. Eventually, gh-sounds at the end of words (well, some of them at least; this is a story for another day) became an o-sound, just as they did in hallow from hālga. Other examples of this are burg becoming borough and sorg becoming sorrow.

Our modern word evening (from Old English ǣfenung) is formed from an old verb to even (Old English ǣfenian), meaning ‘to become evening.’ Evening is just a regular -ing noun formed from a verb, like fighting comes from fight. So, when you think about it, evening means ‘evening-ing.’

The name for morning, ‘becoming morn’, is also formed in this way, coming from a verb to morn, meaning ‘to become morning.’

Dawn has a similar story: it corresponds to an Old English word dagung, literally meaning ‘day-ing.’ There, the Old English gh-sound, spelled by -g-, turned into a w. But there’s some strange stuff going on in the history of dawn, because the Old English word dagung should have given us dawing (which actually did occur until the 16th century, and even later in Scots).

The existence of the n in dawn is hard to explain: it may have crept into the word from a reinterpretation of the Middle English verb dawen ‘to become day’, where the -en was the infinitive ending, corresponding to the to in Modern English to be, to do, to see, etc., and still found as the infinitive ending in Modern German.

Since dawen was such a short verb, people may have reinterpreted the ending -en as part of the verb root itself — this is basically the opposite of what happened with pease and cherise!

It’s also possible that the -n- crept in thanks to Norse influence, since formations in -ning are common in Old Norse. The modern Scandinavian languages all have forms like dagning, which correspond to Modern English dawning.


The vanishing v

Am Schlosstor, Ferdinand Knab (German, 1834 - 1902)

With the mystery of the names for the parts of the day solved (at least as much as we're likely to solve them), we turn our attention to the case of the vanishing v in -een.

What happened to the v to turn even into -een?

This deletion of v after a stressed vowel is a sound change that occurred sporadically throughout the history of English. It produced variant forms o’er and e’er for over and ever, now used mainly in poetry (for their quality of being just one syllable long — easier to fit into iambic pentameter).

Scots was particularly fond of this change, producing deil corresponding to Modern English devil and Scots gie, which corresponds to Modern English give.

But the loss of -v- occurred in some Modern English words, too. The word head descends from Old English hēafod — Old English spelled the v-sound as f between vowels. So too does lord descend from Old English hlāford and lady from Old English hlǣfdīġe. These words, too, have very interesting stories (involving bread, believe it or not), but I’ll save those for another day.

It’s all too tempting to continue, but, as you can see, explaining the etymology of the single word Halloween is like wandering into a twisted (and very likely haunted) labyrinth of interconnected rooms.

It’s easy enough to take your first step in, but with each door you open, you’re presented with three more, and before you know it, you’ve lost your way in a maze of morphological reanalysis, pursued by a pack of wild yoghs.

As you page frantically through etymological dictionaries written by forgotten linguists, in that very moment, you’re in the gravest of dangers. For you too may, in a moment of weakness, succumb to the madness and devote your life to the study of language.

Scary stuff. Happy Halloween!

1

For the record, translated literally, Harry Potter and the Deathly Hallows would be something like Hāmrīċ Pottere and þā Dēaðlīcan Hālgan. In Old English this would mean ‘Harry Potter and the Mortal Saints.’ Another note: Hāmrīċ ‘home-ruler’ is the Old English version of the Frankish name that would eventually become Henry, whose nickname form became Harry, although I don’t believe the name Hāmrīċ itself is attested in Old English.


What the Weeds Are Telling Us

In Arkansas, farmers are fighting and dying over pigweed. Are weeds just an ancient curse on humankind, or can they teach us something?


Space Type Generator

don't miss the "Select" menu at the top for many more options