Note: this post is part of #100DaysToOffload, a challenge to publish 100 posts in 365 days. These posts are generally shorter and less polished than our normal posts; expect typos and unfiltered thoughts! View more posts in this series.
Programmers mostly know about regex and dislike it. Normal people are mostly blissfully ignorant of its existence.
To me, regex is like a superpower that’s available to everyone, but one most people don’t bother to develop, because you have to eat a lot of spinach first.
What is regex (short for “regular expressions”)? It’s a language for describing patterns in text. If a human can describe a pattern (e.g., the phone book has numbers made of digits with separators), then regex can usually describe that pattern too!
You don’t need to be a programmer to use regex because it’s built into most text editors. Even LibreOffice and Word!
Here’s a gif that shows the power of regex: a simple example that finds all telephone numbers and changes their format using search and replace in LibreOffice Writer.
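For the terminally curious, here is roughly what such a search-and-replace looks like in code. A minimal Python sketch (the sample text and the US-style number format are my own assumptions, not taken from the gif):

```python
import re

text = "Call 555-123-4567 or 555.987.6543 for details."

# Three digits, a separator, three digits, a separator, four digits.
# The parentheses capture each digit group so we can rewrite the format.
pattern = re.compile(r"(\d{3})[-.\s](\d{3})[-.\s](\d{4})")

# Rewrite every match as (555) 123-4567, using backreferences \1, \2, \3.
result = pattern.sub(r"(\1) \2-\3", text)
print(result)  # Call (555) 123-4567 or (555) 987-6543 for details.
```

The same pattern works in LibreOffice’s Find & Replace with “Regular expressions” enabled, with `$1` instead of `\1` in the replacement field.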
Regex’s power is even greater when combined with other tools (like multiple selections).
Regex is horrible because the syntax is hard to learn. But once you get it, the syntax is easy. Then regex is horrible again because the syntax isn’t the same across different regex “engines”.
There’s POSIX regex with basic (BRE) and extended (ERE) versions, Perl/PCRE regex, and Vim regex (which I still haven’t figured out) with its “very magic” (\v) and “nomagic” (\M) modes, and probably countless more. (I haven’t found any tool that converts between regex variants. This seems silly.) Then there’s the fact that the regex engine used by various hip software like Helix and ripgrep is a Rust crate that doesn’t support regex features that I consider table stakes: lookahead and lookbehind.
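If those terms are unfamiliar: lookahead and lookbehind are zero-width assertions that make a match depend on surrounding text without consuming it. A minimal Python sketch (the sample string is invented; Python’s `re` supports both, though its lookbehind must be fixed-width):

```python
import re

text = "Total: $42 USD, tax: 5 USD, tip: $7"

# (?<=\$)  lookbehind: a "$" must appear before the digits (not part of the match)
# (?= USD) lookahead:  " USD" must appear after the digits (also not consumed)
amounts = re.findall(r"(?<=\$)\d+(?= USD)", text)
print(amounts)  # ['42'] - only the amount with both "$" before and " USD" after
```

A regex engine without these assertions forces you to match and re-emit the surrounding text yourself, which is exactly the chore they exist to avoid.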
But if regex is worth using despite all that, it must be good! And it is.
Here are my suggestions.
If you’re a programmer, write a tool that can translate between regex flavors.
If you’re a programmer who isn’t super familiar with regex, regexr.com is the best tool I’ve found for testing and debugging regex.
If you’re a normal person, the next time you find yourself looking for a pattern – e.g., you know some digits of a phone number, or want to find all JPEGs without a date in the filename – that you can describe with words, or performing the same operation a bunch of times in a text file, ask a nearby programmer to help you write your first regex. Or email me.
I sometimes resort to a rather common measuring technique that is
neither fast, nor accurate, nor recommended by any standards body
and yet it hasn't failed me whenever I have had to use it. I will
describe it here, though calling it a technique might be overselling
it. Please do not use it for installing kitchen cabinets or
anything that will stare back at you every day for the next ten
years. It involves one tool: a sheet of A4 paper.
Like most sensible people with a reasonable sense of priorities, I
do not carry a ruler with me wherever I go. Nevertheless, I often
find myself needing to measure something at short notice, usually in
situations where a certain amount of inaccuracy is entirely
forgivable. When I cannot easily fetch a ruler, I end up doing what
many people do and reach for the next best thing, which for me is a
sheet of A4 paper, available in abundant supply where I live.
From photocopying night-sky charts to serving as a scratch pad for
working through mathematical proofs, A4 paper has been a trusted
companion since my childhood days. I use it often. If I am
carrying a bag, there is almost always some A4 paper inside: perhaps
a printed research paper or a mathematical problem I have worked on
recently and need to chew on a bit more during my next train ride.
Dimensions
The dimensions of A4 paper are the solution to a simple, elegant
problem. Imagine designing a sheet of paper such that, when you cut
it in half parallel to its shorter side, both halves have exactly
the same aspect ratio as the original. In other words, if the
shorter side has length \( x \) and the longer side has length \( y, \) then
\[
\frac{y}{x} = \frac{x}{y / 2}
\]
which gives us
\[
\frac{y}{x} = \sqrt{2}.
\]
Test it out. Suppose we have \( y/x = \sqrt{2}. \) We cut the
paper in half parallel to the shorter side to get two halves, each
with shorter side \( x' = y / 2 = x \sqrt{2} / 2 = x / \sqrt{2} \)
and longer side \( y' = x. \) Then indeed
\[
\frac{y'}{x'}
= \frac{x}{x / \sqrt{2}}
= \sqrt{2}.
\]
In fact, we can keep cutting the halves like this and we'll keep
getting even smaller sheets with the aspect ratio \( \sqrt{2} \)
intact. To summarise, when a sheet of paper has the aspect ratio \(
\sqrt{2}, \) bisecting it parallel to the shorter side leaves us
with two halves that preserve the aspect ratio. A4 paper has this
property.
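The halving argument can also be checked mechanically. A small Python sketch of my own, just numerically confirming that the ratio survives repeated cuts:

```python
import math

# Start with a sheet of aspect ratio sqrt(2) (A0, in metres) and keep
# bisecting parallel to the shorter side; the ratio never changes.
x, y = 2 ** -0.25, 2 ** 0.25      # shorter side, longer side
for _ in range(4):                # A0 -> A1 -> A2 -> A3 -> A4
    x, y = y / 2, x               # new shorter side is y/2, new longer side is x
    assert abs(y / x - math.sqrt(2)) < 1e-12
print(f"after four cuts: {x:.4f} m x {y:.4f} m")  # after four cuts: 0.2102 m x 0.2973 m
```

Four cuts in, we land on the A4 dimensions, which is a pleasant preview of what follows.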
But what are the exact dimensions of A4 and why is it called A4?
What does 4 mean here? Like most good answers, this one too begins
by considering the numbers \( 0 \) and \( 1. \) Let me elaborate.
Let us say we want to make a sheet of paper that is \( 1 \,
\mathrm{m}^2 \) in area and has the aspect-ratio-preserving property
that we just discussed. What should its dimensions be? We want
\[
xy = 1 \, \mathrm{m}^2
\]
subject to the condition
\[
\frac{y}{x} = \sqrt{2}.
\]
Solving these two equations gives us
\[
x^2 = \frac{1}{\sqrt{2}} \, \mathrm{m}^2
\]
from which we obtain
\[
x = \frac{1}{\sqrt[4]{2}} \, \mathrm{m}, \quad
y = \sqrt[4]{2} \, \mathrm{m}.
\]
Up to three decimal places, this amounts to
\[
x = 0.841 \, \mathrm{m}, \quad
y = 1.189 \, \mathrm{m}.
\]
These are the dimensions of A0 paper. They are precisely the
dimensions specified by the ISO standard for it. It is rather large
for scribbling mathematical solutions on, unless your goal is to make a
spectacle of yourself and cause your friends and family to reassess
your sanity. So we need something smaller that allows us to work in
peace, without inviting commentary or concerns from passersby. We
take the A0 paper of size
\[
84.1 \, \mathrm{cm} \times 118.9 \, \mathrm{cm}
\]
and bisect it to get A1 paper of size
\[
59.4 \, \mathrm{cm} \times 84.1 \, \mathrm{cm}.
\]
Then we bisect it again to get A2 paper with dimensions
\[
42.0 \, \mathrm{cm} \times 59.4 \, \mathrm{cm}.
\]
And once again to get A3 paper with dimensions
\[
29.7 \, \mathrm{cm} \times 42.0 \, \mathrm{cm}.
\]
And then once again to get A4 paper with dimensions
\[
21.0 \, \mathrm{cm} \times 29.7 \, \mathrm{cm}.
\]
There we have it. The dimensions of A4 paper. These numbers are
etched in my memory like the multiplication table of \( 1. \) We
can keep going further to get A5, A6, etc. We could, in theory, go
all the way up to A\( \infty. \) Hold on, I think I hear someone
heckle. What's that? Oh, we can't go all the way to A\( \infty? \)
Something about atoms, was it? Hmm. Security! Where's security?
Ah yes, thank you, sir. Please show this gentleman out, would you?
Sorry for the interruption, ladies and gentlemen. Phew! That
fellow! Atoms? Honestly. We, the mathematically inclined, are not
particularly concerned with such trivial limitations. We drink our
tea from doughnuts. We are not going to let the size of atoms
dictate matters, now are we?
So I was saying that we can bisect our paper like this and go all
the way to A\( \infty. \) That reminds me. Last night I was at a
bar in Hoxton and I saw an infinite number of mathematicians walk
in. The first one asked, "Sorry to bother you, but would it be
possible to have a sheet of A0 paper? I just need something to
scribble a few equations on." The second one asked, "If you happen
to have one spare, could I please have an A1 sheet?" The third one
said, "An A2 would be perfectly fine for me, thank you." Before the
fourth one could ask, the bartender disappeared into the back for a
moment and emerged with two sheets of A0 paper and said, "Right.
That should do it. Do know your limits and split these between
yourselves."
In general, a sheet of A\( n \) paper has the dimensions
\[
2^{-(2n + 1)/4} \, \mathrm{m} \times
2^{-(2n - 1)/4} \, \mathrm{m}.
\]
If we plug in \( n = 4, \) we indeed get the dimensions of A4 paper:
\[
0.210 \, \mathrm{m} \times 0.297 \, \mathrm{m}.
\]
Measuring Stuff
Let us now return to the business of measuring things. As I
mentioned earlier, the dimensions of A4 are lodged firmly in my
memory. Getting hold of a sheet of A4 paper is rarely a challenge
where I live. I have accumulated a number of A4 paper stories over
the years. Let me share a recent one. I was hanging out with a few
folks of the nerd variety one afternoon when the conversation
drifted, as it sometimes does, to a nearby computer monitor that
happened to be turned off. At some point, someone confidently
declared that the screen in front of us was 27 inches. That sounded
plausible but we wanted to confirm it. So I reached for my trusted
measuring instrument: an A4 sheet of paper. What followed was
neither fast, nor especially precise, but it was more than adequate
for settling the matter at hand.
I lined up the longer edge of the A4 sheet with the width of the
monitor. One length. Then I repositioned it and measured a second
length. The screen was still sticking out slightly at the end. By
eye, drawing on an entirely unjustified confidence built from years
of measuring things that never needed measuring, I estimated the
remaining bit at about \( 1 \, \mathrm{cm}. \) That gives us a
width of
\[
29.7 \, \mathrm{cm} +
29.7 \, \mathrm{cm} +
1.0 \, \mathrm{cm}
=
60.4 \, \mathrm{cm}.
\]
Let us round that down to \( 60 \, \mathrm{cm}. \) For the height,
I switched to the shorter edge. One full \( 21 \, \mathrm{cm} \)
fit easily. For the remainder, I folded the paper parallel to the
shorter side, producing an A5-sized rectangle with dimensions \(
14.8 \, \mathrm{cm} \times 21.0 \, \mathrm{cm}. \) Using the \(
14.8 \, \mathrm{cm} \) edge, I discovered that it overshot the top
of the screen slightly. Again, by eye, I estimated the excess at
around \( 2 \, \mathrm{cm}. \) That gives us
\[
21.0 \, \mathrm{cm} +
14.8 \, \mathrm{cm}
-2.0 \, \mathrm{cm}
=
33.8 \, \mathrm{cm}.
\]
Let us round this up to \( 34 \, \mathrm{cm}. \) The ratio \( 60 /
34 \approx 1.76 \) is quite close to \( 16/9, \) a popular aspect
ratio of modern displays. At this point the measurements were
looking good. So far, the paper had not embarrassed itself.
Invoking the wisdom of the Pythagoreans, we can now estimate the
diagonal as
\[
\sqrt{(60 \, \mathrm{cm})^2 + (34 \, \mathrm{cm})^2}
\approx 69.0 \,\mathrm{cm}.
\]
Finally, there is the small matter of units. One inch is \( 2.54 \,
\mathrm{cm}, \) another figure that has embedded itself in my head.
Dividing \( 69.0 \) by \( 2.54 \) gives us roughly \( 27.2 \,
\mathrm{in}. \) So yes. It was indeed a \( 27 \)-inch display. My
elaborate exercise in showing off my A4 paper skills was now
complete. Nobody said anything. A few people looked away in
silence. I assumed they were reflecting. I am sure they were
impressed deep down. Or perhaps... no, no. They were definitely
impressed. I am sure.
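For the record, the whole back-of-the-paper calculation fits in a few lines. A Python restatement (the 1 cm and 2 cm terms are my eyeball estimates from the story, nothing more):

```python
import math

width_cm = 29.7 + 29.7 + 1.0    # two A4 long edges plus ~1 cm judged by eye
height_cm = 21.0 + 14.8 - 2.0   # A4 short edge plus a folded A5 edge, minus ~2 cm overshoot

w, h = round(width_cm), round(height_cm)   # 60 and 34, the friendly numbers
diagonal_in = math.hypot(w, h) / 2.54      # Pythagoras, then centimetres to inches
print(f"{w} cm x {h} cm, diagonal ~ {diagonal_in:.1f} in")  # 60 cm x 34 cm, diagonal ~ 27.2 in
```

Not bad for an instrument that also takes notes.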
Hold on. I think I hear another heckle. What is that? There are
mobile phone apps that can measure things now? Really? Right.
Security. Where's security?
My first office job was an internship at a law firm in Washington, DC. I was twenty years old and a college student, which meant that I was quite useless. I found out that it was one kind of torture to do pointless work for two or three hours a day—usually, producing research memos that no one read—and then another kind of torture to figure out how to do nothing until it was acceptable to leave the office at 5 p.m. I spent a lot of time texting two friends from high school, who were also newly stuck in office jobs of their own. I perfected my technique for napping while sitting upright.
My second office job was an internship at a management consulting firm in Manhattan when I was twenty-one. At work, I was either gossiping with my fellow interns, trying to figure out how to optimize my per diem for lunch, or messaging different people on dating apps while waiting for one particular person to text me back.
But my lackadaisical workdays as a management consultant weren’t entirely my fault. The office was full of distractions, and I found it difficult to focus. People were frequently pulling me into unnecessary meetings or taking calls around me. Also, I function best when I have a snack every two hours or so. At the office, I was too self-conscious to eat, so I spent hours trying to distract myself from my hunger instead of working. I inevitably did most of my “work”—making PowerPoints and fiddling with spreadsheets—in the evenings and over the weekends. After that summer, I absconded to graduate school and vowed to avoid any job that would require me to be in an office from nine to five for as long as possible.
During the pandemic, the glass high rises that struck terror into my young, impressionable heart stood empty, and for a while, people wondered whether offices were relics of the past. But over the past two years, companies have begun to call employees back into the office. Ontario public servants are expected to return to office full-time this month. Major banks, including the Royal Bank of Canada, Scotiabank, TD, and the Bank of Montreal, have asked employees to come in four days a week. These announcements followed on the tail of controversial RTO mandates at major companies in the United States, including Amazon, AT&T, and Goldman Sachs.
Unsurprisingly, employees are almost universally against RTO mandates. One 2024 study from the University of Pittsburgh found that 99 percent of companies that implemented them saw a drop in employee satisfaction. Part of the problem is that people are back to the commutes they avoided during the pandemic. In some cases, these commutes are longer than they used to be. As housing costs increased over the past few years, many people moved away from cities with the expectation that they could continue to work remotely.
Countless reports have also documented how RTO rules negatively impact women in particular. In places where day care is either unaffordable or unavailable, women typically shoulder the consequences. Many mothers choose lower-paying jobs that allow them to work from home so they can juggle child care at the same time. All this has likely contributed to another depressing fact: over the past two years, the gender pay gap has widened for the first time since the 1960s.
These mandates don’t really make sense for the employers either. Many companies are not equipped to handle the volume of people back in the office. The Globe and Mail recently reported that those who come in often struggle to find desks. The ones they do snag sometimes lack the essentials: a monitor, mouse, or keyboard.
Bruce Daisley, who writes the popular Substack Make Work Better, explained that this lack of space is especially inconvenient because the amount of time people spend in meetings has increased over the past decade as workers, perpetually plagued by precarity, strive to get more face time with superiors. As a result, even when people return to the office, they are not necessarily bonding with colleagues in-person. Rather, they are on back-to-back virtual calls, irritating other people nearby who are also on back-to-back calls. Daisley revealed that, at one organization he worked with, people were doing calls while sitting on the floor.
Companies in Canada urging their employees to return to the office also have to contend with a further problem: limited commercial space in Canadian cities. Downtown Toronto, for example, is running out of vacancy in Class A properties—newer buildings with spiffy amenities close to public transit hubs—and as supply goes down, prices go up. Employers could save massively on rent by allowing at least some of their employees to work remotely.
Why, then, are employers rounding up their workers so insistently, with both stick and carrot? (There are the mandates, of course, and then there are the flashy constructions. Jamie Dimon, the chief executive officer of JPMorgan Chase & Co., just cut the ribbon on an extravagant skyscraper in Manhattan. It includes a luxury gym, meditation rooms, and indoor spin studios. Allegedly, the architect consulted wellness guru Deepak Chopra.) Management typically cites productivity as a key reason for bringing workers back into the office. But several studies have shown that hybrid work does not impact productivity. On the contrary, it improves job satisfaction and reduces quit rates.
It may be that the problem is precisely that people are too satisfied with their jobs. Some members of the C-suite have admitted that they implemented RTO mandates to encourage people to quit. RTO mandates offer a way for companies to reduce their staff size without having to pay severance—a tantalizing possibility for employers embattled in the Sisyphean quest to maximize shareholder value.
But the price of playing this mind game with employees is not negligible. For one, management can’t control who will quit, so it’s a rather risky way to reduce the size of a company. You could lose the guy who never does anything, but you could also lose your star player.
The other reason that employers often cite for bringing employees back in-person is “company culture.” But Daisley told me that bosses are “not necessarily being honest about what work was and what we want to go back to.” He recalled that, back in 2019, one of the most common complaints among employers was that workers were sitting around the office with their headphones on. Of course, the headphones that the C-suite were grumbling about from their corner offices were necessary if a worker had any desire to get work done while people around them took calls, crunched chips, and clacked on keyboards. Prior to COVID-19, office space leased per worker had been declining steadily since the 1990s, and employees were increasingly piled on top of each other. If good fences make good neighbours, then no fences presumably make very bad neighbours. All this to say, the “company culture” for which employers are so nostalgic has not existed for a few decades.
I suspect the real motivation behind RTO mandates has nothing to do with productivity or company culture and everything to do with control. That is what the modern office was designed for, after all.
Herman Miller is an American brand that has become synonymous with mid-century modern chic (think wood and simple designs; the Eames Lounge Chair; the type of furniture sold at Design Within Reach that is definitely beyond reach for most of us). But the company made its fortune back in the 1960s, when it first went public with the Action Office, essentially the precursor to the modern cubicle.
The Action Office was developed by an inventor called Robert Propst at a curious juncture in American history. Two things were happening simultaneously: blue-collar manufacturing jobs were disappearing, and a robust countercultural movement was on the rise. The “action” in the “Action Office” refers to the mobility of its design. Instead of the default bullpen—an open-plan office crammed with desks separated by small dividers or none at all—the Action Office was a work system with three moveable walls and vertical storage. It encouraged workers to stand up, move around, and customize their workspaces. You can see it as an invitation for employees to work more flexibly—or you could see it as a way to domesticate the countercultural energy of the 1960s using modernist design. In other words, the Action Office allowed workers to stand up, move around, and customize their workspaces, all for the cause of working more productively.
The office has seen several iterations since Propst’s Action Office. The tech industry re-popularized open-plan offices in the late 1990s and soon introduced amenities, like breakfast bars and luxury gyms. It’s all fun and games—free granola and fancy squat racks—until you realize how dystopian it is to normalize going to the office before breakfast and staying there until after dinner.
I went to graduate school to study religion, a field of research that turned out to be surprisingly useful for diagnosing my allergy to the office. Scholars of religion know that the sleek lines characteristic of Herman Miller designs, like many aspects of our modern world, have religious roots. Herman Miller was founded in Zeeland, Michigan, which was home to a wave of Dutch Reformed immigrants in the nineteenth century. The Dutch Reformed tradition is Neo-Calvinist—a form of Protestantism consistent with ascetic practices that sent what sociologist Max Weber called the “Protestant work ethic” into overdrive. Herman Miller adapted the Dutch Reformed emphasis on plain living into the clean-cut designs that came to dominate modern office aesthetics.
“The history of the cubicle—like the histories of the prison and the asylum offered by Michel Foucault—began as a Christian proposition for bodily asceticism and became a contribution to the carceral network,” the scholar Kathryn Lofton has observed in an essay on the religious history of the cubicle. “The most alternative office spaces still emphasize your ability to be seen by your team, and used by your team, as an instrument of their mission. The walls may be lower, the colors may be brighter, and the chairs ever more ergonomic, but the territory is still organized for your physical submission.”
I got on a call with Lofton, who clarified that the goal of her essay was not to prove that the cubicle is Protestant, or that office workers are secretly Protestant even though they may identify as Buddhist, Muslim, or “spiritual but not religious.” Rather, she hoped to invite readers to think critically about the visible and invisible structures that organize our lives.
Ultimately, the fact that offices—in whatever shape or form—are not necessarily conducive to productivity is beside the point. The office is, first and foremost, a space geared toward organizing people to become a certain type of subject: a cog in the capitalist machine. By physically containing your body and putting you in proximity to other people who are also “at work,” the office contains you psychologically. You have to work—or at least pretend to work—at the office because you are constantly under surveillance and cut off from the rest of your life. By the time you get home, odds are you are so tired you can barely do more than turn on the television or scroll social media for an hour or two before going to bed, just so you can engage in the same ritual again the next day, and the day after that, ad infinitum.
The fact that psychological control has become more important to employers than productivity is one of the many paradoxes wrought by capitalism. Employers want to control their employees not only physically but also psychologically because they feel like they have bought their employees’ time.
The notion of buying someone’s time is an ideological invention that traces back to the dawn of the industrial period. The anthropologist David Graeber has observed that, in premodern times, you could buy a pot from a potter, or you could buy the potter and enslave him, but you cannot buy the potter’s time. Such a form of temporary enslavement “was considered the most degrading thing that could possibly befall a human being.” But after the invention of standardized time in the nineteenth century, time became yet another commodity that could be bought or sold. And if an employer bought your time, you bet he wants to get the most out of every cent he paid, regardless of whether it would improve the output.
Of course, the discipline instilled by the office didn’t disappear when white-collar employees began working remotely during the pandemic. People simply turned the panopticon on themselves. After decades in the office, many people internalized its disciplinary function. A study that tracked more than 60,000 Microsoft employees in the early months of the pandemic found that, when the company shifted to remote work, employees logged 10 percent more weekly hours.
This was in part due to the fact that people no longer had to commute. A study that analyzed data spanning twenty-seven countries found that remote workers saved seventy-two minutes commuting every day. As a result, the average employee worked thirty minutes more each day, which added up to more than two hours a week. It was also possible that people logged more hours precisely because no one was watching them. Alone at home, they felt the need to go out of their way to prove that they were hard-working and should not be fired.
But eventually, people realized how lovely it was to not be in the office. Employees are quietly bucking RTO mandates when they can get away with it. And when they can’t, they are circulating petitions and challenging their supervisors. Notably, Dimon unveiled the new JPMorgan headquarters in Manhattan after the circulation of an anti-RTO petition signed by 2,000 JPMorgan employees earlier this year.
One of my high school friends, whom I texted constantly during that miserable summer at the law firm, eventually got a real job in tech. With a hybrid schedule, she perfected a practice of working at most five hours a day. She spends the rest of the time walking her dog, going to the gym, and cooking. She goes for runs in the middle of the day when she feels like it. She also started knitting and has since made baskets of beanies and sweaters in complex designs for herself and her loved ones.
Employers are so terrified at the prospect of their employees not working or thinking about work that they would risk cutting into their profit margins. Perhaps they are right to be afraid. If people weren’t locked up in offices for eight to ten hours a day, they might have time to take care of themselves. They might have time to reflect on whether their jobs actually bring them happiness or contribute meaningfully to the world. They might have time to discover other ways of experiencing pleasure beyond the fleeting dopamine hits occasioned by retail therapy. Instead of buying things to fill the voids in their lives, they might make art, they might experiment sexually, they might organize a protest, they might read a book, or they might spend time caramelizing onions for a leisurely dinner with their friends—and God, what a frightful world that would be.
Here’s the most replicated finding to come out of my area of psychology in the past decade: most people believe they suffer from a chronic case of awkwardness.
Study after study finds that people expect their conversations to go poorly, when in fact those conversations usually go pretty well. People assign themselves the majority of the blame for any awkward silences that arise, and they believe that they like other people more than other people like them in return. I’ve replicated this effect myself: I once ran a study where participants talked in groups of three, and then each reported how much they liked the other two and guessed how much the other two liked them. Those participants believed, on average, that they were the least liked person in the trio.
In another study, participants were asked to rate their skills on 20 everyday activities, and they scored themselves as better than average on 19 of them. When it came to cooking, cleaning, shopping, eating, sleeping, reading, etc., participants were like, “Yeah, that’s kinda my thing.” The one exception? “Initiating and sustaining rewarding conversation at a cocktail party, dinner party, or similar social event”.
I find all this heartbreaking, because studies consistently show that the thing that makes humans the happiest is positive relationships with other humans. Awkwardness puts a persistent bit of distance between us and the good life, like being celiac in a world where every dish has a dash of gluten in it.
Even worse, nobody seems to have any solutions, nor any plans for inventing them. If you want to lose weight, buy a house, or take a trip to Tahiti, entire industries are waiting to serve you. If you have diagnosable social anxiety, your insurance might pay for you to take an antidepressant and talk to a therapist. But if you simply want to gain a bit of social grace, you’re pretty much outta luck. It’s as if we all think awkwardness is a kind of moral failing, a choice, or a congenital affliction that suggests you were naughty in a past life—at any rate, unworthy of treatment and undeserving of assistance.
We can do better. And we can start by realizing that, even though we use one word to describe it, awkwardness is not one thing. It’s got layers, like a big, ungainly onion. Three layers, to be exact. So to shrink the onion, you have to peel it from the skin to the pith, because removing each layer requires its own technique.
Before we make our initial incision, I should mention that I’m not the kind of psychologist who treats people. I’m the kind of psychologist who asks people stupid questions and then makes sweeping generalizations about them. You should take everything I say with a heaping teaspoon of salt, which will also come in handy after we’ve sliced the onion and it’s time to sauté it. That disclaimer disclaimed, let’s begin on the outside and work our way in, starting with—
1. THE OUTSIDE LAYER: SOCIAL CLUMSINESS
The outermost layer of the awkward onion is the most noticeable one: awkward people do the wrong thing at the wrong time. You try to make people laugh; you make them cringe instead. You try to compliment them; you creep them out. You open up; you scare them off. Let’s call this social clumsiness.
Being socially clumsy is like being in a role-playing game where your charisma stat is chronically too low and you can’t access the correct dialogue options. And if you understand that reference, I understand why you’re reading this post.
Here’s the bad news: I don’t think there’s a cure for clumsiness. Every human trait is normally distributed, so it’s inevitable that some chunk of humanity is going to have a hard time reading emotional cues and predicting the social outcomes of their actions. I’ve seen high-functioning, socially ham-handed people try to memorize interpersonal rules the same way chess grandmasters memorize openings, but it always comes off stilted and strange. You’ll be like, “Hey, how you doing” and they’re like “ROOK TO E4, KNIGHT TO C11, QUEEN TO G6” and you’re like “uhhh cool man me too”.
Here’s the good news, though: even if you can’t cure social clumsiness, there is a way to manage its symptoms. To show you how, let me tell you a story of a stupid thing I did, and what I should have done instead.
Once, in high school, I was in my bedroom when I saw a girl in my class drive up to the intersection outside my house. It was dark outside and I had the light on, and so when she looked up, she caught me in the mortifying act of, I guess, existing inside my home? This felt excruciatingly embarrassing, for some reason, and so I immediately dropped to the floor, as if I was in a platoon of GIs and someone had just shouted “SNIPER!” But breaking line of sight doesn’t cause someone to unsee you, and so from this girl’s point of view, she had just locked eyes with some dude from school through a window and his response had been to duck and cover. She told her friends about this, and they all made fun of me ruthlessly.
I learned an important lesson that day: when it comes to being awkward, the coverup is always worse than the crime. If you just did something embarrassing mere moments ago, it’s unlikely that you have suddenly become socially omnipotent and that all of your subsequent moves are guaranteed to be prudent and effective. It’s more likely that you’re panicking, and so your next action is going to be even stupider than your last.
And that, I think, is the key to mitigating your social clumsiness: give up on the coverups. When you miss a cue or make a faux pas, you just have to own it. Apologize if necessary, make amends, explain yourself, but do not attempt to undo your blunder with another round of blundering. If you knock over a stack of porcelain plates, don’t try to quickly sweep up the shards before anyone notices; you will merely knock over a shelf of water pitchers.
This turns out to be a surprisingly high-status move, because when you readily admit your mistakes, you imply that you don’t expect to be seriously harmed by them, and this makes you seem intimidating and cool. You know how when a toddler topples over, they’ll immediately look at you to gauge how upset they should be? Adults do that too. Whenever someone does something unexpected, we check their reaction—if they look embarrassed, then whatever they did must be embarrassing. When that person panics, they look like a putz. When they shrug and go, “Classic me!”, they come off as a lovable doof, or even, somehow, a chill, confident person.
In fact, the most successful socially clumsy people I know can manage their mistakes before they even happen. They simply own up to their difficulties and ask people to meet them halfway, saying things like:
Thanks for inviting me over to your house. It’s hard for me to tell when people want to stop hanging out with me, so please just tell me when you’d like me to leave. I won’t be mad. If it’s weird to you, I’m sorry about that. I promise it’s not weird to me.
It takes me a while to trust people who attempt this kind of social maneuver—they can’t be serious, can they? But once I’m convinced they’re earnest, knowing someone’s social deficits feels no different than knowing their dietary restrictions (“Arthur can’t eat artichokes; Maya doesn’t understand sarcasm”), and we get along swimmingly. Such a person is always going to seem a bit like a Martian, but that’s fine, because they are a bit of a Martian, and there’s nothing wrong with being from outer space as long as you’re upfront about it.
2. THE MIDDLE LAYER: EXCESSIVE SELF-AWARENESS
When we describe someone else as awkward, we’re referring to the things they do. But when we describe ourselves as awkward, we’re also referring to this whole awkward world inside our heads, this constant sensation that we’re not slotted in, that we’re being weird, somehow. It’s that nagging thought of “does my sweater look bad” that blossoms into “oh god, everyone is staring at my horrible sweater” and finally arrives at “I need to throw this sweater into a dumpster immediately, preferably with me wearing it”.
This is the second layer of the awkward onion, one that we can call excessive self-awareness. Whether you’re socially clumsy or not, you can certainly worry that you are, and you can try to prevent any gaffes from happening by paying extremely close attention to yourself at all times. This strategy always backfires because it causes a syndrome that athletes call “choking” or “the yips”—that stilted, clunky movement you get when you pay too much attention to something that’s supposed to be done without thinking. As the old poem goes:
A centipede was happy – quite!
Until a toad in fun
Said, “Pray, which leg moves after which?”
This raised her doubts to such a pitch,
She fell exhausted in the ditch
Not knowing how to run.
The solution to excessive self-awareness is to turn your attention outward instead of inward. You cannot out-shout your inner critic; you have to drown it out with another voice entirely. Luckily, there are other voices around you all the time, emanating from other humans. The more you pay attention to what they’re doing and saying, the less attention you have left to lavish on yourself.
You can call this mindfulness if that makes it more useful to you, but I don’t mean it as a sort of droopy-eyed, slack-jawed, I-am-one-with-the-universe state of enlightenment. What I mean is: look around you! Human beings are the most entertaining organisms on the planet. See their strange activities and their odd proclivities, their opinions and their words and their what-have-you. This one is riding a unicycle! That one is picking their nose and hoping no one notices! You’re telling me that you’d rather think about yourself all the time?
Getting out of your own head and into someone else’s can be surprisingly rewarding for all involved. It’s hard to maintain both an internal and an external dialogue simultaneously, and so when your self-focus is going full-blast, your conversations degenerate into a series of false starts (“So...how many cousins do you have?” “Seven.” “Ah, a prime number.”) Meanwhile, the other person stays buttoned up because, well, why would you disrobe for someone who isn’t even looking? Paying attention to a human, on the other hand, is like watering a plant: it makes them bloom. People love it when you listen and respond to them, just like babies love it when they turn a crank and Elmo pops out of a box—oh! The joy of having an effect on the world!
Me when someone asks me an open-ended question about myself (source)
Of course, you might not like everyone that you attend to. When people start blooming in your presence, you’ll discover that some of them make you sneeze, and some of them smell like that kind of plant that gives off the stench of rotten eggs. But this is still progress, because in the Great Hierarchy of Subjective Experiences, annoyance is better than awkwardness—you can walk away from an annoyance, but awkwardness comes with you wherever you go.
It can be helpful to develop a distaste for your own excessive self-focus, and one way to do that is to relabel it as “narcissism”. We usually picture narcissists as people with an inflated sense of self-worth, and of course many narcissists are like that. But I contend that there is a negative form of narcissism, one where you pay yourself an extravagant amount of attention that just happens to come in the form of scorn. Ultimately, self-love and self-hate are both forms of self-obsession.
So if you find yourself fixated on your own flaws, perhaps it’s worth asking: what makes you so worthy of your own attention, even if it’s mainly disapproving? Why should you be the protagonist of every social encounter? If you’re really as bad as you say, why not stop thinking about yourself so much and give someone else a turn?
3. THE INNER LAYER: PEOPLE-PHOBIA
Social clumsiness is the thing that we fear doing, and excessive self-focus is the strategy we use to prevent that fear from becoming real, but neither of them is the fear itself, the fear of being left out, called out, ridiculed, or rejected. “Social anxiety” is already taken, so let’s refer to this center of the awkward onion as people-phobia.
People-phobia is both different from and worse than all other phobias, because the thing that scares the bajeezus out of you is also the thing you love the most. Arachnophobes don’t have to work for, ride buses full of, or go on first dates with spiders. But people-phobes must find a way to survive in a world that’s chockablock with homo sapiens, and so they yo-yo between the torment of trying to approach other people and the agony of trying to avoid them.
At the heart of people-phobia are two big truths and one big lie. The two big truths: our social connections do matter a lot, and social ruptures do cause a lot of pain. Individual humans cannot survive long on their own, and so evolution endowed us with a healthy fear of getting voted off the island. That’s why it hurts so bad to get bullied, dumped, pantsed, and demoted, even though none of those things cause actual tissue damage.1
But here’s the big lie: people-phobes implicitly believe that hurt can never be healed, so it must be avoided at all costs. This fear is misguided because the mind can, in fact, mend itself. Just like we have a physical immune system that repairs injuries to the body, we also have a psychological immune system that repairs injuries to the ego. Black eyes, stubbed toes, and twisted ankles tend to heal themselves on their own, and so do slip-ups, mishaps, and faux pas.
That means you can cure people-phobia the same way you cure any fear—by facing it, feeling it, and forgetting it. That’s the logic behind exposure and response prevention: you sit in the presence of the scary thing without deploying your usual coping mechanisms (scrolling on your phone, fleeing, etc.) and you do this until you get tired of being scared. If you’re an arachnophobe, for instance, you peer at a spider from a safe distance, you wait until your heart rate returns to normal, you take one step closer, and you repeat until you’re so close to the spider that it agrees to officiate your wedding.2
Unfortunately, people-phobia is harder to treat than arachnophobia because people, unlike spiders, cannot be placed in a terrarium and kept safely on the other side of the room. There is no zero-risk social interaction—anyone, at any time, can decide that they don’t like you. That’s why your people-phobia does not go into spontaneous remission from continued contact with humanity: if you don’t confront your fear in a way that ultimately renders it dull, you’re simply stoking the phobia rather than extinguishing it.3
Exposure only works for people-phobia, then, if you’re able to do two things: notch some pleasant interactions and reflect on them afterward. The notching might sound harder than the reflecting, but the evidence suggests it’s actually the other way around. Most people have mostly good interactions most of the time. They just don’t notice.
In any study I’ve ever read and in every study I’ve ever conducted myself, when you ask people to report on their conversation right after the fact, they go, “Oh, it was pretty good!”. In one study, I put strangers in an empty room and told them to talk about whatever they want for as long as they want, which sounds like the social equivalent of being told to go walk on hot coals or stick needles in your eyes. And yet, surprisingly, most of those participants reported having a perfectly enjoyable, not-very-awkward time. When I asked another group of participants to think back to their most recent conversations (which were overwhelmingly with friends and family, rather than strangers), I found the same pattern of results4:
Error bars = 95% confidence intervals. Big dots = means. Small, transparent dots = individual data points.
But when you ask people to predict their next conversation, they suddenly change their tune. I had another group of participants guess how this whole “meet a stranger in the lab, have an open-ended conversation” thing would go, and they were not optimistic. Participants estimated that only 50% of conversations would make it past five minutes (actually, 87% did), and that only 15% of conversations would go all the way to the time limit of 45 minutes (actually, 31% did). So when people meet someone new, they go, “that was pretty good!”, but when they imagine meeting someone new, they go, “that will be pretty bad!”
A first-line remedy for people-phobia, then, is to rub your nose in the pleasantness of your everyday interactions. If you’re afraid that your goof-ups will doom you to a lifetime of solitude and then that just...doesn’t happen, perhaps it’s worth reflecting on that fact until your expectations update to match your experiences. Do that enough, and maybe your worries will start to appear not only false, but also tedious. However, if reflecting on the contents of your conversations makes you feel like that guy in Indiana Jones who gets his face melted off when he looks directly at the Ark of the Covenant, then I’m afraid you’re going to need bigger guns than can fit into a blog post.
Obviously, I don’t think you can instantly de-awkward yourself by reading the right words in the right order. We’re trying to override automatic responses and perform laser removal on burned-in fears—this stuff takes time.
In the meantime, though, there’s something all of us can do right away: we can disarm. The greatest delusion of the awkward person is that they can never harm someone else; they can only be harmed. But every social hangup we have was hung there by someone else, probably by someone who didn’t realize they were hanging it, maybe by someone who didn’t even realize they were capable of such a thing. When Todd Posner told me in college that I have a big nose, did he realize he was giving me a lifelong complex? No, he probably went right back to thinking about his own embarrassingly girthy neck, which, combined with his penchant for wearing suits, caused people to refer to him behind his back as “Business Frog” (a fact I kept nobly to myself).
So even if you can’t rid yourself of your own awkward onion, you can at least refrain from fertilizing anyone else’s. This requires some virtuous sacrifice, because the most tempting way to cope with awkwardness is to pass it on—if you’re pointing and laughing at someone else, it’s hard for anyone to point and laugh at you. But every time you accept the opportunity to be cruel, you increase the ambient level of cruelty in the world, which makes all of us more likely to end up on the wrong end of a pointed finger.
All of that is to say: if you happen to stop at an intersection, and you look up and see someone you know just standing there inside his house, and he immediately ducks out of sight, you can think to yourself, “There are many reasonable explanations for such behavior—perhaps he just saw a dime on the floor and bent down to get it!” and you can forget about the whole ordeal and, most importantly, keep your damn eyes on the road.
Experimental History reminds you to don appropriate eyewear before looking directly at the Ark of the Covenant
Psychologists who study social exclusion love to use this horrible experimental procedure called “Cyberball”, where you play a game of virtual catch with two other participants. Everything goes normally at first, but then the other participants inexplicably start throwing the ball only to each other, excluding you entirely. (In reality, there are no other participants; this is all pre-programmed.) When you do this to someone who’s in an fMRI scanner, you can see that getting ignored in Cyberball lights up the same part of the brain that processes physical pain. But you don’t need a big magnet to find this effect: just watching the little avatars ignore you while tossing the ball back and forth between them will immediately make you feel awful.
My PhD cohort included some clinical psychologists who interned at an OCD treatment center as part of their training. Some patients there had extreme fears about wanting to harm other people—they didn’t actually want to hurt anybody, but they were afraid that they did. So part of their treatment was being given the opportunity to cause harm, and realizing that they weren’t really going to do it. At the final stage of this treatment, patients are given a knife and told to hold it at their therapist’s throat, and the therapist says, “See? Nothing bad is happening.” Apparently this procedure is super effective and no one at the clinic has ever been harmed doing it, but please do not try this at home.
You might notice that while awkwardness ratings are higher when people talk to strangers vs. loved ones, enjoyment ratings are higher too. What gives? One possibility is that people are “on” when they meet someone new, and that’s a surprisingly enjoyable state to be in. That’s consistent with this study from 2010, which found that:
Participants actually had a better time talking to a stranger than they did talking to their romantic partner.
When they were told to “try to make a good impression” while talking to their romantic partner (“Don’t role-play, or pretend you are somewhere where you are not, but simply try to put your best face forward”), they had a better time than when they were given no such instructions.
Participants failed to predict both of these effects.
Like most psychology studies published around this time, the sample sizes and effects are not huge, so I wouldn’t be surprised if you re-ran this study and found no effect. But even if people enjoyed talking to strangers as much as they enjoy talking to their boyfriends and girlfriends, that would still be pretty surprising.