
Why I stopped using AI code editors


TL;DR: I chose to make using AI a manual action, because I felt a slow loss of competence over time when I relied on it, and I recommend that everyone be cautious about making AI a key part of their workflow.

In late 2022, I used AI tools for the first time, even before the first version of ChatGPT. In 2023, I started using AI-based tools in my development workflow. Initially, I was super impressed with the capabilities of these LLMs. The fact that I could just copy and paste obscure compiler errors along with the C++ source code, and be told what was causing the error, felt like magic.

As GitHub Copilot became more and more powerful, I used it more and more, along with various other LLM integrations right in my editor. Using AI was part of my workflow.

In late 2024, I removed all LLM integrations from my code editors. I still use LLMs occasionally, and I do think AI can be used in a way that is very beneficial for many programmers. So why don’t I use AI-powered code editing tools?

Tesla FSD

From 2019 to 2021 I drove a Tesla, though I would never make the same purchase again. Not for political reasons, but simply because the cars are quite low quality, very overpriced, and hell to repair or maintain.

When I got my Tesla, I started using Full Self-Driving (FSD) anytime I could. It felt great to just put the car on FSD on the highway and zone out a bit. Switching lanes was as simple as hitting the turn signal, and the car would switch lanes. Driving, for me, was just getting to the highway, turning on FSD, telling the car to switch lanes every now and then, and listening to music or podcasts while zoning out.

If you drive a car often, you’ll know that when you’re driving on the highway, everything sort of happens automatically. Keeping your car in the lane at the right speed becomes a passive action; it does not require the type of focus that, for example, reading a book requires. It’s the type of focus that walking requires: it happens in the background of your mind.

In the period from 2019 to 2021 I exclusively drove my Tesla for longer rides. After 2021, I went back to driving regular cars, and making this switch was definitely not what I expected. Driving on the highway required my full attention for the first month or so; I had to re-learn how to keep the car in the middle of the lane without thinking about it.

Being reliant on Tesla’s FSD took away my own ability to go into autopilot.

My experience with AI code editors

Working with AI-powered code editors was somewhat similar. Initially, I felt that I completed work a lot faster when assisted by AI. The work I was doing most of the time was not super complex, and AI felt like putting my Tesla on FSD: I could just guide the machine to do my work for me.

In my free time, I started working on a side project on my personal account on my work device. On this account, I did not have access to Copilot and my other cool, fancy AI tools. This is when using AI started to feel very similar to my Tesla FSD story.

I felt less competent at quite basic software development than I had been a year or so before. All of a sudden, it became very clear to me how reliant I had become on AI tools. Anytime I defined a function, I paused in my editor, waiting for the AI tools to write the implementation for me. It took some effort to remember the syntax for writing unit tests by hand.

With my work, AI started to become less useful over time as well. Not only did it take the fun out of it for me, I also started to feel a bit insecure about making some implementation decisions myself. Outsourcing the decisions to the AI seemed a lot easier. But sometimes the AI couldn’t figure things out, even with the best prompts. It became quite clear that because I did not practice the basics often, I was less capable at the harder parts as well.

The loss of Fingerspitzengefühl

Fingerspitzengefühl [ˈfɪŋɐˌʃpɪtsənɡəˌfyːl] is a German term, literally meaning “finger tips feeling” and meaning intuitive flair or instinct, which has been adopted by the English language as a loanword. It describes a great situational awareness, and the ability to respond most appropriately and tactfully.[1]

Defining seniority is a very tough thing. In my opinion, a lot of being a “senior” is in soft skills, but when it comes to the technical hard skills, a lot comes down to Fingerspitzengefühl. The longer you work with a language, framework or codebase, the more you develop this kind of intuition about what the correct approach is. The gut feeling of “something feels off” slowly turns into a feeling of “this is what we should do”.

This developed intuition is not just on an architectural level. A big component is in the lower-level details: when to use pointers (or what type of pointers), whether to use asserts or checks, what to pick from the standard library when multiple options are available (though senior C++ programmers still can’t seem to agree on this).

This intuition is what I was slowly losing by relying on AI tools so much. And this is coming from a lead developer. When I see all the hype about vibe coding, I can’t help but think: how exactly do you expect to vibe code your way to senior? Where will you get the skills to maintain and extend the vibe-coded codebase when the AI tools are down, or have become too expensive?

Even with larger context windows, more computing power, reasoning models or agents, there will be things that AI won’t be able to do. Over time, the AI tools will be more and more powerful, sure. But when you receive a Slack message that “the website works fine, but the app is down in production; I tried it locally and there it works fine, nothing in Sentry either”, good luck getting an AI agent to fix this for you. Maybe it can, maybe it can’t. And when an AI agent can’t figure it out, will your reply be “sorry, Cursor doesn’t get it, will prompt more tomorrow”?

You can get by without these tools

Sometimes it feels like you have to use AI or be out of a job in 6 months. We’ve been hearing the “3-6 months from now” story for over two years at this point. I stopped trusting CEO promises about functionality coming “3-6 months from now” years ago. When I got my Tesla in 2019, I paid €6400 for functionality that was supposed to arrive in “3-6 months”, and it is still not present the way it was promised over 5 years ago.

Right now, it is unlikely that letting AI do your coding will work for projects larger than a university project. When working on legacy systems or larger projects in enterprises or when you need to work with and consult a lot of dependency internals (like I do with Unreal Engine), AI tools will often not be able to make things work. When you need to work with internal DSLs, tools or frameworks, good luck getting LLMs to generate useful output. For some industries, you can’t even use AI tools at all for a multitude of reasons.

For some things you really should not want to rely on AI. When implementing authentication systems like JWT[2] signing or RBAC[3], adding “and it should be secure” to the prompt won’t make it secure if the model was trained on GitHub code that had CVEs[4]. When it comes to security, you should be the person who is responsible and who fully understands the system. Critical systems should be written and reviewed by humans. If we are heading towards a situation where one AI agent writes the code, another reviews the autogenerated PR, and a third deploys it, we will soon see a huge spike in security issues.

Where I draw the line

I still use AI, sometimes. I think it can be a great tool when used wisely. I draw the line at integration: I keep AI fully separate from my code editor, and I add all of the context manually. I intentionally keep the effort required quite high, so that it disincentivizes me.

Examples where I use AI for work include “convert these Go tests in structs to tests in a map”, “convert this calculation to SIMD”, or “when the content type is application/zlib, decode the body”[5]. I have set up custom instructions to only give me the code that has changed, along with instructions for adding it. This way, I am still the one making the changes in the codebase. Just approving a Git diff is not enough; I want to add the code manually myself, because only then do I feel confident enough to sign off on it and take responsibility for it.

Another great use case for AI is learning. I often have questions that are quite uncommon, as I have a few very niche interests. It turns out that adding netcode to a custom game engine using ECS doesn’t have a lot of learning resources. What has worked for me is asking AI to explain pieces of code, like “explain this assembly code”, “explain what this shader does”, or “which books go in-depth about resolving client/server desyncs in game engines”. The AI seems to struggle with these sometimes, and I’m getting mixed results, but the results are still much better than search engines. I even used it for this article, though not for writing content, but for checking[6].

Another benefit of using AI this way is the cost. No unnecessary API calls, manually managed contexts and more control over the LLM settings. I use a desktop application with a bunch of different LLMs hooked up to it. I have used it daily for the last 3 months or so, and in total, I have consumed around $4 in credits.

I do want to add that with some things I am more strict. On my personal website, I don’t want any AI-generated content, whether that’s text or images. I personally don’t like AI-generated images or ‘art’ for various reasons, and I think AI-generated text lacks character; it feels very flat and boring. When something is created by humans, it has more value to me than when it’s created by AI.

Doing what you love

It is also worth noting that there are more things to think about than efficiency and productivity. It’s also about doing what you love. If you love coding, keep doing it yourself, even if a computer might be better at it.

In 1997, Deep Blue won the chess match against the then world chess champion Garry Kasparov[7], yet people still play chess. When it comes to programming, I’d say that I program for the same reason that people still play chess[8]. Though chess and software development are very different, with chess being much more limited in scope, I think it is good to keep in mind that sometimes we can do things just to enjoy them.

My advice to new programmers

Don’t become a forever junior who lets AI do all their work. If you want to become a programmer, learn to program yourself. Be curious, and put in the time and effort to learn how things really work, and how things work in the layer below that. It really pays off. Learning how everything works under the hood and putting that knowledge to use is amazing. Just keep learning; don’t be a prompt engineer (if you can even call that engineering). Believe me, it’s more fun to be competent[9].

Even though AI might be smarter than you, never blindly trust the AI output. Don’t build your whole workflow around it. Sometimes try to work without it for a few days. The better at programming you are, the more AI will get in your way for the more complex work.

If you are learning to code now, keep building your skills instead of letting AI do all the heavy lifting, and you’ll be capable of fixing the messes that vibe coding is creating. I don’t want to sound elitist, but if you don’t want to learn to go beyond vibe coding, maybe coding isn’t for you. Positions where all the work can be done by vibe coding are the ones that will be eliminated first when AI becomes more powerful.

And remember: if you cannot code without AI, you cannot code.

Conclusion

When you are using AI, you are sacrificing knowledge for speed. Sometimes that trade-off is worth making. But it is important to remember that even the best athletes in the world still do their basic drills, for a reason. The same applies to software development: you need to practice the basics to be able to do the advanced work. You need to keep your axe sharp.

We are still a long way out from AI taking over our jobs. A lot of companies are creating FOMO[10] as a sales tactic: to get more customers, to show traction to their investors, to get another round of funding, to generate the next model that will definitely revolutionize everything.

AI is a tool, it is not good or bad in itself, it’s what you do with it. I do think it can be a great tool, as long as you are not reliant on it for your workflow. Make sure you can still work effectively without it, make sure you don’t push code to production that you don’t fully understand and don’t think of AI as a replacement for your own thinking. Stay curious, keep learning.


[1] Source: Wikipedia

[2] JSON Web Tokens, or JWTs, are a common way to generate authentication tokens, among other uses

[3] Role-based access control (RBAC) is a mechanism to restrict system access by setting permissions and privileges

[4] Common Vulnerabilities and Exposures (CVE) is a program used to identify, define and catalog publicly disclosed cybersecurity vulnerabilities, cve.org

[5] The specific contents here don’t matter that much; they are just examples of what I use AI for

[6] The prompt I used for this article: “I want you to proofread an article I have written. I want you to give me feedback on incorrect grammar or broken sentences, using UK grammar. Do not comment on sentences that should be broken up or things that could be improved just slightly, only real errors. Do not return modified sentences, but point out where the issue is, under which paragraph, in which sentence and what the mistake is. I will make the required changes myself”. It came back with a few typos, like “form” that should be “from”, and “eb” that should be “be”.

[7] Source: IBM History

[8] Source: Tsoding on YouTube

[9] Source: DHH in an interview on YouTube

[10] Fear of missing out


The Average College Student Is Illiterate

Oxford undergraduates on a late night drinking spree, 1824. By Robert Cruikshank. (Photo by Hulton Archive.)

I’m Gen X. I was pretty young when I earned my PhD, so I’ve been a professor for a long time—over 30 years. If you’re not in academia, or it’s been a while since you were in college, you might not know this: the students are not what they used to be. The problem with even talking about this topic at all is the knee-jerk response of, “yeah, just another old man complaining about the kids today, the same way everyone has since Gilgamesh. Shake your fist at the clouds, dude.” So yes, I’m ready to hear that. Go right ahead. Because people need to know.

First, some context. I teach at a regional public university in the United States. Our students are average on just about any dimension you care to name—aspirations, intellect, socio-economic status, physical fitness. They wear hoodies and yoga pants and like Buffalo wings. They listen to Zach Bryan and Taylor Swift. That’s in no way a put-down: I firmly believe that the average citizen deserves a shot at a good education and even more importantly a shot at a good life. All I mean is that our students are representative; they’re neither the bottom of the academic barrel nor the cream off the top.

As with every college, we get a range of students, and our best philosophy majors have gone on to earn PhDs or go to law school. We’re also an NCAA Division 2 school, and I watched one of our graduates become an All-Pro lineman for the NFL. These are exceptions, and what I say here does not apply to every single student. But what I’m about to describe is the average student at Average State U.

Reading

Most of our students are functionally illiterate. This is not a joke. By “functionally illiterate” I mean “unable to read and comprehend adult novels by people like Barbara Kingsolver, Colson Whitehead, and Richard Powers.” I picked those three authors because they are all recent Pulitzer Prize winners, an objective standard of “serious adult novel.” Furthermore, I’ve read them all and can testify that they are brilliant, captivating writers; we’re not talking about Finnegans Wake here. But at the same time they aren’t YA, romantasy, or Harry Potter either.




I’m not saying our students just prefer genre books or graphic novels or whatever. No, our average graduate literally could not read a serious adult novel cover-to-cover and understand what they read. They just couldn’t do it. They don’t have the desire to try, the vocabulary to grasp what they read, and most certainly not the attention span to finish. For them to sit down and try to read a book like The Overstory might as well be me attempting an Iron Man triathlon: much suffering with zero chance of success.

Students are not absolutely illiterate in the sense of being unable to sound out any words whatsoever. Reading bores them, though. They are impatient to get through whatever reading they have to do, and move their eyes over the words just to get it done. They’re like me clicking through a mandatory online HR training. Students get exam questions wrong simply because they didn’t take the time to read the question properly. Reading anything longer than a menu is a chore, to be avoided.

They also lie about it. I wrote the textbook for a course I regularly teach. It’s a fairly popular textbook, so I’m assuming it is not terribly written. I did everything I could to make the writing lively and packed with my most engaging examples. The majority of students don’t read it. Oh, they will come to my office hours (occasionally) because they are bombing the course and tell me that they have been doing the reading, but it’s obvious they are lying. The most charitable interpretation is that they looked at some of the words, didn’t understand anything, pretended that counted as reading, and returned to looking at TikTok.

This study says that 65% of college students reported that they skipped buying or renting a textbook because of cost. I believe they didn’t buy the books, but I’m skeptical that cost is the true reason, as opposed to just the excuse they offer. Yes, I know some texts, especially in the sciences, are expensive. However, the books I assign are low-priced. All texts combined for one of my courses cost between $35 and $100, and they still don’t buy them. Why buy what you aren’t going to read anyway? Just google it.


Even in upper-division courses that students supposedly take out of genuine interest they won’t read. I’m teaching Existentialism this semester. It is entirely primary texts—Dostoevsky, Kierkegaard, Nietzsche, Camus, Sartre. The reading ranges from accessible but challenging to extremely difficult but we’re making a go of it anyway (looking at you, Being and Nothingness). This is a close textual analysis course. My students come to class without the books, which they probably do not own and definitely did not read.

Writing

Their writing skills are at the 8th-grade level. Spelling is atrocious, grammar is random, and the correct use of apostrophes is cause for celebration. Worse is the resistance to original thought. What I mean is the reflexive submission of the cheapest cliché as novel insight.

Exam question: Describe the attitude of Dostoevsky’s Underground Man towards acting in one’s own self-interest, and how this is connected to his concerns about free will. Are his views self-contradictory?

Student: With the UGM its all about our journey in life, not the destination. He beleives we need to take time to enjoy the little things becuase life is short and you never gonna know what happens. Sometimes he contradicts himself cause sometimes you say one thing but then you think something else later. It’s all relative.

Either that, or it looks like this:

Exam question: Describe the attitude of Dostoevsky’s Underground Man towards acting in one’s own self-interest, and how this is connected to his concerns about free will. Are his views self-contradictory?

Student: Dostoevsky’s Underground Man paradoxically rejects the idea that people always act in their own self-interest, arguing instead that humans often behave irrationally to assert their free will. He criticizes rationalist philosophies like utilitarianism, which he sees as reducing individuals to predictable mechanisms, and insists that people may choose suffering just to prove their autonomy. However, his stance is self-contradictory—while he champions free will, he is paralyzed by inaction and self-loathing, trapped in a cycle of bitterness. Through this, Dostoevsky explores the tension between reason, free will, and self-interest, exposing the complexities of human motivation.

That’s right, ChatGPT. The students cheat. I’ve written about cheating in “Why AI is Destroying Academic Integrity,” so I won’t repeat it here, but the cheating tsunami has definitely changed what assignments I give. I can’t assign papers any more because I’ll just get AI back, and there’s nothing I can do to make it stop. Sadly, not writing exacerbates their illiteracy; writing is a muscle and dedicated writing is a workout for the mind as well as the pen.

What’s changed?

The average student has seen college as basically transactional for as long as I’ve been doing this. They go through the motions and maybe learn something along the way, but it is all in service to the only conception of the good life they can imagine: a job with middle-class wages. I’ve mostly made my peace with that, do my best to give them a taste of the life of the mind, and celebrate the successes.

Things have changed. Ted Gioia describes modern students as checked-out, phone-addicted zombies. Troy Jollimore writes, “I once believed my students and I were in this together, engaged in a shared intellectual pursuit. That faith has been obliterated over the past few semesters.” Faculty have seen a stunning level of disconnection.

What has changed exactly?

  • Chronic absenteeism. As a friend in Sociology put it, “Attendance is a HUGE problem—many just treat class as optional.” Last semester, across all sections, my average student missed two weeks of class. Actually it was more than that, since I’m not counting excused absences or students who eventually withdrew. A friend in Mathematics told me, “Students are less respectful of the university experience—attendance, lateness, e-mails to me about nonsense, less sense of responsibility.”

  • Disappearing students. Students routinely just vanish at some point during the semester. They don’t officially drop out or withdraw from the course, they simply quit coming. No email, no notification to anyone in authority about some problem. They just pull an Amelia Earhart. It’s gotten to the point that on the first day of class, especially in lower-division, I tell the students, “Look to your right. Now look to your left. One of you will be gone by the end of the semester. Don’t let it be you.”

  • They can’t sit in a seat for 50 minutes. Students routinely get up during a 50-minute class, sometimes just 15 minutes in, and leave the classroom. I’m supposed to believe that they suddenly, urgently need the toilet, but the reality is that they are going to look at their phones. They know I’ll call them out on it in class, so instead they walk out. I’ve even told them to plan ahead and pee before class, like you tell a small child before a road trip, but it has no effect. They can’t make it an hour without getting their phone fix.

  • It’s the phones, stupid. They are absolutely addicted to their phones. When I go work out at the Campus Rec Center, easily half of the students there are just sitting on the machines scrolling on their phones. I was talking with a retired faculty member at the Rec this morning who works out all the time. He said he has done six sets waiting for a student to put down their phone and get off the machine he wanted. The students can’t get off their phones for an hour to do a voluntary activity they chose for fun. Sometimes I’m amazed they ever leave their goon caves at all.

I don’t blame K-12 teachers. This is not an educational system problem, this is a societal problem. What am I supposed to do? Keep standards high and fail them all? That’s not an option for untenured faculty who would like to keep their jobs. I’m a tenured full professor. I could probably get away with that for a while, but sooner or later the Dean’s going to bring me in for a sit-down. Plus, if we flunk out half the student body and drive the university into bankruptcy, all we’re doing is depriving the good students of an education.

We’re told to meet the students where they are, flip the classroom, use multimedia, just be more entertaining, get better. As if rearranging the deck chairs just the right way will stop the Titanic from going down. As if it is somehow the fault of the faculty. It’s not our fault. We’re doing the best we can with what we’ve been given.

All this might sound like an angry rant. I’m not angry, though, not at all. I’m just sad. One thing all faculty have to learn is that the students are not us. We can’t expect them all to burn with the sacred fire we have for our disciplines, to see philosophy, psychology, math, physics, sociology, or economics as the divine light of reason in a world of shadow. Our job is to kindle that flame, and we’re trying to get that spark to catch, but it is getting harder and harder and we don’t know what to do.

Hilarius Bookbinder is the pseudonym for a tenured professor with an Ivy League PhD who writes Scriptorium Philosophia.

A version of this essay originally appeared in Scriptorium Philosophia.




The Ordinary Sacred


A Philosophy of Happiness Through the Uncurated Life


In 1953, Ernest Dichter—the father of motivational research—wrote that the American consumer was no longer purchasing soap to clean themselves, but to feel clean.

Advertising wasn’t selling products.

It was selling identity.

A bar of soap promised not just hygiene but moral worth.

Fast-forward seventy years, and you can trace a straight, greasy line from Dichter to Instagram meal-prep influencers, YouTubers waxing poetic about minimalism from $4,000 Herman Miller chairs, and Twitter productivity gurus who wake up at 4:30 a.m. to drink bulletproof coffee and document their sense of superiority.

We didn’t stop at soap. That instinct—to purchase meaning, to wear values like accessories—expanded until it swallowed almost everything. Now, even our meals, hobbies, outfits, and downtime are curated to project identity. Underneath it all is the same pitch Dichter uncovered: you are what you consume. Only now, it’s not just what you buy—it’s how you present it. Life itself is packaged, stylized, and sold back to the self.

Jean Baudrillard summed it up:

“We are at the point where consumption is laying hold of the whole of life, where all activities are sequenced in the same combinatorial mode, where the course of satisfaction is outlined in advance, hour by hour, where the ‘environment’ is total—fully air-conditioned, organized, culturalized. The beneficiary of the consumer miracle also sets in place a whole array of sham objects, of characteristic signs of happiness, and then waits (waits desperately, a moralist would say) for that happiness to alight.”

We’re miserable.

Aren’t we?

Lonelier, more anxious, more frustrated, and more exhausted than ever.

Not because we lack comfort or tools or access—but because we’ve staged too much of ourselves.

We’ve turned living into editing. When every bite is a performance, every outfit a brand decision, every hobby a pitch, there’s no space left for boredom. Or rest. Or actual pleasure. We scroll through each other’s highlight reels while quietly assembling our own, haunted by the suspicion that everyone else is doing it better—and forgetting to live in any of it.

I crashed hard after the pandemic. It wasn't cinematic, and it wasn't a breakthrough. It was dull. Exhausting. Slow. It was a kind of rot, spreading inward. At first, I thought I was just tired. Overstimulated, under-rested, another victim of collective burnout. But it got worse. My routines—ones I’d once clung to like scaffolding—started to feel grotesque. Wake up, hydrate, gratitude journal, morning sunlight, stretch, optimize. For what? I couldn’t answer. I didn’t even want to ask anymore. I’d spent years trying to build a life that looked good on paper, sounded smart in conversation, and played well on social media. I had all the systems, all the trackers, all the polished, adult routines. I was a walking Notion template. And...I was hollowed out. I wasn’t living—I was managing myself like a brand asset. I didn’t know how to stop.

What broke me wasn’t a single moment. It was the accumulation of hundreds of tiny ones that didn’t feel like living. Eating meals I didn’t enjoy but felt obligated to consume because they were “clean.” Turning walks into podcasts into productivity. Posting through loneliness and calling it community. Trying to be impressive while privately falling apart. I felt like a parody of myself—curated, competent, and completely numb.

There’s a Camus quote I keep coming back to:

“At any street corner, the feeling of absurdity can strike any man in the face. As it is, in its distressing nudity, in its light without effulgence, it is elusive. But that very difficulty deserves reflection. It is probably true that a man remains forever unknown to us and that there is in him something irreducible that escapes us.”

Absurdity is precisely the word.

One night, I just sat on the floor and thought: none of this is working. None of this is helping. I’ve done everything modern culture told me to do to be happy, successful, fulfilled. I was tracking everything but feeling nothing. That was the moment something cracked open in a quiet, defeated realization: I don’t want to live like this. I don’t want my life to be a performance. I don’t want to be optimized. I just want to feel human again. I want to be messy. Boring. Unimpressive. Real.

In the months and years since the pandemic's peak, I've been unable to reconcile the cognitive dissonance. Seeing the inauthenticity and performance of modern happiness has made it impossible to achieve happiness through the same means. There's a falseness to it all, a sense of how fragile the facade actually is.

After the collapse, after the burnout, after the creeping dread that none of the things I’d been told to care about were making me feel human, I started noticing what actually felt good. Not "aspirational" good. Not "productive" good. Just good. A grilled cheese sandwich eaten in the sun. A day without notifications. Saying no and not explaining. I didn’t see it as a philosophy. I just knew I felt less fake. Less hollow. Less like I was performing a version of myself I couldn’t stand anymore. Over time, I started tracing a pattern. What if I stopped managing my life like a brand? What if I let it be messy, private, low-stakes? What if that was enough?

The Ordinary Sacred: A Philosophy of Uncurated Life

The Ordinary Sacred is my idea for a philosophy of presence without spectacle. A life without audience. A refusal to curate the self into something consumable. It honors sufficiency over scale, texture over narrative, and experience over optics. It says: the real, unpolished, unposted life is enough—and always was.

It names the thing without dressing it up. It doesn’t try to win attention. It doesn’t demand belief. It simply offers a shift—a posture of attention, a refusal to perform, a commitment to being here. And if it sounds pretentious, hell, you should see the rest of my blog.

The Ordinary Sacred turns directly toward the mundane. This is where meaning lives. In unremarkable afternoons. In laughter that goes unrecorded. In friendships that don’t need captions. In lives that never go viral. In meals that don’t “flush toxins” or whatever absurd promise is making the wellness rounds this week.

This is the closest I’ve come to a working, personal theory of happiness. Not the performative kind. Not the kind with a morning routine and a five-year plan. Not the kind you can monetize, or coach, or convert into bullet points or turn into a cult. The kind you feel in your teeth when you bite into something good, or in your chest when you laugh too hard, or in your shoulders when you realize—for the first time all day—that you’re not clenching.

It starts with a full-body, bone-deep refusal. To stop turning your life into content. To stop polishing your edges so you can be easier to consume. To stop translating every moment into something brand-safe, clever, legible. To stop acting like joy only counts if it comes with a graph and performs well in metrics.

We live under systems—economic, cultural, digital—that demand we strive to be impressive. Inspirational. Aspirational. Permanently visible. Permanently performing. Eternally, achingly unsatisfied. We’re trained to ask, before doing anything: Will this make good content? Will this signal something useful? Will this get me closer to who I’m “supposed” to be?

The Ordinary Sacred says: fuck all that. Be ordinary. Be quiet. Be offline. Do things because they feel good, or because they’re funny, or because they’re yours. Not because they’ll look good later. What's most valuable isn't what's exceptional, curated, or performative, but rather what's common, authentic, and directly experienced.

It’s not anti-tech, or anti-ambition, or a cry to return to the earth from whence, etc. You can still use GPS. You can still have a favorite brand of headphones. You don’t have to churn butter or reject civilization or free float into the void. You just have to stop selling yourself to yourself.

It’s lo-fi, low-stakes, non-viral. It’s fully alive. It gives you permission to breathe. To be boring. To be happy in a way that no one else gets to gatekeep or approve.

TLDR: The Ordinary Sacred:

There are no habits, wellness tips, or life hacks here. Just solid, practical, and deeply personal decisions in a culture that pushes all of us to perform. They’re not the whole picture, just the first four that made sense.

  1. Eat for yourself, not for anyone else — Real food, real joy.
  2. Work to pay your bills, not to validate your worth — Labor is not identity.
  3. Buy things, not signals — Use and want over performance.
  4. Live your own life, not for your feed — Escape is freedom.

Eat For Yourself

Wellness influencers parade turmeric like it’s a gift from an alien super being. Kale is fetishized. Everything is raw, anti-oxidized, adaptogenic. Food is now a purification ritual for people afraid of living in their bodies. It’s asceticism disguised as luxury. We pretend a spirulina bowl is satisfying. It’s fucking not. It tastes like damp grass and self-denial.

Food has become a moral minefield. We talk about "clean eating" like impurity is contagious. We moralize portions. We collapse nutrition into discipline, and discipline into righteousness. It’s no longer just a question of what you eat—it’s a referendum on who you are. Hungry? Control it. Ate something with cheese? Better confess. Craving carbs? Repent.

We count calories like sins. We call dessert a "guilty pleasure." We look at our plates and think about punishment. All of it delivered in a glossy wrapper of faux empowerment—"wellness" that is just diet culture with a better font.

And this grift hits hardest where it always does: at the intersection of class, image, and shame. The wellness industry sells restriction as virtue, but only if it comes in the right package. Smoothie bowls cost $14. Alkaline water is aspirational. Fasting is chic if you’re thin and rich, disordered if you’re poor and fat. We have created an entire language of virtue and failure out of what people put in their mouths. And we judge. Constantly.

It's not health. It's a caste system built on quinoa and quiet cruelty.

Food is not fuel. It’s food. It’s greasy, salty, sweet, hot, cold, cheap, satisfying. It’s a double cheeseburger from a place with flickering lights. It’s fries at midnight, cereal for dinner, gas station snacks on a road trip. It’s eating something because it smells good, not because it fits your macros or matches your aesthetic. It doesn’t need to be “activated.” It doesn't need to be spiritualized. It doesn't need to be posted.

Eat the thing. Eat it hot, with your hands, juice on your chin. Don’t wait for the right angle. Don’t explain it. Don’t count it. Taste it.

And when you taste it—really taste it—you’ll understand why Brillat-Savarin, the French epicure and proto-food philosopher, opened his Physiology of Taste with this line: “Tell me what you eat, and I will tell you what you are.” He wasn’t moralizing. He was celebrating. He argued that pleasure from food was not only real, but necessary: “The pleasures of the table belong to all ages, to all conditions, to all countries, and to all areas; they mix with all other pleasures and remain at last to console us for their loss.”

Food isn’t a weakness. It’s a foundation. It outlives beauty, relevance, even sex. It is the last pleasure to go. And still, we punish ourselves. We eat alone and scroll silently past staged plates that make us feel worse. We are told to crave less, shrink more, fast longer.

And we carry that punishment into our bodies. The obsession with weight has become its own religion—worship of thinness masquerading as discipline, devotion measured in deprivation. Your worth reduced to numbers: grams, inches, pounds. Your character tied to how well you say no to hunger. This isn’t health. This is control.

Eat with others. Eat things that drip. Let food be messy and close and real. As M.F.K. Fisher wrote in 1943, during wartime rationing no less, “Sharing food with another human being is an intimate act that should not be indulged in lightly.” That intimacy—the kind born from greasy hands passing a burger across a table—is more nourishing than any supplement, any powder, any smug, made-up-toxin-free pile of performative buckwheat.

Seneca, for all his Stoicism, passed along this advice from Epicurus approvingly: “We should look for someone to eat and drink with before looking for something to eat and drink.” Eating alone isn’t just lonely. It’s unnatural.

A cheeseburger—greasy, simple, immediate—is not a compromise. It’s an honest answer. You don’t have to justify it. You just have to chew.

Simple happiness honors appetite. It mocks dietary performance art. It sees virtue signaling in quinoa bowls and says: you are allowed to want salt. You are allowed to want fat. You are allowed to be full. You are allowed to be more than full. You are allowed to enjoy yourself and your self.

The world doesn’t need more photographed smoothie bowls. It needs more loud laughter over shared fries, more sauce on shirts, more meals that begin with hunger and end in satisfaction, not shame. That’s living. That’s the ethic.

This—this mess, this unfiltered joy—is part of what The Ordinary Sacred defends. A return to trust. In our bodies. In our appetites. What we eat is not a performance. It’s a reunion with our selves, with each other, with the parts of life we’ve been taught to edit out.

Work to Pay Your Bills

The modern economy is a pantomime of purpose. "Do what you love," we are told. As if love is scalable. As if passion is billable. As if your deepest sense of meaning should report to a quarterly KPI.

It sounds harmless, even inspiring, until you realize it’s a trap. Once you collapse identity into labor, every moment becomes a performance review. You’re not just working. You’re branding. You’re optimizing. You’re pretending that the thing you’re doing for rent is your soul’s calling. And if it isn’t? Then you’ve failed—not just professionally, but existentially.

There’s something uniquely exhausting about being alive in an era where your job description is expected to double as your identity. You’re not a coder—you’re a Ninja, a Guru, an Evangelist. The Ordinary Sacred looks at all that and laughs. No one needs your LinkedIn headline to be poetic. Just make rent. Go home. Be a person.

Work is work. You show up. You do the thing. You clock out. That doesn’t make you lazy. That makes you sane.

And yes, there’s dignity in labor. But there’s also dignity in limits. Working more doesn’t make you more of anything unless it brings in more money to support your family and live your life. You can be brilliant and not be busy. You can be wise and take naps. You can be proud, virtuous (if that's your schtick), and completely unscalable.

There's comfort here in Aristotle. In Politics, he writes, “the end of labor is to gain leisure,” and defines leisure not as rest from exhaustion, but as the space in which real life begins. It’s attention. It’s spaciousness. It’s where we cultivate thought, relationships, and the kinds of things you can’t track on a company dashboard. But it’s also binge-watching a show until your brain goes blank because, fuck it, that’s what you want to do, sitting on the couch at the end of the day with the person you love more than anyone else in the whole damn world and cycling through episodes of Star Trek. Because that’s where the happiness is.

Seneca picked up that thread centuries after Aristotle. In his essay On the Shortness of Life, he observed, “It is not that we have a short time to live, but that we waste a lot of it… Life is long if you know how to use it.” He was writing to Romans who were overcommitted, overextended, and distracted by ambition. Sound familiar? His solution wasn’t to do more. It was to stop mistaking busyness for worth.

In his 1932 essay In Praise of Idleness, Bertrand Russell argued that the modern worship of work had become a sickness. “The morality of work,” he wrote, “is the morality of slaves, and the modern world has no need of slavery.” What he proposed wasn’t laziness—it was balance. He imagined a world where people worked less not because they lacked ambition, but because they had other things worth doing: reading, thinking, walking, being. His version of a good life wasn’t crammed with productivity. It was spacious enough for thought, curiosity, and actual rest. Work wasn’t the point. Living was.

But here we are—half-human, half-product, refreshing inboxes and waiting for the dopamine of a Slack ping. Watching people post hustle reels while secretly googling burnout symptoms.

Stop making your job your personality. Pay your bills. Turn off your phone. Leave the spreadsheet in the cloud and go make dinner. You are not a commodity. You are not your engagement metrics. You are not your title or your side hustle or your carefully sharpened personal brand.

You are a person. Work enough to sustain that. Then stop.

Buy Things, Not Signals

There are whole industries now that don’t sell products—they sell proof. Proof that you’re tasteful. Proof that you’re refined. Proof that your life is intentional, curated, and on trend. There are couches you can’t nap on, plates you don’t eat off, clothes you only wear to signal restraint. These things don’t serve life. They stage it.

We’ve convinced ourselves that taste is the same thing as virtue. That a beautiful apartment is a moral accomplishment. That buying the right lamp means you’ve achieved some kind of internal alignment. But what’s really happening is this: we’re buying our own dissatisfaction. We’re furnishing for the feed. We’re building rooms we don’t relax in. Can’t relax in. Can’t even live in.

Curation culture is a profound narrowing. It's conformity with better lighting. You are allowed to be unique, but only in ways the algorithm understands: neutral tones, soft textures, minimalist aesthetics with maximum expense. Your life becomes a set. And you? A prop.

The Ordinary Sacred doesn’t care what your living room looks like on camera. It doesn’t care if you have matching decanters or poured concrete bookends. It asks: is your life usable? Do your things serve you, or do you serve them?

This tension isn’t new. The Stoics wrote about it constantly—not to glorify suffering, but to keep the focus where it belonged. Musonius Rufus, in his Lectures, writes:

“We must not admire those who own great possessions, but those who have the strength to do without them. For it is not he who has little, but he who desires more, that is poor. The man who is not in need is not the one who has much, but the one who can go without much.”

That’s not minimalism-as-aesthetic. That’s minimalism as escape from the mental leash of consumer anxiety. From the constant itch of want. From the cultural myth that more—prettier, sleeker, more tasteful—is the same as better.

Socrates, when walking through the Athenian marketplace, is reported to have said, “How many things there are in this world of which I have no need.” Imagine saying that in a shopping mall. Imagine feeling it in your bones while scrolling through influencer home tours. It’s not asceticism. It’s relief.

Aim lower. Buy what works. Buy what brings you real joy, not curated aspiration. You don’t need a capsule wardrobe. You need pants that don’t dig into your ribs when you sit. You don’t need artisanal Japanese storage boxes. You need to be able to find your damn keys.

This is not an attack on beauty. Beauty matters. But performance is a parasite that feeds on it. When the primary function of an object becomes its optics, it’s already dead. A beautiful object that demands anxiety isn’t beautiful. It’s a burden.

There is no moral prize for restraint. No award for matching the algorithm’s idea of dignity. Buy dumb stuff. Buy ugly mugs that feel good in your hand. Buy cozy blankets in loud colors. Buy the thing that reminds you of your childhood, even if it’s kitsch. Let your home look like you actually live in it.

Do it for you. Not for the performance.

Live Your Own Life

This is the big one. This is the hardest one. Because even if you eat what you want, and work to live, and buy things that serve you, there’s still the temptation and the judgement of the performance. The hum. The constant background noise of the self being watched.

And worse: the constant comparison.

You check your reflection in the café window. But you’re not really looking at yourself. You’re comparing that reflection to someone else’s post. Someone else’s morning. Someone else’s outfit, hair, discipline, joy. You half-laugh and wonder how it would’ve looked on camera. You draft the caption for the moment before the moment has even ended. You narrate your own life in real time, for an imaginary audience who may or may not ever see it. Every good day is content-in-waiting. Every bad day is a potential comeback arc. We have internalized the lens. And through that lens, we are constantly measuring.

Their kitchen is cleaner. Their life is quieter. Their goals are clearer. Their grief is more graceful. Their version of authenticity looks better than yours feels.

Social media didn’t create this instinct—it just strapped a monetization engine to it. Humans have always sought approval. But now we seek it obsessively, constantly, publicly, and with analytics. Everything becomes a pitch. Even our sincerity starts to feel rehearsed. Erich Fromm, in The Sane Society, warned of the rise of what he called the “marketing orientation,” where a person’s value is determined by their ability to sell themselves. “The self is experienced as a commodity,” he wrote, “whose value and meaning are extrinsic to the self and lie in its exchangeability.”

And what is Instagram but an exchange market for identity? What is TikTok but a derivatives market for personality? We are priced by our virality, our reach, our relevance. Every scroll becomes a series of self-interrogations: Who’s doing life better? Who’s further along? Who’s winning?

Get out. Get the fuck out. Not forever. Not in dramatic fashion. Just enough to find your pulse again. Just long enough to remember that you were never supposed to be measured against everyone else’s highlight reel.

Go for a walk and don’t track your steps. Sit in a room without documenting the lighting. Tell a joke that doesn’t get recorded, doesn’t get retweets, doesn’t get remembered by anyone but the person you told it to. Laugh in a way that’s too loud. Be unflattering. Be unreadable. Wear the weird shirt. Take the ugly photo and don’t post it.

I’m not flogging the dead-horse of "authenticity." That word has been bled dry by marketers. Everything is "authentic" now—packaged transparency, curated vulnerability, aesthetic relatability. What we’re talking about is something rarer: being invisible. Not erased. Not ashamed. Just free from the eyes that don’t matter.

There is a kind of peace in the unrecorded life. A soft, slow quiet that doesn’t ask you to perform. Montaigne understood this long before the internet existed. "I want to be seen as I am," he wrote, "neither better nor worse." And yet, he also kept most of his thoughts private. “The greatest thing in the world is to know how to belong to oneself.”

Be unrecorded. Be untagged. Be.

Marcus Aurelius, writing to himself in Meditations, saw the trap of reputation clearly. “Waste no more time arguing what a good man should be. Be one.” Be. And later: “Do not waste what remains of your life in speculating about your neighbors... how he did this or that, or what he said, or thought, or schemed. Look instead to what you have to do yourself.”

And Epictetus was even blunter: “If you want to improve, be content to be thought foolish and stupid.” It’s the opposite of branding. It’s the rejection of legibility. Improvement, he suggests, is internal. The rest is noise.

In Being and Time, Heidegger argued that modern life alienates us from authentic existence by pushing us into the “they-self”—a mode of being where we exist primarily through the eyes of others. We become anonymous, absorbed in what "they" do, think, expect. "Everyone is the other, and no one is himself," he writes. Social media has weaponized the they-self. And we volunteered.

You can undo that. Not with slogans. With practice. With attention. You can ask: who are you when you’re not being watched? Who are you when nothing is being documented? Who are you without the mirror of other people’s reactions?

You do not need to be legible. You do not need to be a narrative. You do not need to be consumed. You need to be present.

Attention is sacred. Give it to your life—not to the algorithm. And for the love of your peace: stop measuring. You were never supposed to be someone else’s content. And they were never meant to be yours.

The Counterarguments

Isn’t this just another aesthetic? Isn’t this just a different kind of performance—anti-gloss as gloss, curated messiness, low-res virtue signaling? Isn’t it just normcore for burnout millennials?

That’s the risk. Absolutely. Every idea can be flattened into content, ironized into branding, merchandised into a lifestyle product with a logo and a tagline. But that doesn’t mean the idea is empty. It means it’s vulnerable. It means you have to guard it.

In The Society of the Spectacle, Guy Debord warned that “everything that was directly lived has moved away into a representation.” Experience becomes image. Life becomes theatre. Even revolt becomes performance. “The spectacle,” he wrote, “is not a collection of images; it is a social relation among people, mediated by images.”

Which means that anything—including this—can be eaten by the spectacle and turned into another flavor of capitalism. That doesn’t make it meaningless. And it doesn't make it any less precious.

Living the Ordinary Sacred doesn’t mean you disappear. It doesn’t mean you never post again, or live in a cabin without Wi-Fi, or become the annoying guy at parties who lectures people about Instagram. It means you post less, and later, and without expectation. You don’t stage your joy. You don’t curate your dinner. You don’t turn every genuine thing into a teaser for your personality.

You let some things remain yours. Not in the sense of ownership. In the sense of privacy. In the sense that no one else gets to consume or be consumed by them.

Montaigne knew it, even in the 16th century. He wrote, “A man must keep a little back-shop, all his own, wherein to be himself, without reserve.” Not everything goes in the window. Not everything is for show. “If we have been able to live well and think well,” he continued, “we have lived enough.”

You are allowed to have a self that isn’t publicly vetted. You are allowed to experience a moment that doesn’t become story. You are allowed to live outside the logic of performance.

The Ordinary Sacred is a practice. And like any practice worth doing, it requires restraint. You resist the urge to narrate, to beautify, to stage. Not because those things are evil, but because they’re constant—and you need space to hear yourself think.

You can still enjoy the internet. You can still have fun online. But the minute your real life starts feeling like a draft folder for future posts, something is broken. The Ordinary Sacred is the attempt to stop that collapse. To create space between the event and the upload. To protect the sacred ordinary from the economy of spectacle.

This isn’t countercultural for the sake of it. It’s countercultural because the culture is sick. You don’t need to reject technology. You need to reject the demand to turn your soul into a feed.

Keep something for yourself. Leave something unposted. Let your life be bigger than your screen.

I'm not claiming this is a new idea. It draws from a lineage of thinkers and traditions that insisted the sacred was never elsewhere. It was always in the here and now—if you knew how to look.

Zen Buddhism gives us the clearest gesture: chop wood, carry water. Enlightenment isn’t found in escape or spectacle. It’s found in full presence within the ordinary. The practice is the life. The life is the practice.

Phenomenology, especially in the work of Merleau-Ponty, roots all meaning in lived experience. There is no truth apart from the body, from perception, from being-in-the-world. The ordinary isn’t something to transcend. It’s the only ground we’ve ever had.

The Transcendentalists—Emerson, Thoreau, Fuller—understood that what’s called "divine" doesn’t live above us. It lives around us. Underfoot. In trees and rivers and silence and solitude. Thoreau wrote: “I went to the woods because I wished to live deliberately… and see if I could not learn what it had to teach.” He didn’t find truth in institutions. He found it in the texture of everyday life.

Thomas Moore, in Care of the Soul, wrote about tending to the world, not overcoming it. He reframed the sacred as something found in dishes, conversations, melancholy, repetition. Not transcendence. Depth.

Annie Dillard, in Pilgrim at Tinker Creek, knew a kind of fierce noticing. She found the infinite in the specific. The sacred in the granular. “How we spend our days is, of course, how we spend our lives,” she wrote. No spectacle. Just close, attentive being.

Where This Leaves Me

I got here the long way around. Through collapse. Through burnout. Through the slow erosion of energy, interest, and self. Through all the self-improvement and content and aesthetics and hustle and intention and systems and routines—until one day I just couldn’t do it anymore.

I kept looking for something that would make my life feel like mine. Something to click. Something to unlock the sense that I was really in it. Instead, I just got more polished. More watchable. More brandable. More tired.

I thought quitting would look like failure. But it didn’t. It looked like sitting on the floor and breathing. It looked like making scrambled eggs without narrating it. It looked like walking without tracking. It looked like being a person instead of a project.

There was no revelation. Just the quiet repetition of No.

No, I’m not posting that. No, I don’t care if this aligns with my "voice." No, I'm not tracking this. No, I’m not optimizing this meal, this moment, this body, this day.

And then—without realizing it—I was living differently. Still online. Still in the world. Just… off the feed. Out of the frame. Inside something I hadn’t felt in a long time: my own life.

That’s what The Ordinary Sacred is. You can't buy it. You can't sell it. It’s not an app. It’s not a coaching program. It’s not a proprietary system.

It’s just a way to live without submitting your life for approval.

It’s the dignity of being a person.

🍕
My goal this year is to make Westenberg and my news site, The Index, my full-time job. The pendulum has swung pretty far back against progressive writers, particularly trans creators, but I'm not going anywhere.

I'm trying to write as much as I can to balance out a world on fire. Your subscription directly supports permissionless publishing and helps create a sustainable model for writing and journalism that answers to readers, not advertisers or gatekeepers.

Please consider signing up for a paid monthly or annual membership to support my writing and independent/sovereign publishing.



Boolean Clashes: Discretionary Decision Making in AI-Driven Recruiting


As artificial intelligence (AI) systems increasingly mediate our social world, regulators rush to protect citizens from potential AI harms. Many AI regulations focus on assessing potentially biased outcomes of AI. But AI systems are always embedded into social contexts and decision-making processes that are typically distributed across a range of human and machine agents. Bias and discrimination can occur anywhere in this human-machine network. Only focusing on potentially biased outcomes of an AI system will not fix the bias and discrimination problems that are integral to the whole human-machine network. Addressing this issue means focusing AI accountability approaches on practices and processes, rather than just machines or just humans.

Let’s take the world of recruiting as a case study. Recruiting has become a frontier of AI-driven automation. AI recruiting tools support candidate search on job platforms, candidate screening (such as video interviewing or technical interviews to test coding skills), the crafting of job descriptions, and the integration of AI (for example, chatbots) into applicant tracking systems. Using these tools can also produce instances of discrimination. Infamous examples include Amazon’s sexist hiring AI[1] and Facebook’s ageist and gendered job ads.[2] Fueled by the COVID-19 pandemic and demand for remote recruiter-candidate interaction, the human resources (HR) tech market is large, and it continues to grow (projected to reach $39.90 billion by 2029).[4]

Even as particularly problematic tools are retired, issues of technology-mediated and AI-accelerated bias and discrimination persist. AI tools used in candidate assessments (such as interviews or tests) are prone to error, often disadvantage certain populations,[6] or are based on pseudo-scientific constructs.[7] Regulators are paying heightened attention to the use of AI in recruiting and employment, with influential regulation focusing explicitly on AI in HR as a “high risk” area of AI deployment.[3]

But discrimination that persists in HR cannot be attributed solely to the AI. It is the result of a complex sociotechnical system that includes both AI and the many people engaged in HR processes and practices. In recruiting alone, this includes sourcing specialists, talent acquisition managers, recruiters, hiring managers, HR administrators, and others, who interact with and potentially make decisions about candidates. Various AI systems and other technologies are spread across that network of actors. What is needed to mitigate potential discrimination and harm is a closer look at the professional practice of recruiting, how recruiting professionals use and make sense of AI systems, and how this affects their discretionary decision making.

Keeping It Old School: The Persistence of Boolean Search

In low-volume recruiting (that is, recruiting from a scarce talent pool so finding candidates is hard), recruiters’ traditional professional practice revolves around Boolean search. When searching for talent in databases, they assemble the specifications of the job into what they anticipate will be a powerful Boolean string. A professionally crafted Boolean search string designed to locate a computer programmer who has experience with a particular group of programs and possesses leadership skills might appear as shown in Figure 1.

Figure 1.  Real-world Boolean string for sourcing shared by research participant.

The Boolean search method is grounded in binary logic with a simple premise that statements can only be true or false. It has transcended its mathematical origins to become a cornerstone of information retrieval writ large (as anyone who has been taught to use a library catalog knows). Boolean search allows users to express the relationship between keywords in a search, rather than just the presence of the keywords (see Figure 2). Key for this are the three operators AND, OR, and NOT. Using the AND operator narrows the search by including only the results that contain all the specified keywords. The OR operator broadens the search to include results that contain either of the chosen keywords. The NOT operator excludes results that contain the keyword following it.

Figure 2.  Basic Boolean logic (source: Jakub T. Jankiewicz, Wikimedia Commons).
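The narrowing and broadening behavior of these operators can be sketched in a few lines of Python. This is a toy illustration only: the profile strings are invented, and real sourcing tools apply this logic to structured candidate records rather than raw text.

```python
import re

def matches(profile, all_of=(), any_of=(), none_of=()):
    """Toy Boolean filter: AND terms must all appear, the OR group needs at
    least one hit, and NOT terms must be absent (whole-word, case-insensitive)."""
    def has(kw):
        return re.search(r"\b" + re.escape(kw.lower()) + r"\b",
                         profile.lower()) is not None
    return (all(has(kw) for kw in all_of)
            and (not any_of or any(has(kw) for kw in any_of))
            and not any(has(kw) for kw in none_of))

# Hypothetical profiles, loosely mirroring a sourcing string like
# developer AND (python OR java) NOT recruiter
profiles = [
    "Senior developer, Python and Go, led a team of five",
    "Java developer turned technical recruiter",
    "Frontend developer, TypeScript only",
]
hits = [p for p in profiles
        if matches(p, all_of=("developer",), any_of=("python", "java"),
                   none_of=("recruiter",))]
```

Note how each operator is directly inspectable: dropping `"recruiter"` from `none_of` or adding a keyword to `any_of` changes the result set in a way the user can predict, which is exactly the transparency the article describes.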

Boolean logic operates within recruiters’ minds as they carefully select the most fitting keywords for the role they are trying to fill. Working to match job specifications with ideal candidates, they turn Boolean search across vast candidate databases (such as LinkedIn) into an epistemology—a way of knowing and understanding the world of potential hires.[8]

Constructing a Boolean search string for finding fitting job candidates is not merely a technical exercise. It is a labor-intensive and iterative process that demands creativity, analytical rigor, and often years of experience. Typically, recruiters invest considerable time and effort in iteratively refining their Boolean search strings. Each keyword selection, operator placement, and logical structure acts as a deliberate choice, designed to surface the “right” candidate profiles. This process transcends mere keyword lists; it demands an iterative dance between logic and intuition, honed through experience and a deep knowledge of the target talent pool. For a recruiter, finding the “perfect” Boolean string is like finding a vein of gold. It can vastly improve the efficiency and efficacy of candidate search for a specific role, or type of role. Boolean search allows recruiters to iteratively adapt their queries in real time based on the feedback provided through the search engine results. This is where recruiters can exercise the discretionary decision-making power that is the essence of their job: making the decision on who is the “right” candidate.

In traditional, non-AI-driven information retrieval by way of Boolean search, recruiters can easily discern the relationship between the keywords expressed in the Boolean string and the search results. This gives them discretionary room to maneuver. They can rely on the search engine to faithfully execute their query, and they can predict the effects of tweaks to their Boolean expression. In other words, traditional Boolean search provides the kind of transparency recruiters require for the discretionary decisions that are specific to their profession.

AI-Driven Search and Boolean Epistemology

In AI-driven candidate search, however, the system is not faithfully delivering on the keyword relationship expressed in the Boolean string. AI systems, including generative AI systems that respond to prompts, are calibrated to produce statistically probable outputs based on a search or prompt (as well as various unknown factors, such as previous search behavior), rather than on the precise keyword relationship. Here, the system interprets the keywords in ways that are not discernible (and therefore not actionable) by recruiters. For example, a recruiter may include the term “New York City” in the string because they need a candidate who is based in New York City for tax reasons. The AI may interpret this in undesirable ways and, for example, suggest candidates as “most relevant” (and ranked at the top of the search results) who are based in Hoboken, NJ, USA. A new search with the exact same Boolean expression run a few hours later may show candidates based in the Hudson Valley, NY, USA.

It remains unclear to the recruiter how and why the AI system powering the search made this leap. The Boolean epistemology that recruiters traditionally deploy affects how they make sense of AI and influences whether and where mistrust and potential bias manifest in HR. The “interpretive lift” undertaken by the AI system is palpable but never consistent or squarable with the professional epistemology recruiters use, and it curbs the discretionary decision space available to them. They cannot tweak the keywords to better understand the effects of each one on the search result. In AI-driven search, these causal effects cannot be known by recruiters. In other words: Boolean epistemology and AI epistemology clash.

Navigating the Chasm

The clash between these two epistemologies leads recruiters to mistrust the AI systems that their own employers often require them to use. Recruiters are acutely aware of the “epistemological clash.” They know that, through the machine learning feedback loop (in which data generated through the interaction with an AI system re-enters the system and affects its predictions), their interactions with AI-driven search engines are recorded and affect subsequent search results. To preserve their discretionary decision-space (which is central to their professional identity), recruiters sometimes try to “neutralize” the AI-driven search system or “confuse” it. They may input a vast range of different Boolean search strings, save all results in a separate spreadsheet, and manually comb through them. They may also manually infer features they deem important rather than relying on the machine to do it. For example, they may infer gender or racial identity from location or educational background to try to ensure a diverse candidate pool and avoid bias and discrimination.

Viewed as a larger socio-technical work system, recruiters’ interactions with AI-driven search tools reclaim discretionary capacity and allow them, not machines, to make decisions about candidates. This involves substantial work as Boolean searches must be meticulously composed and continuously tweaked, which reduces the alleged time-saving value of AI systems. It also demonstrates how AI-driven recruiting systems may be used in ways that sustain, rather than curb, issues of (human) bias and discrimination.

Thus, it is insufficient to address AI discrimination by looking at the potentially biased outcomes of an AI system. A more nuanced approach is needed as the fields of AI ethics, accountability, and transparency progress, and as AI regulation becomes more common. This becomes particularly important as generative AI systems enter the HR space, making the AI’s interpretation of search commands or prompts even less transparent and adding the risk of “hallucinations.”

Understanding how professional discretion is affected by new forms of AI-driven automation, within and beyond HR, is extremely important. We must treat the black box of AI as a socio-technical phenomenon in which professional epistemologies and practices clash with hidden AI functionalities. Concretely, this means integrating work practices and decision-making processes into AI accountability efforts. Only by taking this larger systems view can we avoid the “many hands” problem that makes it so hard to identify who is responsible for the harms that computer systems can cause.5 Centering what people are doing and how—including with machines—rather than treating machines as the sole focus of regulatory attention, can help address the continuation of human-machine bias.

Conclusion

AI functionalities clash with the Boolean epistemology of candidate search in professional recruiting. This encourages human intervention and enables continued employment bias and discrimination. Employment fairness is of enormous ethical importance, but HR recruitment is just one of many areas of life where AI has been implicated in bias and discrimination. Focusing solely on AI opacity as the cause of bias and discrimination misses the fundamental socio-technical nature of the phenomenon and points to ineffectual solutions.

We are in urgent need of more empirically grounded research on how AI is actually used so that we understand and address where and how bias and discrimination can occur in the distributed human-machine decision-system networks that influence important life outcomes. This is increasingly urgent with the rise of generative AI technologies such as ChatGPT and their rapid adoption. Focusing accountability approaches on practices, processes, and technologies rather than just machines or just humans, is a crucial first step toward building a just society.

Unfortunately, This Is What the Soft Animal of Your Body Loves

Unfortunately, These Are the Things the Soft Animal of Your Body Loves

Editor’s Note: This work is an irreverent riff on Mary Oliver’s iconic poem “Wild Geese.”

Putting bugles on the ends of your fingers to make cunning little claws because you do not have to be good.

Being crushed under eleven weighted blankets like a cozycore Giles Corey. Meanwhile the world goes on.

Eating a deconstructed PB&J by putting a family of things (a peanut, a grape, a crouton) into your mouth all at once.

Honking and strutting around like a regal, wild goose, over and over announcing your place, while you wait for your Dunkin’ Donuts order to be ready.

Offering yourself to the world’s imagination by growing out very long bangs down to your chin and yelling “LET THE SHOW BEGIN” whenever you part them to reveal your face.

Accidentally “composting” in the crisper drawer until you create a terrarium, a landscape of deep trees and sprouting radishes.

Collecting a nest of small trinkets and clear pebbles of the rain because you have been oh so very good.

/

Walking on your knees, then crawling upon your belly and tonguing crumbs out of the carpet.

Curling up inside the harsh, exciting armpit of your mate and taking a small nap.

The post Unfortunately, This Is What the Soft Animal of Your Body Loves appeared first on Electric Literature.

How Pebbles Form Planets

The secret to the formation of planets may lie in ordinary static electricity—the same phenomenon that can make your hair stand on end or give you an electric shock after walking across a carpet.

A new study, published in Nature Astronomy, suggests that static electricity allows tiny dust particles in protoplanetary disks—the rotating platters of gas and dust that form around young stars—to clump together into “pebbles” that are large enough to play a role in the formation of planets.

The image above shows basaltic beads, each measuring 0.55 millimeters, that were used in an experiment, which took place aboard a suborbital rocket.

The findings help resolve a mystery that has surrounded the bouncing barrier—the size threshold that particles must reach in order to rely on gravity to join with other particles—says the study’s lead author, Jens Teiser, an astrophysicist at the University of Duisburg-Essen in Germany.

The dust particles need static electricity to make them “sticky” enough to cluster into pebbles that can form planets.

Only when particles grow larger than this threshold size—roughly a quarter of an inch, depending on conditions—can they eventually join to form rocky “planetesimals,” from about half a mile to 100 miles across, that scientists think then collide within protoplanetary disks to create planets like Earth.

Smaller “dust particles don’t stick together,” Teiser says, unless they have an electrostatic charge.

Static electricity is produced when different objects with an imbalance of positive and negative charges make contact, which results in an electrostatic charge. In this case, the electrostatic charge is generated by collisions between tiny dust particles, which can cause them to either gain electrons or lose electrons, resulting in a negative or a positive charge, respectively. Oppositely charged particles will then attract each other—according to the law of electrostatics—and can clump together to create even larger charged particles, Teiser says.
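To see why electrostatic attraction matters at this scale, a rough order-of-magnitude comparison helps. The sketch below compares the Coulomb attraction between two charged 0.55-millimeter beads with their mutual gravity; the basalt density and the per-bead charge are assumed values for illustration, not figures from the study.

```python
# Hypothetical order-of-magnitude sketch (density and charge assumed):
# electrostatic vs. gravitational attraction between two 0.55 mm beads.
import math

G = 6.674e-11        # gravitational constant, N m^2 / kg^2
K = 8.988e9          # Coulomb constant, N m^2 / C^2

diameter = 0.55e-3   # m, bead size reported in the article
radius = diameter / 2
density = 3000.0     # kg/m^3, typical basalt (assumed)
charge = 1.6e-13     # C, ~10^6 elementary charges (assumed tribocharge)

mass = density * (4 / 3) * math.pi * radius**3
separation = diameter  # center-to-center distance when beads touch

gravity_force = G * mass**2 / separation**2   # ~1e-17 N
coulomb_force = K * charge**2 / separation**2  # ~1e-9 N
```

Under these assumptions the electrostatic pull exceeds gravity by many orders of magnitude, which is consistent with the article’s point: at sub-pebble sizes, gravity alone cannot make grains clump, but a modest tribocharge can.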

Teiser and his colleagues suspected this was the case after conducting “tower drop” experiments with tiny basalt particles, in which they observed the behavior of these particles during nine seconds of near-weightlessness. But that wasn’t enough time to reach a conclusion, so in 2022 the researchers performed an experiment onboard a suborbital rocket that launched from Kiruna in northern Sweden, to observe how the particles behaved during six minutes of weightlessness.

During the 2022 launch, described in the latest study, the rocket reached an altitude of about 160 miles, and weightlessness kicked in as the rocket’s payload fell back to Earth. At that point, a particle reservoir aboard the vessel opened, releasing the particles. In some cases, the reservoir was shaken to give the particles electrostatic charge, but in other cases, it was not. Only those particles that had been shaken began to assemble into an aggregate. The largest cluster, shown in the image, was a little more than an inch in length. Teiser says his team of researchers sent four versions of their experiment aloft in the rocket, each with different starting conditions.

The researchers believe their findings suggest that the dust particles in protoplanetary disks need static electricity to make them “sticky” enough to cluster into pebbles that can form planets. They were also able to calculate the maximum average speeds the tiny particles can travel when they collide if they are to create clumps: about a foot and a half per second. Collisions at greater speeds tended to erode the surfaces of large clusters.

The results will be used in models that try to explain how massive planets like our own arise from mere dust.

Lead image: University of Duisburg-Essen (UDE)

The post How Pebbles Form Planets appeared first on Nautilus.
