
Memories without brains


Yellow slime mould on textured tree bark, showcasing intricate branching patterns.

Certain slime moulds can make decisions, solve mazes and remember things. What can we learn from the blob?

by Matthew Sims

Read at Aeon


AI Can’t Gaslight Me if I Write by Hand


I drafted and revised this article in longhand, something I haven’t done since the mid-1990s, unless you count the occasional brainstorming I do in my journal. I made this choice because I’ve been worrying about how technology might be encroaching on my writing skills. I wanted to know what it would be like to return to the old ways.

I recognize that the overall loss of skills to technology is nothing new. After all, most of us don’t know how to hitch a horse to a wagon or spin yarn—although I’m interested in learning the latter. The world changes, and that’s okay. But in the digital age, innovation happens at lightning speed, with the results often integrated into our lives overnight—literally—and without our consent. That’s what happened when I woke up a week ago to find Microsoft Copilot, an AI writing assistant, installed on my computer—not as a separate app but as an integrated aspect of Microsoft Word. Its little grey icon hovered next to my cursor, prompting me to let it do my job for me. I spent an hour trying to get rid of it until I finally settled for turning it off.

I feel more and more that technology doesn’t liberate me as much as it diminishes me.

I value many technological innovations, such as the technology that enabled my laparoscopic surgery a few years ago. I think my Vita-mix is pretty darn nifty. And I’ve never doubted that the washing machine set us all free. But in recent years, I feel more and more that technology doesn’t liberate me as much as it diminishes me. Technological innovation has always had this darker side, slowly eating away at the things humans know how to do, or in the case of automation, the things humans get paid to do. But lately, the stakes feel higher. Where I used to feel new technologies robbed me of things I enjoyed doing, like driving a stick-shift car or operating my all-manual thirty-five millimeter camera, I now feel them getting into my head, interfering with the way I think, with my ability to process information. I worry: Am I forgetting how to add? How to spell? How to navigate the maze of streets in the metro area where I’ve lived for over forty years? Am I forgetting how to listen to and comprehend a film without subtitles, or how to read a novel?

That’s a lot of forgetting.


I have tried to resist many of these encroachments, tried to push back to preserve the skills that used to be rote, but I find it increasingly difficult. I feel my intellectual abilities slipping, despite myself, and I know: I am diminished. Now, technology is coming for my most valued skill, the one that has defined me since I first learned my letters: writing.

Of course, I am not alone in my fears about AI and how it might affect my profession as a writer and editor. We all have concerns about property infringement, the loss of jobs, the banality of ideas only formulated by the clunky cobbling together of what’s already been written and fed into the maw of the large language model (LLM). But I have an additional concern. For me, and for many like me, writing is not just a way of communicating, it’s a way of thinking. I rarely begin an essay with the entire thing planned out. Who does? Even if I have an outline, I will not yet have made all the connections that will come to be, let alone have planned out such things as metaphor, imagery, or other figures of speech that emerge in more creative pieces. Something about the state of suspension the brain enters while holding ideas in the air and doing the busywork of typing letters, spelling words, inserting punctuation into grammatically correct sentences, creates space where connections happen and ideas spring. It resembles the way a thing as simple as a person’s name will come to you when you let yourself think about something else. The physical act of writing serves as the distraction that lets the ideas flow. But also, and perhaps more importantly, writing forces the writer to think very slowly, only allowing the brain to move through an idea at the speed at which each individual word can be written. Perhaps that elongation of thinking gives the brain the time it needs to have new realizations. So much discovery happens as a piece of writing evolves that, like many writers, I often set out to write with the purpose of finding answers and prompting evolutions. In this way, writing itself functions as a generative act, a process of discovery and learning that far exceeds the simple recording and communicating of already formed ideas.

The physical act of writing serves as the distraction that lets the ideas flow.

Drafting this essay in longhand led me to think beyond how AI might affect this generative process to consider the other technological changes that have affected my writing over the course of my lifetime. Have those changes also impinged on writing’s process of discovery? I grew up with the unfolding of the digital age. In fact, I’m old enough to have begun writing my school papers with a pencil, reserving the pen for my final drafts. I remember the day when I decided to forgo the graphite and draft in ink. I had to adjust to the permanence of ink on the page when I’d yet to finalize—or sometimes even formulate—my thoughts. The typewriter came into the picture when my sixth-grade teacher required our class to turn in a typed final draft of our research papers. From then on, I typed all of my final drafts for school, progressing from a manual typewriter to an electronic one during high school and experimenting with various inadequate forms of whiteout in the process. I didn’t begin using a computerized word processor until college in the 1980s. And it was bliss! Anyone my age or older knows what a gift the invention of word processing felt like. The ability to add, delete, or rearrange text without having to retype entire pages just to correct one word was pure freedom. The composition on the page became so much more fluid, and the process of creating it so much faster.

But even then, I only used the computer as a glorified typewriter as I continued to compose all of my drafts by hand. It actually took years before it occurred to me to compose on the computer. During grad school, I wrote in longhand, typed up the draft, printed it out, edited it in hard copy, then typed the edits into my digital version, printed again, and repeated. However, when I found myself printing the same 25-page term paper multiple times to edit it, the wasted paper prompted me to consider editing straight on the computer. This process evolved until I finally decided to try composing there as well. Making the leap felt overwhelming because I did not yet know how to think about anything other than typing while typing. The integration of keyboarding into the already merged tasks of formulating ideas and composing grammatically correct sentences gave me the feeling of trying to fly without the proper means.

Of course, I adjusted. And soon I was flying. My fingers raced over the keyboard, enabling me to move through my ideas with a rapidity handwriting could never afford. Composing on the computer happens delightfully fast, but I wonder: if writing is a process of discovery and learning, then what discoveries did I lose by speeding up the process? What connections haven’t I made? Is there a level of richness or complexity I haven’t achieved because I’ve spent less time engaged in that magic writerly state of mind and therefore, less time exposed to the possibility of revelation? I can’t escape the thought that if slowness is key to writing, and writing is a way of thinking, perhaps each tech-driven acceleration of the process has chipped away at my depth of thought.

If writing is a process of discovery and learning, then what discoveries did I lose by speeding up the process?

Ironically, I found the return to writing an essay by hand painfully slow at first. Although I eventually rediscovered my old routines, I initially had moments where I couldn’t wait to get to my computer so I could just get it down already—see the clean and neat print on the screen instead of my messy, scratched-up pages. I also noted that it took me forever to get started. I mulled over my ideas for weeks before putting pen to paper, in part because I felt a pressure to have all my thoughts together first. Before completing my first draft, I saw this delay as an impediment—thinking the prospect of writing by hand had held me up, slowed me down. Now, I see that prolonged period of contemplation as a benefit, providing another means of slowing down that gives the element of time its due, allowing it to generate and enrich ideas. This is why I always try to sleep on a draft before turning it in to a client, and why taking a break from writing can help writers problem-solve and iron out difficulties in a piece.


Writing is hard, so I see why some might be tempted to let a machine do the initial composing. The blank page represents the most difficult phase of writing because this is when the writer must engage with their topic most fully. In the absence of time or energy, AI might sound like a great solution—just as past innovations felt like godsends. But AI brings changes far more dramatic than those of the typewriter. If I let an LLM compose my first draft, only to edit and shape it and supposedly make it my own afterward—as I’ve heard some writers suggest—then I would have skipped over that initial composition process, that period of intense intellectual engagement through which we enrich our ideas. I would sacrifice the element of discovery, learning, and creation in favor of the LLM’s regurgitation. If the future offers a world filled with AI-produced prose, who knows how much we will collectively lose to writing created without all those unique incidents of epiphany and realization.

The idea that technology may have reduced the generation of ideas by speeding up my writing process came to me while working on this essay. I didn’t begin with that thought. I simply began with a question about how technological change had affected my writing. Answers came through my writing process. Realizing this, I decided to put the same question to ChatGPT. I used a few prompts: How has word processing changed how we write and influenced what we write? How has technology diminished my role as the driver of my own writing? The results were unremarkable. ChatGPT produced predictable answers (some of which I had already—predictably—mentioned). There were a few paragraphs about the speed of word processing and accessibility for those limited by poor spelling or grammar. It mentioned slightly off-topic items such as the effects of social media on writing. Interestingly, in response to the prompt about technology diminishing the writer’s role, it told me that too many AI suggestions might give the writer the “illusion” that the machine is directing the narrative more than the writer. Was AI gaslighting me? 

My essay certainly wouldn’t have evolved the same way if I’d begun writing by feeding those few prompts into an LLM. Who knows, maybe I would have ended up writing about social media? What I do know is that absorbing the results of the AI prompts didn’t feel like thinking, it felt like reading. If I’d started with AI-produced paragraphs, the generative process of writing the essay—not just the arrangement of ideas into sentences and paragraphs, but the process of formulating the actual points—would have come instantaneously from the outside. Meanwhile, I spent hours thinking about the topic before and after I started drafting and revising my handwritten essay. I experienced nostalgia remembering the satisfying clunk! clunk! of my old manual typewriter echoing in my childhood bedroom. I thought fondly of the long-ago graduate school days when I covered my living room floor with my term paper pages while trying to organize my thoughts. I pondered other questions, such as how AI might inhibit the development of voice for new writers only just coming of age. The paragraphs I wrote and cut about voice led to more paragraphs that were also cut about a job I had writing advertorials over a decade ago. I recognized the advertorial internet speak in the AI responses to my prompts. I even spent time thinking about the pleasure of improving my handwriting while drafting, relishing the curve of an “S,” the soaking of ink into the page as my pen looped through the script. These reflections don’t appear here—beyond their mention in this paragraph—but they are part of my experience of writing the essay, giving it more depth than any list of AI talking points. This experience demonstrates something basic, something I’ve known from years of journaling but didn’t think much about when I started this composition: writing is a personally enriching process, and it is this enrichment that comes across in the unique quality of what each of us writes. It is the soul of the writing, the thread that can connect writer to reader, which, I believe, is why we write in the first place.

There are all kinds of slow movements: slow food, slow families. Perhaps it’s time for slow writing.

Tech advancement has always asked us to relinquish our skills to machines in exchange for the reward of time. The deal feels worth it in many cases. But as I held my thick and crinkly sheaf of scribbled-on papers, it felt good and satisfying to have that physical product of my labor in my hands. And I wonder if perhaps we’ve gotten confused, thinking we should always use the extra time technology affords to do more things faster rather than using it to do fewer things slower. There are all kinds of slow movements: slow food, slow families. Perhaps it’s time for slow writing. For me, I plan to adjust my writing process by always writing my first draft on paper. This is, in part, an attempt to assert my humanity and wrest my writing from the clutches of technology, but it’s also a return to a process that feels good, takes time, and opens me more fully to the joys of personal discovery and connectedness that occur when words flow onto the page.

The post AI Can’t Gaslight Me if I Write by Hand appeared first on Electric Literature.


Brain Rot

This is the first Google image result for “Italian brain rot.” I don’t know what it means.

Brain rot has entered the zeitgeist. It was the Oxford word of the year in 2024. My students like to watch something called Italian brain rot, which is also connected to something called Tralalero Tralala. It's all one big inside joke about how the internet is ruining our brains.

There's this vague impression in the ether that things are hopeless. Young people have rotted their brains with social media and memes, they're cooked, there's nothing we can do.

That's not my position. I have a slightly more optimistic view. It's an argument to get phones out of schools, but also to be aware of the other ways brain rot can creep in.

Brain Rot

First, a definition. Brain rot is a state characterized by a need to be stimulated, rapid task switching, short-span attention, and low-level engagement. It's watching TikTok, or scrolling Instagram. It’s trying to focus on something else but constantly picking up your phone to check your notifications. It’s a family dinner out where the kids stare at their phones the whole time. It’s the gravity you feel toward technology, and away from anything that requires sustained focus and attention.

Here's my thesis: brain rot is context-dependent. Someone can rot their brain in one context, but focus for extended periods on complex tasks in another context. It's true for me. I have plenty of bad social media habits, and at the end of some days I'm exhausted, flop onto the couch, and scroll or watch random YouTube videos. That's brain rot. But I can code-switch, and I also write tens of thousands of words a year on this blog, exercise regularly, do a half-decent job as a teacher, and work on a few different side projects. I can rot my brain in some contexts, but focus for extended periods in others. I'm sure many of my readers have similar stories. Brain rot isn't an all-or-nothing phenomenon; it's a set of habits in a specific context. It becomes all-consuming when those habits pervade more and more of daily life, leaving less time and space to give our full attention to other tasks.

So the solution to brain rot isn't to ban Instagram or eliminate cell phones until kids turn 18. The solution is to create contexts where young people aren't rotting their brains, to show them what they can do when they sustain focus and work hard, to help them develop a different set of habits. The task is made harder by omnipresent social media, sure, but it's not impossible. Students' brains aren't irreparably damaged. Our job is to create spaces where they can meet their potential, and give students models for what the opposite of brain rot looks like.

Some Examples

My school had an extremely lenient cell phone policy this past school year. Students could use their phones in the hallways between classes and while going to and from the bathroom. If they were caught using their phone in class, the teacher would give that student a warning. If they used their phone again in the same class, the phone would be taken away for the remainder of the period. If a student refused to give up their phone, admin would intervene. But a student could get a warning in every class, every day, without incurring any further consequences.

I bet you can predict what happened. I warned students about their phones every day, often several students per class. But I didn't often take a phone for the period — for the most part, after a student got caught once, they would keep their phone away for the rest of class. I only needed to call for help with a student who refused to give up their phone a handful of times the whole year.

Admin loved this policy. They didn't often get called to intervene, so it was a success.

But students’ brains were rotting every day. They were on their phones moments before class began and immediately after it ended. They were messaging and scrolling social media when they went to the bathroom during class. They became crafty at sneaking glances at their phones when I was helping a student across the room. They felt their phones buzz with a text from a friend in the hallway, and they suddenly “needed” to pee. Brain rot was on their minds, all the time.

Brain rot made learning harder, but not impossible. It was an uphill battle. I came up short lots of days. But when I put the pieces together students could still focus, could still impress me with the quality of the thinking they could do.

We're finally adopting a strict, bell-to-bell, lock-the-phones-away policy when we return in August. Students will not have any access to their phones during the school day. I think it will make a huge difference. Brain rot will be easier to keep at bay. I'll spend less time building habits of focus and perseverance, and more time using those habits to help students learn.

But here's the thing. Phones aren't the only source of brain rot in schools. Chromebooks can be just as bad. And Chromebooks are everywhere these days. In many classrooms, students are rotting their brains playing Slope every time the teacher looks away, or looking at shoes on Amazon every chance they get. Students flip between tabs when the teacher walks by and pretend to work for a moment, then return to their distractions. That's brain rot too.

There's even a style of teaching that teaches brain rot. Some teachers have embraced the AI revolution, because "students will need to use it in jobs someday" or something. Their assignments are a bunch of questions that students drop into ChatGPT, paste back the answers, and move on. It's brain rot disguised as education, requiring no thinking, no sustained attention, no effort.

Optimism

I don't think we're teaching a lost generation. Reflecting on my students this last year, some fell into everyday brain rot. But some resisted. I'm excited we're banning phones. I'm working on new routines and structures to help students with sustained focus. I'm continuing to scale back technology use in my class — not to zero, but well below where I was a few years ago, with lots of guardrails in place. I’m optimistic that students will adapt, develop new habits, and see the value of setting aside their phones for the school day. I hope, at least in my classroom, to avoid letting other forms of brain rot creep in.

Today’s students will enter a world where the power of technology is harnessed to capture and manipulate their attention. Everyone has a hot take on what this means. We should teach students how to use that technology well. We should allow technology because that's what's allowed in the "real world." We should use technology to teach 21st century skills. I’m skeptical. I think the most important 21st century skill, and the hardest to teach, is to show students what sustained focus and attention look like, to show students what their minds are capable of. And I think the best way we can do that is to prioritize brain-rot-free zones in schools, as much as we can. I want to model the habits of thinking that will be most useful for students when they graduate. And at the same time, those habits will make it easier for us to teach students the content we want them to learn in school.

I don’t think the battle against brain rot is hopeless. It does require a lot of structure and routine, and careful thought about what technology is used for in schools. That will be a big goal of mine this coming school year.


TikTok Never Ruined Anyone's Life


I often see people complaining that TikTok or Twitter or some other outside force is making them worse, and I reject this framing. You'll see or hear comments like this constantly, either as self-deprecating humor or as older generations complaining about how “kids these days don't want to work,” as though anyone in all of history has ever wanted to work every waking moment. In actuality, the time we spend scrolling has little to no impact on the time we would be working. Humanity has a long and storied history of doing nothing, and of finding wondrous new ways to do nothing.

Eris’ tweet is based on two incorrect assumptions: that her smartphone usage is making her worse, and that she would be filling her time with something more productive if smartphones didn’t exist. The reality is that regardless of how people spend their time, no one does anything. If it was twenty years ago, most people would be wasting time watching TV; if it was one hundred years ago, it would be radio. Of course, plenty of people still consume their days listening to music and watching shows, and the way people fill dead time will always evolve (even as I write this post I’m playing sudoku on my phone) but the reality is that it’s the same type of activity, or lack thereof.


I also disagree with the implication that reading or TV is inherently better than scrolling short-form content; there are countless low-quality novels out there that you can read without getting anything from the experience. While long-form media gives the illusion of not frying one's attention span, I find this argument weak. The overall time spent scrolling short-form content averages out to the same amount of sustained attention as watching a TV episode, because TV is also broken up into bursts of attention by commercial breaks (and try counting how many cuts each scene has in your typical cop or medical drama on top of that). Time spent scrolling isn't stealing time from some higher calling; it's filling the void of doing nothing.

Manvir Singh had this eye-opening thread about how anthropologists studied various societies and found that the most common activity was “doing nothing.” This was true across continents and cultures: humans love to do nothing at all. In the screenshot below, you can see one society where the most common activity during waking hours was to do nothing, and they spent nearly a third of their time on this critical task.

Screenshot from Singh's thread: time-use data for one society, with “doing nothing” as the most common waking activity.

The only real exception to this is seen in agricultural societies. Farming is hard, and though modern farming is still difficult, it takes far less time day to day than it did in the past, so this exception matters less and less. People love to theorize about the singularity and life in a post-scarcity world, but we've already made it. The minimum essentials to survive are trivial to acquire these days, and the amount of time we have for leisure is immense. Most people who complain about "being worse" spend eight hours or less working or studying five days a week, and have another eight hours to fill as they wish. Aside from those with children, people may point to a commute or other obligations, but those are choices and trade-offs they have made.

You do nothing more often than you realize, and scrolling social media feels like an easy target to blame when the act of doing nothing is the true culprit. That's because filler activity like TV or scrolling is more memorable than doing nothing. Remembering how much time you spent doing nothing is like remembering how many breaths you took last week. Nothing blends together, stretches to infinity, and compresses to zero. This is how it is, always has been, and always will be. You are in a constant battle against doing nothing, because doing nothing is the default state. It is simple to do something, but it isn't easy.


It should be noted that there is value in doing nothing, and it is good to reset when you can. One trick is to switch gears and have your down time filled with a valuable but dissimilar activity. If you work with your mind all day solving complex issues, then find a physical hobby or activity to keep yourself in shape and recharge your mental capacity.

Some people find a way to balance everything, but I tend to go for extremes. The classic label is "work hard, play hard," but I find that phrase has morphed into a caricature of what it was intended to be. For four months last year, I took dozens of full-length practice LSATs and did tons of drills and exercises to be ready for the exam. It was a rigorous period of study, but it was followed by an extended break in which I didn't take a single practice exam before sitting the actual test and getting a near-perfect score. I didn't complete my CS degree, but I hold several highly sought-after cybersecurity certifications that required passing exams lasting anywhere from 48 hours to two weeks, during which I put in concentrated bursts of studying. I would spend all day in the office talking to my coworkers and helping with their projects, then knock out all of my own technical work in a few hours of intense effort. During all of these times, I would also play video games for hours a day, or read, or go on long drives to clear my head. Throughout my life I have been able to do an incredible number of things, yet I still manage to find time to do nothing.

The takeaway is not that you are powerless against the indomitable nothing, but rather that TikTok and other idle activities are simply filling a void that you could fill with something else. Sometimes you will, and sometimes you won’t, and the fact that you occasionally do nothing isn’t terminal. You have so much more time than you think, and though your natural inclination is to do nothing, or to fill that time with scrolling, your potential is not being ruined by some external influence. When you accept that TikTok is ruining you or those around you, you absolve yourself of accountability. If you have the capability to do more, you can simply choose to do so, and you have the time and ability to make it happen.


Do AI Tutors Empower or Enslave Learners?


Little videos are cooking our brains

Vox
Before the next era of TikTok and its clones overwhelms you, it helps to know how we got here and how to run the other direction.

As an elder millennial, I’ve tried to avoid TikTok because of its documented brainrot potential and despite the fact that it means missing out on an endless supply of fun and strangely specific memes. But somehow, little short-form vertical videos keep finding their way to me.  

Whether they’re on Instagram, Netflix, or Pinterest, swipeable smartphone-shaped videos have taken over the internet. They’re also showing up in places you wouldn’t expect, like Spotify, LinkedIn, and even the New York Times. And whether you enjoy these bite-size bits of content or not, the situation is about to get much weirder.

The dark future of vertical video 

In the near future, the internet may not only be wall-to-wall little videos. Those little videos may also be filled with slop, the term for AI-generated garbage content that is perhaps even more insidious in robbing us of our attention. 

Last week, Google started rolling out its Veo 3 AI-powered video generation model, which can create eight-second clips, complete with realistic soundtracks, based on text prompts. After creating a dozen videos of her own, including some for kids, Allison Johnson at the Verge called this tool “a slop monger’s dream” that’s “more than a little creepy and way more sophisticated” than she’d imagined. String together a few of these clips, and you’ve got a piece of short-form content, made in mere minutes, that’s perfect for TikTok or any of its clones. YouTube announced last month that the tool would be built right into its own TikTok clone, YouTube Shorts. These videos are already taking over short-form video platforms. Some of them are racist.

AI slop may soon dominate the ads you’re served on these platforms, too. These ads, while currently laughable, will get much better, according to Mark Zuckerberg, who says Meta will completely automate the creation of ads and even make it possible for ads to exist in infinite versions and evolve based on when and where a person sees them. And as algorithmic feeds of short-form videos spread to more places online, it will be increasingly hard to avoid them.

We’ve known for a while that the rise of AI would flood the internet with slop. Slop is already remarkably popular on YouTube, where nearly half of the 10 most popular channels contain AI-generated content. There are even virtual personalities powered by AI earning millions on YouTube. These platforms know that making content easier to produce will lead to more content, which leads to more engagement, which leads to more ads, which ultimately leads to a less enriching, more addictive internet. That’s why YouTube is pushing Veo 3 to its creators, and why, as of last month, TikTok and OpenAI have pushed out similar tools.

This wouldn’t be such a concern if you had to seek out awful AI-generated videos. Instead, the slop finds you unbidden and drowns you in anxiety.

These platforms know that making content easier to produce will lead to more content, which leads to more engagement, which leads to more ads, which ultimately leads to a less enriching, more addictive internet.

People already spend a staggering amount of time on TikTok: 108 minutes a day, which is more than double the time spent on Instagram. There are many, many studies showing that more TikTok use increases anxiety and stress, especially in young people. (One of them coined the term “TikTok brain,” and not in a good way.) We’ve also known for a while that watching TikTok has the side effect of shredding your attention span. Researchers have found that TikTok disrupts your ability to complete a task when interrupted. Our attention spans while looking at a screen have shrunk, on average, from two and a half minutes in 2004 to just 47 seconds, which is incidentally quite close to the average length of a TikTok video.

“You can think of it as attentional capacity, and we can use that capacity to get work done, to do important things,” said Gloria Mark, author of Attention Span and professor of informatics at the University of California, Irvine, whose research landed on that 47-second number. “But if we’re switching our attention, that’s draining our tank of resources, and then we just don’t have the capacity anymore to pay attention.”

Before the next era of TikTok and its clones overwhelms you, it helps to know how we got here and how to run the other direction.

Can you opt out of the endless-loop internet? 

There’s a popular narrative that TikTok owes its success to Vine, a short-form video service founded in 2012 only to be bought by Twitter a few months later. It’s a nice thought. Vine, like Twitter itself, was accidentally successful. While many young people first encountered a feed for weird and hilarious short-form videos on Vine, it was the TikTok algorithm that led to that platform’s success, not to mention the long line of companies trying to draft off that success.

That algorithm finds its roots in a viral news app called Toutiao, which ByteDance released in China the same year that Vine launched in the US. (Yes, this is the same ByteDance that now owns TikTok.) The platform’s big innovation was a complex recommendation engine that used machine learning, a type of AI, to create a highly personalized feed for its users based on their interests and behavior — down to their swipes, location, and even their phone’s battery life — rather than what people you know are doing online. The algorithm proved extremely effective at getting people to spend more time on the app. ByteDance made this algorithm the foundation of TikTok’s video feed, when it launched in 2017 (a version of the app, Douyin, launched in China two years earlier). 

If you find yourself stuck

Try these three tips from professor Gloria Mark:

  1. Take breaks. If, rather than enjoying yourself, you find yourself foraging for interesting content, stand up and go outside and look at a tree. There are lots of apps that prompt you to put down the device. 
  2. Be intentional rather than automatic when you use any app. If you tap TikTok because you don’t know what else to do, that’s a sign that you’re tired and low on cognitive resources. 
  3. Think ahead to your future self. Visualize what you want at the end of your day and how you’ll get there. It probably doesn’t involve spending 108 minutes looking at TikTok.

Early on, a one-minute length limit meant that TikTok users were fed videos constantly, often serendipitously, on their For You page. That limit has since been extended to 60 minutes, but users have also learned they can swipe to see a new, unexpected video as soon as they’re bored. This can lead users to keep searching for good videos, which function as rewards, triggering dopamine release and effectively getting them addicted to the feedback loop. As Mark put it, “The hardest behavior to extinguish, to stop, is randomly reinforced behavior, and the reason is because of the randomness of the rewards coming.”

The short-form nature of these videos, rapid context-switching, and resultant digital overload have multiple negative effects. A 2023 study from researchers in Germany found that TikTok use impairs our prospective memory, the ability to remember what you intended to do once you’ve been distracted. The subjects of the study were given a task, then interrupted and allowed to scroll Twitter, watch YouTube, thumb through TikTok, or do nothing. The people who chose TikTok were nearly 40 percent more likely to forget what they were doing.

Researchers studying this phenomenon argue that this amounts to a dark pattern, a design that manipulates you into making certain choices. You’ve encountered dark patterns in websites that trick you into signing up for a newsletter, or in ads you can’t click out of. Torrents of short-form videos like the ones you see on TikTok are especially pernicious because the feeds are designed to keep you fully engaged and foraging for good content.

“They keep us in an endless loop. We kind of detach from the things that we were engaged with before,” Francesco Chiossi, a researcher at LMU Munich and the study’s lead author, told me. “They are engineered to maximize engagement at the expense of our attention and stability of what we call goal-directed behavior.”

It would be comforting for me to report that you can easily avoid getting stuck in these loops. It’s actually getting harder. You can avoid TikTok, but you might love Netflix, which is rolling out its own TikTok-like video feed on its mobile app. I use Spotify daily, sometimes against my better judgment, but the discovery feature keeps pushing me to watch little video clips rather than simply listen to music. On the LinkedIn video tab, its TikTok clone, a work influencer recently warned me against “peanut-buttering every channel instead of going deep on a few channels.” I spent at least 47 seconds trying to figure out what that meant. 

There’s a pretty straightforward lesson here, though. If you like to watch these little videos, by all means: Enjoy. But know that, like most free things big tech companies make today, these products are designed to keep you engaged, to steal as much of your attention as possible as they collect data about you and serve ads to you based on what that data reveals. TikTok and its many little siblings are free because you’re the product.

Consider taking some of the minutes — or hours — back from TikTok and its many little video clones. You might discover something wonderful in the real world, if you pay attention.

