
Living in the Upside Down


As you progress in your UI design career, you learn that there are quite a few unsolvable challenges:

  • do you write My Items or Your Items in UI?
  • do you put hand cursors over buttons?
  • for a boolean item (especially in the menu), do you talk about the present state or the future state?
  • do you try to solve for change blindness or change aversion?

I was reminded of one of those today: how do you sort the items in the bottom-aligned menu?

One school of thought is to keep it in the same order as you would a regular top-aligned menu:

On the positive side, this allows users to build a consistent understanding of how menus are structured: the most important thing is at the top, Quit is always at the bottom. But the downsides are obvious, too – now the most important item is furthest away from where your cursor started, and you have to awkwardly cross all the other items on the way to it.

iOS’s springboard went, literally, the other way:

Here, the bottom aligned menu reverses its item order. This tripped me up today. The dock in macOS was actually more defensible upside down because there, every menu was always going the same way. Here, the inconsistency starts rearing its ugly head.

Of course, the best way to not face an impossible choice is to avoid it altogether. Not sure how one could accomplish it here, though. Placing the menus consistently below would make some of them scrollable, or basically invisible for bottommost icons. You could also slide the entire screen up to make room for the menu, but that would probably feel disorienting.

So, I can’t say this is a wrong solution. The inconsistency might only bother people who use this often, and maybe no one uses this often? Or, perhaps, it was really important to allow resizing widgets and to make that item as easy to tap as possible? But still, I think I would have done it the other way – align as needed, but items always in the same order.
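Here’s roughly what I mean, as a minimal sketch (in Python, with entirely hypothetical names and items rather than any real toolkit’s API): the item list is declared once in priority order, and only the menu’s placement reacts to the available space; the order itself never flips.

```python
# A minimal sketch of "align as needed, but items always in the same order."
# Function name, items, and coordinates are hypothetical, not from any real UI framework.

MENU_ITEMS = ["Edit Widget", "Resize Widget", "Remove Widget"]  # priority order, top item first

def layout_menu(items, anchor_y, menu_height, screen_height):
    """Flip only the placement of the menu, never the order of its items."""
    opens_downward = anchor_y + menu_height <= screen_height
    origin_y = anchor_y if opens_downward else anchor_y - menu_height
    # The list comes back unchanged either way, so the most important item
    # stays first whether the menu opens up or down.
    return origin_y, list(items)

print(layout_menu(MENU_ITEMS, anchor_y=700, menu_height=180, screen_height=800))  # opens upward
print(layout_menu(MENU_ITEMS, anchor_y=100, menu_height=180, screen_height=800))  # opens downward
```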


The Origins of Agar

Ella Watkins-Dulaney for Asimov Press.

This essay will appear in our forthcoming book, “Making the Modern Laboratory,” to be published later this year.

In 1942, at the height of British industrial war mobilization, an unlikely cohort scavenged the nation’s coastline for a precious substance. Among them were researchers, lighthouse keepers, members of the Royal Air Force and the Junior Red Cross, plant collectors from the County Herb Committee, Scouts and Sea Scouts, schoolteachers and students. They were looking for fronds and tufts of seaweed containing agar, a complex polysaccharide that forms the rigid cell walls of certain red algae.

The British weren’t alone in their hunt. Chileans, New Zealanders, and South Africans, among others, were also scrambling to source this strategic substance. A few months after the Pearl Harbor attack, the U.S. War Production Board restricted American civilian use of agar in jellies, desserts, and laxatives so that the military could source a larger supply; it considered agar a “critical war material” alongside copper, nickel, and rubber.1 Only Nazi Germany could rest easy, relying on stocks from its ally Japan, where agar seaweed grew in abundance, shipped through the Indian Ocean by submarine.2

Without agar, countries could not produce vaccines or the “miracle drug” penicillin, especially critical in wartime. In fact, they risked a “breakdown of [the] public health service” that would have had “far-reaching and serious results,” according to Lieutenant-General Ernest Bradfield. Extracted from marine algae and solidified into a jelly-like substrate, agar provides the surface on which scientists grow colonies of microbes for vaccine production and antibiotic testing. “The most important service that agar renders to mankind, in war or in peace, is as a bacteriological culture medium,” wrote oceanographer C.K. Tseng in a 1944 essay titled “A Seaweed Goes to War.”3

Agar was first introduced into the laboratory in 1881. Since then, microbiologists have depended on agar to create strong jellies. When microorganisms are streaked or plated onto this jellied surface and incubated, individual cells multiply into distinct colonies that scientists can easily observe, select, and propagate for further experiments. Many of the most important findings in biological research of the last 150 years or so — including the discovery of the CRISPR/Cas9 gene-editing tool — have been enabled by agar.4 Agarose, a derivative of agar, is also essential in molecular biology techniques like gel electrophoresis, where its porous gel matrix separates DNA fragments by size, enabling researchers to analyze and isolate specific genetic sequences.

Agar plates with E. coli growth on various concoctions, including MacConkey, Mueller-Hinton, and Brain Heart Infusion. Credit: HansN.
An agarose gel. Credit: Kadina Almhjell

Agar is so critical that since WWII, scientists have tried to find alternatives in the event of a supply chain breakdown, especially as recent shortages have caused similar alarm. But while other colloid jellies have emerged, agar remains integral to laboratory protocols because no alternatives can yet compete on performance, cost, and ease of use.


From Sea to Table

Microbiologists have been growing microbes on agar plates for nearly 150 years, but agar’s discovery dates back to a happy accident in a mid-17th-century kitchen. Legend has it that on a cold winter day, a Japanese innkeeper cooked tokoroten soup, a Chinese agar seaweed recipe known in Japan for centuries. After the meal, the innkeeper discarded the leftovers outside and noticed the next morning that the sun had turned the defrosting jelly into a porous mass. Intrigued, the innkeeper was said to have boiled the substance again, reconstituting the jelly. Since this discovery, agar has become a staple in many Japanese desserts, from yokan to anmitsu.

Industrial production of kanten (the Japanese name for agar, which translates as “cold weather” or “frozen sky”) began in Japan in the mid-19th century by natural freeze drying, a technique that simultaneously dehydrates and purifies the agar. Seaweed is first washed and boiled to extract the agar, after which the solution is filtered and placed in boxes or trays at room temperature to congeal. The jelly is then cut into slabs called namaten, which can be further processed into noodle-like strips by pushing the slabs through a press. These noodles are finally spread out in layers onto reed mats and exposed to the sun and freezing temperatures for several weeks to yield purified agar. Although this traditional way of producing kanten is disappearing, even today’s industrial-scale manufacturing of agar relies on repeated cycles of boiling, freezing, and thawing.

Because of its capacity to be freeze-dried and reconstituted, agar is considered a “physical jelly” (that is, a jelly that sets and melts with temperature changes without needing any additives). This property makes dry agar easy to ship and preserve for long periods of time.5

Anmitsu. Credit: Ocdp

Over the years, agar found its way around the world into many cuisines, including those of China (where it’s called “unicorn vegetable” or “frozen powder”), France (sometimes called gélose), India (called “China grass”), Indonesia (called agar-agar, which translates simply as “jelly”), Mexico (called dulce de agar, or agar sweets), and the Philippines (known as gulaman).

Agar is prized among chefs for its ability to form firm, heat-stable gels at remarkably low concentrations — typically just 0.5-2 percent by weight. Culinary agar is available as powder, flakes, strips, or blocks, and makes up about 90 percent of the global use of agar. Unlike gelatine, which melts at body temperature, agar gels remain solid up to about 185°F (85°C), making it ideal for setting dishes served at room temperature or warmer. It is also flavorless and odorless, vegan and halal, and can create both delicate jellies and firm aspics. Yet, while increasingly employed in kitchens worldwide, agar had not yet entered the laboratory.

Before agar, microbiologists had experimented with other foodstuffs as microbial media. They turned to substances rich in the starches, proteins, sugars, fats, and minerals that organisms need for growth, testing with broths, bread, potatoes, polenta, egg whites, coagulated blood serums, and gelatine. However, none worked particularly well: all were easily broken down by heat and microbial enzymes, and their surface, once colonized, became mushy and unsuitable for isolating microbes.

A bundle of kanten, from the Encyclopedia of Food (1923).

This was especially vexing to physician and bacteriologist Robert Koch, who, in seeking to culture his bacteria, “bent all his power to attain the desired result by a simple and consistently successful method,” wrote bacteriologist and historian William Bulloch in his 1938 book, The History of Bacteriology. “He attempted to obtain a good medium which was at once sterile, transparent, and solid” and got some results with gelatine.6 But gelatine is easily digested by many microbes and melts at precisely the temperatures at which the disease-causing microbes Koch wanted to study grow best.

The woman who ultimately discovered the superior features of agar as a growth medium and brought it to Koch’s attention was Fanny Angelina Hesse. Her foundational contribution to the nascent field of microbiology is often omitted from textbooks. In other cases, she is unflatteringly referred to as a “German housewife” or as “Frau Hesse,” or dismissed as an unnamed technician.

From Plate to Petri Dish

Fanny Angelina (née Eilshemius, from a Dutch father) grew up in Kearny, New Jersey. During her childhood, her family learned from a Dutch friend or neighbor about agar-agar, a common ingredient in jelly desserts in Java (Indonesia), then a Dutch colony. Her mother and, later, Fanny Angelina herself, began to cook with it.

In 1874, Fanny Angelina married physician and bacteriologist Walther Hesse, an investigator of air quality and, specifically, air-borne microbes. In the winter of 1880-81, Hesse became a research student with Koch in Berlin and experienced firsthand the difficulty of growing microbes on gelatine and the other growth media used at the time.

While raising three children and taking care of the household, Fanny Angelina Hesse supported, documented, and archived her husband’s work, creating stunning scientific illustrations of bacterial and fungal colonies. During the hot summer of 1881, she watched as Hesse struggled with gelatine-based growth media. Fanny Angelina, recalling the stability of her agar-based desserts, suggested that they try that instead. Hesse wrote a letter to Koch informing him about the switch, and Koch mentioned agar for the first time in his 1882 groundbreaking paper on the discovery of the tuberculosis bacillus.

Image from a graphic novel about agar and Fanny Angelina Hesse, called "The Dessert that Changed the World." Story by Corrado Nai and artwork by “SHog.” Support on Patreon.

The change to agar was a marked improvement. The jelly is so effective that it is still an invariable ingredient in what is known today as “Koch’s plating technique” or the “culture plating method.” As Koch himself noted in 1909: “These new methods proved so helpful…that one could regard them as the keys for the further investigation of microorganisms…Discoveries fell into our laps like ripe fruits.”

Once Koch established the methods to grow pure cultures of bacteria like tuberculosis and anthrax, he demonstrated for the first time that microbes can cause diseases, a feat that earned him the 1905 Nobel Prize in Physiology or Medicine.

However, Koch never credited the Hesses for their discovery of bacteriological agar, perhaps because, at the time, he failed to recognize its importance. Even after he received the insight about agar from the Hesses, Koch stuck with gelatine for years. In 1883-84, during his first medical expedition to Egypt and India to investigate cholera, he tried and failed to grow the cholera bacterium on gelatine media in the hot climate of Cairo (despite using a half-open fridge for incubation), only succeeding in the colder winter of Calcutta.

Fanny Angelina Hesse, 1883.

It is difficult to know exactly when the shift from gelatine to agar occurred. As often happens for scientific breakthroughs, agar was likely adopted incrementally alongside the use of other growth media. In 1913, for example, the first diagnosis of Serratia marcescens as a human pathogen was made by growing it on agar as well as on potatoes.

Nevertheless, by 1905, a report on the seaweed industries in Japan noted the “very important use [of pure-grade agar] as a culture medium in bacteriological work.” It’s safe to say that, around the turn of the 20th century, agar had moved from an inconspicuous kitchen jelly to an indispensable scientific substance.


A Strategic Substance

Several properties of agar render it a superior jelly. Agar isn’t broken down by microbial enzymes, apart from those of a few species (including bacteria living in marine and freshwater habitats), and it dissolves well in boiling water, making it easy to sterilize. The jelly doesn’t react with the ingredients of a broth, whose composition can be adjusted to meet the nutritional requirements of different microbes, and it sets to a firm gel without the need for refrigeration.

Agar’s low viscosity also makes it easy to pour into Petri dishes, and its transparency permits observation of microbes growing on its surface.7 Also aiding in this is its low syneresis (extrusion of water from the gel), guaranteeing less surface “sweating”: Once a plate is inoculated, bacterial colonies stay in place and do not mix.

The jelly is chemically inert since no additives are needed for gelation. This allows chemicals dissolved in the jelly’s aqueous phase to diffuse well, a prerequisite for testing if certain species or strains are resistant to antibiotics or antifungals. In these simple assays, zones of growth inhibition of bacteria or fungi (or their absence) point to the effectiveness of (or resistance towards) antibiotics or antifungals.

But agar’s superior qualities come with complex chemistry. “To speak of agar as a single substance of certain (if known) chemical structure is probably a mistake,” wrote phycologist Harold Humm in a 1947 article. According to the Food and Agriculture Organization of the United Nations, agar is merely recognized as “a hydrophilic colloid extracted from certain seaweeds of the Rhodophyceae class.” In terms of its actual composition, agar is mostly a combination of two polysaccharides, agaropectin and agarose, which themselves are complex and poorly-characterized polysaccharides made mostly (but not exclusively) from the simple sugar galactose.8

Agar comes from multiple sources, as many red seaweeds are “agarophytes” (that is, seaweeds containing agar in their cell walls). Species of Gelidium are the most important source of bacteriological (lab-grade) agar. Other main agarophytes, largely used for culinary agar, include red seaweeds from the genera Gracilaria, Pterocladia, Ahnfeltia, and others. Species from the genera Eucheuma, Gigartina, and Chondrus have been used as agarophytes in research during agar shortages.9

Sketches of Japanese algae, by Kintaro Okamura (1913).

One striking characteristic of Gelidium is that it must be wild-harvested rather than farmed. Unlike Gracilaria for culinary agar production, Gelidium grows slowly and thrives only in cold, turbulent waters over rocky seabeds, conditions nearly impossible to replicate in aquaculture. This dependence on wild harvesting explains the need for seaweed collectors during WWII, and continues to make Gelidium a strategically critical resource.

While Gelidium seaweeds can be collected by gathering fragments washed ashore, mass production of agar requires steady, large quantities.10 Harvesters in New Zealand during WWII had to “walk beside a boat, waist to armpit deep in water and feel for the weed with their feet.” Handling large volumes of wet seaweed (which yields less than five percent agar) was challenging. Then as now, when Gelidium is harvested by scuba divers from rocky seabeds, collectors have to understand the life cycle of the algae, find the most likely locations for its growth, and prevent overharvesting to safeguard future yields.

Given its auspicious position on the Atlantic coastline, Morocco has been the main source of Gelidium for at least two decades, and demand for bacteriological agar continues to grow. Yearly global consumption increased from 250 tons to 700 tons between 1993 and 2018, and is currently estimated at around 1,200-1,800 tons per year, according to Pelayo Cobos, commercial director of Europe’s largest producer of agar, Roko Agar.

The Future of Agar

Amid such rising demand, it’s understandable that researchers worried when Morocco reduced exports of agarophytes in 2015. This shortage — due to a combination of overharvesting, climate warming, and an economic shift to internal manufacturing in the North African country — not only caused alarm but a three-fold price increase of wholesale bacteriological agar, which reached $35-45 per kilogram. (At the time of writing this in late 2025, factory agar prices are sitting at about $30 per kilogram, according to Cobos.)

A few years later, in 2024, researchers in multiple labs were horrified to notice toxic batches of agar for reasons as yet unclear. After they observed a worrying lack of microbial growth (impeding their ability to carry out basic experiments), they switched to different agar suppliers, and their results improved.

This was not the first time that microbiologists experienced problems with agar. A phenomenon called “The Great Plate Count Anomaly” baffled researchers in the early 20th century when they observed that the number of cells seen under a microscope didn’t match the actual number of colonies growing on an agar plate. Investigating this discrepancy, researchers found agar itself to be the culprit: when nutrient broths are heated with agar during boiling, harmful byproducts (hydroperoxide) can form due to the reaction of agar with phosphate minerals contained in the media. Researchers can avoid this by autoclaving agar separately from the nutrient broth, or by reducing the amount of agar used.

This anomaly is indicative of the larger challenge of culturing various microbial species, referred to as microbial “unculturability.” This cannot be explained by the use of agar alone or by the substitution of an alternative gelling agent, but rather by the difficulties in consistently recreating on an agar plate the multi-variable environment in which microbes grow naturally. Given such challenges, the risk of shortages, and the vulnerabilities of the agar supply chain, why is it so difficult to find suitable alternatives?

It is not for lack of trying. In some cases, microbiologists have ditched the Petri dish altogether, using microfluidics for manipulating and growing cells. However, these approaches aren’t likely to be adopted at scale as they require less common, less practical, and more expensive devices. So, what about other growth media?

A microfluidics chip enables researchers to manipulate and study individual cells, without the use of agar at all. Credit: Brouzes E. et al. PNAS (2009).

By WWII, scientists had already begun looking at alternative gelling substances for routine use in bacteriology, but concluded that agar was still better as it is both firmer and easier to handle. Today, some specialized microbiology applications use the colloid carrageenan (extracted from red seaweed Chondrus crispus, or “Irish Moss”), a more transparent and less auto-fluorescent alternative to agar (agar emits its own background fluorescence when excited by light). However, for routine bacteriological use, carrageenan is more difficult to dissolve, requires higher concentrations, can degrade at high temperatures, and forms weaker gels, which may result in puncturing its surface during the plating of cells.

In some cases, alternative gelling agents might provide faster results. Researchers observed that bacterial cellulose and another bacterial polysaccharide, Eladium, allow a 50 percent increased growth rate for various bacteria and yeasts (as compared to their growth on agar), including higher biomass yields or faster detectable biofilm formation. However, both substances are still not as cheap and readily available as agar.

Guar gum, a plant colloid, costs less than agar and is better suited for growing thermophilic bacteria, but is also more difficult to handle, being more viscous and less transparent. The bacterial polysaccharide xanthan is cheaper as well but forms weaker jellies that, as with carrageenan, might result in puncturing its surface. Other colloids, like alginate (from brown seaweed) and gellan gum (from a bacterium), don’t set solely based on temperature and require additives for gelation. These additives might interfere with microbial growth and make the preparation of those jellies less handy than agar plates.

Thus, despite much effort, no gelling agent has yet been discovered that possesses all the properties and benefits of agar. Agar continues to be the best all-arounder: versatile, cheap, and established. And, if Gelidium agar should ever run out, and another colloid is not at hand, microbiologists could revert to culinary agar, which, although not as pure and transparent, offers a low-cost alternative to lab-grade agar.

It’s also worth noting that even if alternatives superior to agar were found, scientists are reluctant to abandon established protocols (even when microbiologists do use other jellies, they often still add agar to the mix, for example, to increase the gel strength of the solid media). As agar has been the standard gelling agent in microbiology for around 150 years, an enormous infrastructure of standardized methods, reference values, and quality control procedures has emerged around its specific properties. Switching to a different medium (even a superior one) means results may not be directly comparable to decades of published literature or to other laboratories’ findings.

So it is that agar continues to be the jelly of choice in laboratories around the world. As Humm wrote in 1947: “Today, the most important product obtained from seaweeds is agar, a widely-used commodity but one that is not well known to the general public.” Almost 80 years later, it might be better known, but its importance hasn’t dwindled.



Corrado Nai has a Ph.D. in microbiology and is a science writer with bylines in New Scientist, Smithsonian Magazine, Small Things Considered, Asimov Press, and many more. He is currently writing a graphic novel about Fanny Angelina Hesse and the introduction of agar in the lab called The Dessert that Changed the World, which can be followed and supported on Patreon.

Thanks to Steven Forsythe for sharing a report on the use of agar seaweed in Britain during WWII, Barbara Buchberger at the Robert Koch Institute for pointing out Koch’s use of gelatine for the identification of cholera, and the surviving relative of Fanny Angelina Hesse for sharing a trove of unpublished material.

Cite: Nai, C. “The Origins of Agar.” Asimov Press (2026). DOI: 10.62211/12pq-97ht

1. A full list of these materials can be found at (psfa0134, pg. 9).

2. Japan halted exports to other countries for fear that agar supported their development of biowarfare weapons. A few years before, Nazi Germany allegedly tested the efficacy of biowarfare attacks with another curious microbe, Serratia marcescens, dubbed “the miracle bacterium.” According to a much-talked-about report by investigative journalist Henry Wickham Steed titled “Aerial Warfare: Secret German Plans,” members of a secret Luft-Gas-Angriff (Air Gas Attack) Department spread S. marcescens in the subterranean train networks of Paris and London and measured its reach armed with Petri dishes and agar plates.

3. It wasn’t the first time that nations at war turned to seaweed. During the First World War, the U.S. relied on the giant kelp seaweed (Macrocystis) to boost production of potash (a fertilizer produced in Germany), gunpowder, and acetone.

4. In 2007, Barrangou et al. demonstrated for the first time the function of CRISPR/Cas9 as a defensive mechanism of bacteria against bacteriophage attacks by a technique called “plaquing,” which builds upon the technique of “plating” bacteria on agar. Plaques of viruses on agar are areas without growth of bacteria due to viral attacks.

5. The same properties also contributed to Nazi Germany’s strategy against agar’s scarcity, which — besides being supplied from Japan by submarine — relied on large pre-war stocks and on recovery methods to reuse bacteriological agar by autoclaving (boiling at around 121°C, 250°F, in a pressurized container for 30 to 60 minutes), thus liquefying and sterilizing the jelly, before purifying it again.

6. Koch borrowed the idea of using gelatine from mycologist Oscar Brefeld, who had used it to grow fungi. Interestingly, Brefeld also employed carrageenan, another seaweed-derived jelly. Because fungi generally favor growing at ambient temperatures, Brefeld might have been less plagued by the melting of growth media than Koch.

7. Julius Petri once wrote: “These shallow dishes are particularly recommended for agar plates…Counting the grown colonies is also easy.” (Translated by Corrado Nai from the original, 1887 German manuscript.)

8. Agarose is used in electrophoresis, chromatography, immunodiffusion assays, cell and tissue culturing, and other applications. It is the electrically neutral, non-sulphated, gelling component of agar. While its market is smaller, it is fundamental for specialized biochemical and analytical protocols.

9. Gigartina stella and Chondrus crispus, for example, were used as the main agarophytes in Britain during WWII, alongside a different colloid, carrageenan (see main text).

10. Washing and drying the bulk raw material to prevent spoilage also isn’t easy. During WWII, volunteers in Britain occasionally dammed natural streams to wash the seaweeds and used hot air from a bakery to dry them. Praising the concerted efforts of volunteers, the UK Ministry of Supply concluded that “all belligerent countries should have a local source” of agar.




The U.S. spent $30 billion to ditch textbooks for laptops and tablets: The result is the first generation less cognitively capable than their parents


In 2002, Maine became the first state to implement a statewide laptop program for some grade levels. Then-Governor Angus King saw the program as a way to put the internet at the fingertips of more children, who would be able to immerse themselves in information.

By that fall, the Maine Learning Technology Initiative had distributed 17,000 Apple laptops to seventh graders across 243 middle schools. By 2016, those numbers had multiplied to 66,000 laptops and tablets distributed to Maine students.

King’s initial efforts have been mirrored across the country. In 2024, the U.S. spent more than $30 billion putting laptops and tablets in schools. But more than a quarter century and numerous evolving models of technology later, psychologists and learning experts see a different outcome than the one King intended. Rather than empowering the generation with access to more knowledge, the technology has had the opposite effect.

Earlier this year, in written testimony before the U.S. Senate Committee on Commerce, Science, and Transportation, neuroscientist Jared Cooney Horvath said that Gen Z is less cognitively capable than previous generations, despite its unprecedented access to technology. He said Gen Z is the first generation in modern history to score lower on standardized tests than the previous one.

While skills measured by these tests, like literacy and numeracy, aren’t always indicative of intelligence, they are a reflection of cognitive capability, which Horvath said has been on the decline over the last decade or so.

Citing Program for International Student Assessment data taken from 15-year-olds across the world and other standardized tests, Horvath noted not only dipping test scores, but also a stark correlation in scores and time spent on computers in school, such that more screen time was related to worse scores. He blamed students having unfettered access to technology that atrophied rather than bolstered learning capabilities. The introduction of the iPhone in 2007 also didn’t help.

“This is not a debate about rejecting technology,” Horvath wrote. “It is a question of aligning educational tools with how human learning actually works. Evidence indicates that indiscriminate digital expansion has weakened learning environments rather than strengthened them.”

The writing was perhaps already on the wall. Fortune reported in 2017 that Maine’s public school test scores had not improved in the 15 years the state had implemented its technology initiative. Then-Governor Paul LePage called the program a “massive failure,” even as the state poured money into contracts with Apple.

Gen Z will now have to face the ramifications of eroding learning capabilities. The generation has already been hit hard by the transformations of the 21st century’s other technological revolution: generative AI.

Early data from a first-of-its-kind Stanford University study published last year found AI advancements to have “significant and disproportionate impact on entry-level workers in the U.S. labor market.” But a less capable population means more than just poorer job prospects and fewer promotions, Horvath warned; it endangers how humans are able to overcome existential challenges in the decades to come.

“We’re facing challenges more complex and far-reaching than any in human history—from overpopulation to evolving diseases to moral drift,” he told Fortune. “Now, more than ever, we need a generation able to grapple with nuance, hold multiple truths in tension, and creatively tackle problems that are stumping the greatest adult minds of today.”

Technology’s impact on learning

Classroom technology usage has ballooned in recent years. A 2021 EdWeek Research Center poll of 846 teachers found that 55% said they spend one to four hours per day with educational tech. Another quarter reported using the digital tools five hours per day.

While teachers may be intending for these tools to be strictly educational, students often have different ideas. According to a 2014 study, which surveyed and observed 3,000 university students, students engaged in off-task activities on their computers nearly two-thirds of the time.

Horvath identified this tendency to get off track as a key reason technology hinders learning. When one’s attention is interrupted, it takes time to refocus. Task-switching is also associated with weaker memory formation and greater rates of error. Grappling with a challenging, singular subject matter is hard, Horvath said. For the best learning to happen, it’s supposed to be.

“Unfortunately, ease has never been a defining characteristic of learning,” he said. “Learning is effortful, difficult, and oftentimes uncomfortable. But it’s the friction that makes learning deep and transferable into the future.”

Sustained attention to a singular subject is anathema to how technology today has been deployed, argues Jean Twenge, a San Diego State University psychology professor who studies generational differences and the author of 10 Rules for Raising Kids in a High-Tech World. More time on screens isn’t just ineffective in facilitating learning; it’s counterproductive.

“Many apps, including social media and gaming apps, are designed to be addictive,” Twenge told Fortune. “Their business model is based on users spending the most time possible on the apps, and checking back as frequently as possible.”

A Baylor University-led study published in November 2025 uncovered why this is: TikTok required the least amount of effort to use, even less than Instagram Reels and YouTube Shorts, by balancing relevant videos with surprising and unexpected content.

Concerns over social media addiction have become so dire that 1,600 plaintiffs, across 350 families and 250 school districts, filed a lawsuit alleging Meta, Snap, TikTok, and YouTube created addictive platforms leading to mental health challenges like depression and self-harm in children. 

Solving the tech crisis

Horvath proposed a swath of solutions to Gen Z’s tech problem, at least as it pertains to classroom use. Congress, he suggested, could impose efficacy standards to fund research on what digital tools are actually effective in the classroom. The legislature could also require strong limits on tracking behavior, building profiles, and collecting data on minors using tech.

Some schools and states have taken matters into their own hands. As of August 2025, 17 states have cracked down on cellphone use in school, banning the technology during instructional time, and 35 states have laws limiting the use of phones in the classroom. In fact, more than 75% of schools have said they have policies prohibiting cellphone use for non-academic purposes, according to the National Center for Education Statistics, though enforcement of those bans has met with variable success.

Ultimately, Horvath said, the loss of critical thinking and learning skills is less of a personal failure and more of a policy one, calling the generation of Americans educated with gadgets victims of a failed pedagogical experiment. 

“Whenever I work with teenagers I tell them, ‘This is not your fault. None of you asked to be sat in front of a computer for your entire K-12 schooling,’” Horvath said. “That means we…screwed up—and I genuinely hope Gen Z quickly figures that out and gets mad.”

This story was originally featured on Fortune.com




How packaged salads took over America


The humble packaged salad is a great American invention and is highly technologically advanced.

The packaged salad is a triumph of food safety innovation, materials science, genetics, supply chain and logistics, vacuum cooling, biology, chemistry, vision systems, robotics, and process improvements.

In 1987, packaged salads were less than 1% of total produce sales.

Today, packaged salads and leafy greens account for nearly 50% of leafy-green shelf space at major retailers like Walmart and Kroger, and almost 70% of the U.S. population eats them.

Producers like Taylor Farms, Dole Food Company, Fresh Express, Earthbound Farm, and Organic Girl dominate the packaged salad category, accounting for the lion’s share of production and sales.

In 2026, the retail value of packaged salads has grown to roughly three times the size of the entire bulk lettuce market. (Bulk lettuce refers to whole-head lettuce or loose salad greens.)


How did we get here?

What’s driving the shift? The most obvious factor is convenience. Packaged salads and greens are easy. The leafy version of ramen. Salad kits with pre-packaged toppings and dressings are actually good.

Salads are healthy. You can buy a packaged salad with all the fixings at your favorite grocery retailer. Salads come ready to eat, you can customize them with the fixings and dressings, and there is no cutting or cleaning required!

Packaged Salad Wall at my local Safeway in California (Photo by Rhishi Pethe)

We got here thanks to technology that enables producers to sanitize greens and safely extend their shelf life, so that consumers on the other side of the country can enjoy them.

This is an unbelievably time-constrained endeavor. From the moment fresh lettuces and leafy greens are first harvested to when they are packaged and ready to ship, only 24-72 hours have elapsed.

This is a significant amount of time for produce, which deteriorates each day after harvest due to natural biological processes.

A banana, 7 days after you buy it from the store, looks and tastes very different from when you bought it.

Your raspberries look and taste like crap (which is a “legitimate” food science term!) if you leave them outside for too long, or even if they have stayed in your refrigerator for too long.

The expansion of the packaged salad category into a $15.6 billion industry in 2026 is driven by specific technological and process innovations.

So what are the specific steps to get a packaged salad to you?

Journey of a Packaged Salad: From Farm to Consumer. Most packaged salads have a shelf life of 14 days. (Image generated by Nano Banana Pro based on a prompt provided by Rhishi Pethe)


First, we use vacuum cooling to evaporate away field heat

Field heat is the residual warmth from the sun and the ground absorbed by the produce. It dramatically accelerates respiration and microbial growth. Rapid heat removal is critical because it effectively halts this degradation, preserving quality and extending the product’s commercial life.

Placing harvested lettuce in a vacuum chamber lowers the boiling point of water. A tiny amount of moisture on the leaf “boils off” at room temperature, pulling heat away from the lettuce’s core.
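To put rough numbers on that physics, here is a sketch (my own illustration using the standard Antoine equation for water’s vapor pressure, not a figure from this piece or any vacuum-cooler spec sheet) of the chamber pressure at which water boils at roughly 70°F versus 34°F:

```python
# Sketch: estimate the pressure at which water boils at a given temperature,
# using the Antoine equation for water (constants valid roughly 1-100 C).
# Illustrative only; these figures are not from the article or any spec sheet.

A, B, C = 8.07131, 1730.63, 233.426  # Antoine constants for water (P in mmHg, T in Celsius)

def boiling_pressure_mmhg(temp_c: float) -> float:
    """Vapor pressure of water at temp_c; water boils once chamber pressure drops below this."""
    return 10 ** (A - B / (C + temp_c))

for temp_c in (21.0, 1.0):  # roughly 70 F and 34 F
    p_mmhg = boiling_pressure_mmhg(temp_c)
    print(f"{temp_c:4.1f} C: water boils below ~{p_mmhg:5.1f} mmHg ({p_mmhg * 1.333:5.1f} mbar)")
```

Pulling the chamber down to a few millibars is what lets surface moisture flash off at lettuce-safe temperatures, carrying the field heat away with it.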

Most packaged salad producers follow the 4-hour rule from “cut to chill”. (It means no more than four hours should elapse between when the lettuce is cut/harvested in the field and when it is cooled.)

This process lowers the temperature from 70°F to 34°F in 20 minutes. It does so by uniformly dissipating heat without causing damage.

Vacuum Cooling adds 2 to 3 days to shelf life.

Image source: All Cold Cooling

Then we cut it, wash it (three times), and dry it

As shown in the diagram above, leafy greens are cut before being washed, dried, and packaged.

Bacteriophage Cocktails

Instead of relying solely on chemicals to clean and process lettuce, processors are now using bacteriophages: viruses that specifically target and kill ‘bad’ bacteria like Listeria and E. coli.

This approach directly addresses consumer concerns about pathogen risks by providing a natural, targeted method that achieves a 99% reduction in harmful bacteria, ensuring safer salads without compromising quality. Phage treatments are applied as a fine mist post-wash and remain active during storage, offering ongoing protection during transport.

Triple-wash

Modern wash systems are designed to prevent “cross-contamination” (where one bad leaf ruins the whole batch). Rather than relying on high levels of chlorine, processors use wash enhancers that stabilize the water’s pH and ensure the sanitizer remains effective even when the water becomes “dirty” with organic matter (dirt and juice).

In 2026, more facilities are integrating phages, which are “good” viruses that specifically hunt and kill Listeria and Salmonella. The phages are misted onto the leaves as a final safety layer that continues to “work” inside the bag during transport. The wash enhancer, meanwhile, acts as a buffer, maintaining a precise pH (typically 6.0–7.0).

Improvements in washing technology have increased shelf life by 2-3 days.

Salad section from my local Safeway (photo by Rhishi Pethe)

Modern production lines use hyperspectral imaging and AI to ‘see’ things humans cannot. Cameras scan leaves in real time, detecting spectral signatures of spoilage, fecal contamination, and foreign materials such as plastic or insects.

This technology ensures only high-quality, safe products reach consumers by reducing contamination risks and increasing the percentage of good products shipped, thereby enhancing food safety and quality assurance.

Rejections are handled by high-speed air ejectors that fire a burst of air to “flick” the specific bad leaf off the belt.

These vision and robotics systems increase the percentage of good products shipped and reduce the risk of product contamination.

Genetics for processability

In the 1990s and 2000s, breeders realized that standard lettuce was too fragile for factories. They began selecting and breeding for “processable” traits. Genetics helped change the cellular architecture so that leaves with smaller, tightly packed cells and stiff walls survived the mechanical spin-dryers in packaged lettuce plants without bruising.

Genetics also played a major role in eliminating “pink rib”, a genetic predisposition in some romaine varieties to turn pink/red when stressed or cut.

Lately, the industry has shifted from traditional cross-breeding to genomic selection to identify and select plants that keep lettuce crisp even in suboptimal fridge temperatures.

These genetic improvements have improved lettuce processability in salad factories and increased the likelihood that it stays fresh and appealing to consumers.

Then we put the lettuce to sleep in a special bag

One of the earliest innovations that extended shelf life and transportability for packaged salads was slowing the biological process by “suffocating” the lettuce and putting it to “sleep”.

In 1989, Fresh Express introduced Modified Atmosphere Packaging (MAP) technology. It uses breathable bags that balance oxygen and carbon dioxide levels, preventing leaves from wilting. MAP uses semi-permeable membranes to create a custom atmosphere.

Normal air is roughly 21% oxygen, 78% nitrogen, and 0.04% carbon dioxide. In a salad bag, the mix is flipped to put the leaves into “hibernation”. The target oxygen level is set to 1-5%, the target carbon dioxide level is set to 5-15%, and the remainder is nitrogen, used as an inert filler.

This process puts the lettuce to “sleep.” It prevents the leaves from consuming their own sugar reserves too quickly. It slows respiration and prevents spoilage bacteria from growing.

The specific environment delays lettuce oxidation, adding 4-6 days of shelf life to the packaged salad bag.

Modified Atmosphere Packaging extends shelf life by 4-6 days.
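As a small illustration of those targets (a sketch only: the ranges are the ones quoted above, while the function and the sample readings are made up), checking a measured bag atmosphere against the MAP window might look like this:

```python
# Sketch: check a measured bag atmosphere against the MAP targets quoted above.
# The target ranges come from the text; the function and readings are hypothetical.

O2_RANGE = (1.0, 5.0)    # percent oxygen
CO2_RANGE = (5.0, 15.0)  # percent carbon dioxide; the remainder is nitrogen filler

def in_map_spec(o2_pct: float, co2_pct: float) -> bool:
    return O2_RANGE[0] <= o2_pct <= O2_RANGE[1] and CO2_RANGE[0] <= co2_pct <= CO2_RANGE[1]

print(in_map_spec(3.0, 10.0))   # True: inside the "hibernation" window
print(in_map_spec(21.0, 0.04))  # False: ordinary room air
```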

Non-packaged greens section in my local Safeway (see how small it is compared to the packaged salad section) (Photo by Rhishi Pethe)

In the early 2000s, packaging moved from simple plastic to laser-perforated films. The bag is a semi-permeable membrane with laser-created holes. Lasers are used to poke holes in the plastic. The specific respiration rate of the salad mix determines the size and frequency of these holes.

For example, the respiration rate of baby spinach is very high. So the perforation strategy requires a high density of larger holes to allow oxygen to be consumed and constant venting to prevent fermentation. The holes in the bag can be as small as 50 to 200 microns.

Romaine or iceberg lettuce has a lower respiration rate than spinach, so the perforation strategy calls for a low-to-moderate hole density. The leaves are structurally tougher and breathe slowly, and too much “air” would cause the cut edges to turn pink.

The perforation strategy adds another 2-3 days to shelf life.
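Here is one way such a lookup might be expressed (a sketch under stated assumptions: the 50-200 micron range and the high-versus-low density contrast come from the text, but the category labels, hole counts, and function are illustrative placeholders, not an industry formula):

```python
# Sketch: choose a laser-perforation spec from a salad mix's respiration category.
# The 50-200 micron range and the density contrast are from the text; the
# category labels and hole counts are illustrative placeholders.

PERFORATION_SPECS = {
    "high":   {"hole_microns": 200, "holes_per_bag": 60},  # e.g. baby spinach
    "medium": {"hole_microns": 120, "holes_per_bag": 25},
    "low":    {"hole_microns": 50,  "holes_per_bag": 10},  # e.g. romaine, iceberg
}

def perforation_for(respiration_category: str) -> dict:
    """Look up the (hypothetical) perforation spec for a respiration category."""
    return PERFORATION_SPECS[respiration_category]

print(perforation_for("high"))  # fast-breathing mix: more, larger holes
print(perforation_for("low"))   # slow-breathing lettuce: fewer, smaller holes
```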

Then we eat

The packaged salad travels through the cold chain to the grocery retailer or wholesaler’s distribution center. It ultimately ends up on a grocery store shelf, in a restaurant kitchen, or in a food service cooler before it is consumed, its freshness still intact for consumers like you and me.

With all the technological advances, packaged salads can last for up to 14-18 days after packaging.
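Tallying the gains quoted throughout the piece gives a sense of how those days add up (a rough sum; the 3-5 day baseline for unprocessed cut greens is my assumption, not a figure from the article):

```python
# Rough tally of the shelf-life gains quoted in the article. The baseline for
# unprocessed cut greens (3-5 days) is an assumption, not from the text.

baseline_days = (3, 5)
gains = {
    "vacuum cooling":       (2, 3),
    "washing improvements": (2, 3),
    "modified atmosphere":  (4, 6),
    "laser perforation":    (2, 3),
}

low = baseline_days[0] + sum(lo for lo, _ in gains.values())
high = baseline_days[1] + sum(hi for _, hi in gains.values())
print(f"Estimated shelf life: {low}-{high} days")  # ~13-20 days, in line with the 14-18 day figure
```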

So the next time you are at your desk eating a healthy packaged salad, as one of the 251 million Americans who consume them, and getting ready for your next video call in 12 minutes, take a moment to thank all the technology that made it possible to get the product to you from a few thousand miles away, where the greens were grown and harvested.


The video shows how a packaged mixed salad is processed and packaged. It shows vacuum cooling, vision and robotics systems, triple washing, drying, and packaging using specialized bags with pores.


Note: I do want to acknowledge that even though packaged salad is mostly healthy and safe, sometimes we do see recalls due to various concerns. You can look up all FDA recalls and safety issues here.


I want to thank Mike Riggs and Abby Shalek-Briski for their feedback.


The Broken Record


I was amused to read Dan Meyer’s account of the recent AI+Education Summit at Stanford, particularly the remarks made by the university’s former president, John Hennessy, who asked the audience if anyone remembered “the MOOC revolution” and could explain how, this time, things will be different. The panelists all seemed to assert that -- thanks to “AI” -- the revolution is definitely here. The revolution or, say, a tsunami -- the word that Hennessy used back in 2012 when he himself predicted a sweeping technological transformation of education -- a phrase echoed in so many stupid NYT op-eds and pitch decks. Dan recalled the utterance, but no one else seemed to -- at least no one on stage or in the audience seemed to have the guts to turn to Hennessy (or any of the attendees or speakers, many of whom were also high on the MOOC vapors) and call him on his predictive bullshit.

As Dan correctly notes,

Look — this is more or less how the same crowd talked about MOOCs ten years ago. Copy and paste. And AI tutors will fall short of the same bar for the same reason MOOCs did: it’s humans who help humans do hard things. Ever thus. And so many of these technologies — by accident or design — fit a bell jar around the student. They put the kid into an airtight container with the technology inside and every other human outside. That’s all you need to know about their odds of success.

The odds of success are non-existent. There will be no “AI” tutor revolution just as there was no MOOC revolution just as there was no personalized learning revolution just as there was no computer-assisted instruction revolution just as there was no teaching machine revolution. If there is a tsunami, it’s not technological as much as ideological, as the values of Silicon Valley -- techno-libertarianism, accelerationism -- are hard at work in undermining democratic institutions, including school.

The history of failed ed-tech startups and ed-tech schools is long, and yet we’re trapped in this awful cycle where investors and entrepreneurs keep repackaging the same bad ideas.

There was another story this week on Alpha School, this one by 404 Media’s Emanuel Maiberg: “‘Students Are Being Treated Like Guinea Pigs’: Inside an AI-Powered Private School.” Back in October, Wired documented the miserable experiences of students, forced into hours of repetitive clicking on drill-and-kill software under incessant surveillance. Maiberg’s reporting, in part, expands on this, as he writes about the goal of building “bossware for kids” -- that is, ways to identify “enhanced tracking and monitoring of kids beyond screentime data.”

But much of Maiberg’s story examines the use of technologies to build the “AI curriculum” touted by the school’s founders. Not only does Alpha School’s reliance on LLMs for creating curriculum, reading assignments, and exercises mean these materials are littered with garbled nonsense, but the company seems to also be scraping (i.e. stealing) other education companies’ materials, including those of IXL and Khan Academy, for use in building their own.

While I deeply appreciate Maiberg’s reporting here -- I am a huge fan of 404 Media and am a paid subscriber because I think investigative journalism is important and necessary -- this story is a huge disappointment because it does not push back at all on the underlying ideas of Alpha School. Indeed, this is precisely the problem that keeps us trapped in this “ed-tech deja vu” -- the one that has, just in the last couple of decades, recycled this same idea over and over and over again (funded and promoted, it’s worth noting, by the very same people -- the Marc Andreessens and Reid Hoffmans and Mark Zuckerbergs of the world): Rocketship Education. Summit Learning. AltSchool. And now Alpha School.

Maiberg suggests in his story (and more explicitly on the podcast in which he and the publication’s other co-founders discuss the week’s articles) that Alpha School’s idea of “2 hour learning” is a good idea. But I think that claim -- the school’s key marketing claim, to be sure, before, like everyone else, it started to tout the whole “AI” thing -- needs to really be interrogated. Why are speed and efficiency the goal? These are the goals of the tech industry’s commitment to accelerationism, yes. These are the goals for a lot of video games, where you grind through repetitive tasks to accumulate enough points to level up. But why should these be something that schools embrace? Why should these be core values for education? Does learning -- deep, rich, transformative learning -- ever actually happen this way? (And what else are we learning, one might ask, when we adopt technological systems and world views that prioritize these?)

Let me quote math educator Michael Pershan at length here:

I keep coming around to this: the interesting innovation of Alpha School is not their apps or schedule or Timeback but their relationship to core academics. This is a school that believes that the “core” of schooling should be taken care of as quickly and painlessly as possible so that the rest of the day can be opened up to things that actually matter. Most schools don’t do this! We instead tell kids that history is a way of understanding ourselves and others. Math, we say, can be an absolute joy, full of logical surprises. We tell kids that a good story can open up your heart and mind.

Alpha doesn’t. They aim to streamline and focus on the essentials for skill mastery. Maybe they are showing you can learn to comprehend challenging texts without reading books. Maybe a math education composed of examples and (mostly) multiple choice questions is, in reality, all you need to ace the SAT.

If it turns out they’re succeeding at this, it’s because they’re trying.

And maybe, one day, Alpha or someone else will crack the code for good. It then will be possible to get all students to grind through the skills and move on. With all that extra time, schools will find better things for kids to do than academics. And maybe, at some point, we’ll ask, what’s the point of grinding through things we don’t care about? Do we really need to become great at mathematics when machines can do it? How important is it really to learn how to read novels or fiction? Maybe, one day, this is how books disappear from schools for good.

The schools like Alpha School, AltSchool, Summit, and Rocketship are all strikingly dystopian insofar as they compromise, if not reject, any sort of agency for students; they compromise, if not reject, any sort of democratic vision for the classroom. School is simply an exercise in engineering and optimization: command and control and test-prep and feedback loops. There is no space for community or cooperation, no time for play -- there is no openness, no curiosity, no contemplation, no pause. There is no possibility for anything, other than what the algorithm predicts.

(Kids hate this shit, no surprise. They want to be human; they want to be with other humans, even if tech-bros try to build a world that’s forgotten how.)


Or rather, most kids hate this shit. There are a few who embrace it because if they play the game right, they reckon, they too can join the tech elite. Case in point, yet another profile of Cluely founder Roy Lee, this one by Sam Kriss in Harper’s: “Child’s Play: Tech’s new generation and the end of thinking.”


I find this insistence from certain quarters that “there is no evidence that social media harms children” to be pretty disingenuous. There’s a lot of evidence -- plenty of research that points to negative effects and, sure, plenty that points to positive effects of technology, so it’s a little weird to see efforts to curb kids’ mobile phone and social media usage as just some big conspiracy for Jonathan Haidt to sell more books.

Mark Zuckerberg took the stand this week in a California court case that contends that Meta (along with other tech companies such as TikTok and Google) knowingly created software that was addictive, leading to personal injury -- and for the plaintiff in this particular case, leading to anxiety, depression, and poor body image.

That the judge in the case had to chastise Zuckerberg and his legal team for wearing their “AI” Ray-Bans in the courtroom just serves to underscore how very little these people care for the norms and values of democratic institutions.

We see this in the courtroom. We see this in the media. We see this in schools -- from Slate: “Meta’s A.I. Smart Glasses Are Wreaking Havoc in Schools Across the Country. It’s Only Going to Get Worse.”

We see this in the billions of dollars that the tech companies plan to funnel into elections this year to try to ensure there are no regulatory measures taken to curb their extractive practices -- $65 million from Meta alone.


What on earth would make you think that tech companies -- their investors, their executives, their sycophants in the media -- want to make education better?

Inside Higher Ed reported this week that the University of Texas Board asked faculty to “avoid ‘controversial’ topics in class.” There weren’t any details on what this meant -- what counts as “controversial” -- or how this might be enforced. (Meanwhile in Florida, college faculty were handed a state-created curriculum and told to teach from it.)

We are witnessing the destruction – the targeted destruction – of academic freedom across American universities. This trickles down into all aspects of education at every level.

And to be clear, again, my god, I'm a broken record too: this is all inextricable from the rise of “AI,” from its injection into every piece of educational and managerial software. The tech industry seeks the monopolization of knowledge; they seek the control of labor – intellectual labor, and all labor, “white collar” and “blue collar” alike, is intellectual labor. They worship speed and efficiency, not because these values are democratic, but precisely because they believe they can make us bend our entire beings towards their profitability.



Perhaps ed-tech is, in the end, simply "optimistic cruelty"; and these cycles that we keep going through are just repeated and failed attempts to replay and harness Ayn Rand's bad ideas, her mean-spirited visions for a shiny, shitty technolibertarian future – one in which children (other people's children, of course) are the grist for the entrepreneurial mill.


More bad people doing bad things in ed-tech:



Today’s bird is the pigeon, because yeah, we are still living in B. F. Skinner’s world -- one where people will look you in the eye and say that being “agentic” means handing over all your decision-making to their system, that “freedom” and “dignity” don’t really matter because their brilliant engineering is going to make everything fine and dandy. This time. Really. It’s a revolution. It’s a tsunami. It’s a shit storm.

“Russian Startup Hacks Pigeon Brains to Turn Them into Living Drones.”


Thanks for reading Second Breakfast. Please consider becoming a paid subscriber. Your financial support is what enables me to do this work.


Books and screens



Your inability to focus isn’t a failing. It’s a design problem, and the answer isn’t getting rid of our screen time

- by Carlo Iacono

Read on Aeon
