
The Flawed VO2 Max Craze


In the past couple of weeks I’ve had 2 patients contact me because they were worried: their VO2 max was decreasing. Their data were based on smartwatch imputations, which are notoriously imprecise. But the problem is much bigger than that. In this edition of Ground Truths I’m going to get into the difference between cardiorespiratory fitness and VO2 max, which differ markedly in how they are measured, the datasets that assess their functional significance and outcomes for healthy adults, and how we got into this craze.


A schematic I made with NanoBanana Pro (the use of a treadmill or bicycle is interchangeable but the measurements are altogether different).

How They Are Measured

Cardiorespiratory fitness (CRF) is a real-world assessment of a person’s activities, such as walking or exercising on a treadmill, expressed relative to resting metabolic rate and measured in metabolic equivalent of task (MET) units, with 3 recognized levels of intensity: light (<3.0 METs), for example slow walking; moderate (3.0-5.9 METs), for example brisk walking at 3-4 miles per hour; and vigorous (≥6.0 METs), for example jogging. 1 MET is the energy used in sitting or resting; 10 METs requires 10-fold that energy expenditure. CRF integrates cardiovascular, lung, and musculoskeletal functional capacity.

There are multiple methods to calculate your METS, including a standard treadmill MET chart (below left) that plots speed and incline, using a formula if you are doing the Bruce treadmill protocol or the chart below (right), or using heart rate (with any aerobic activity, such as bicycling or jogging) with the formula: METS = 0.05 × heart rate + 2. So if your HR got to 140 that would be 9 METS. By that formula, every increase in heart rate of 10 beats per minute adds about 0.5 MET.
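To make the arithmetic concrete, here is a minimal sketch of the heart-rate formula quoted above (my own illustration of the approximation, not a clinical tool; the function name is hypothetical):

```python
def mets_from_heart_rate(hr_bpm: float) -> float:
    """Rough MET estimate from heart rate during aerobic activity,
    using the approximation METS = 0.05 * HR + 2."""
    return 0.05 * hr_bpm + 2.0

# A heart rate of 140 bpm works out to about 9 METs
print(mets_from_heart_rate(140))
```

This is only the heart-rate shortcut; a treadmill chart or the Bruce protocol formula would give a more protocol-specific estimate.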

Maximal oxygen uptake (VO2 max) can only be accurately determined as a performance lab test with a metabolic cart, trained technicians, and a specialized, tightly fitted mask that captures every molecule of inhaled oxygen and exhaled CO2 during a ramped treadmill or stationary bike exercise protocol continued until absolute exhaustion. This is the ceiling of aerobic power, achieved via direct gas exchange. A VO2 max test costs about $150 for a standard assessment in a university lab.

VO2 max values from wearables are obviously not measured by gas exchange or any direct method, but instead through various imputations based upon population-based algorithms of heart rate and movement (GPS/accelerometry). Studies have assessed the Apple Watch, Garmin Fenix 6, and Fitbit, finding a mean absolute percentage error of 7-16%. Overall, they have been found to consistently underestimate VO2 max in fit people while overestimating it in unfit individuals. They also rely on optical heart rate (which may be inaccurate in people of color), device positioning, and wrist anatomy, and can be influenced by such factors as hydration status, altitude, and ambient temperature. Typically, a 6-minute walk is the basis for a wearable to provide the user a new VO2 max result. That may not be at all representative of an individual’s exercise capacity.
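To see what a 7-16% mean absolute percentage error means in absolute terms, here is a quick back-of-the-envelope sketch (illustrative numbers only; the function is my own, not drawn from any of the validation studies):

```python
def wearable_error_band(true_vo2max: float,
                        mape_low: float = 0.07,
                        mape_high: float = 0.16):
    """Translate the reported 7-16% mean absolute percentage error
    of wearable VO2 max estimates into an absolute error band
    in ml/kg/min for a given true VO2 max."""
    return true_vo2max * mape_low, true_vo2max * mape_high

low, high = wearable_error_band(40.0)
# For a true VO2 max of 40 ml/kg/min, typical wearable error
# spans roughly 2.8 to 6.4 ml/kg/min
```

An error of several ml/kg/min is enough to move a reading across an entire age-group reference category, which is why a declining watch number alone is not cause for alarm.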


The Datasets For Assessment

Datasets for Cardiorespiratory Fitness

In JAMA 2009, a meta-analysis of 33 studies of cardiorespiratory fitness, totaling 102,980 participants, examined its relationship to all-cause mortality. Better CRF (per 1 MET higher) was linked with lower all-cause mortality (Figure), and individuals who had achieved 7.9 METs had substantially less all-cause and cardiovascular mortality. One MET higher CRF was associated with a 14-15% reduction in mortality.

In 2016, the American Heart Association issued a scientific statement on CRF and asserted it should be regarded as a clinical vital sign, reviewing all of the published data to that point.

In 2018, Mandsager and colleagues from the Cleveland Clinic published their data from 122,007 consecutive patients who underwent exercise treadmill testing and had long term follow-up for outcomes.

Here is the Table of METS performance by age and sex. You can see there are 5 categories (columns) from low to elite.

All-cause mortality by sex and the 5 levels of performance are plotted below. The hazard ratio of 1.41 (about a 40% increased risk of all-cause mortality) for below-average vs above-average fitness was the same as the risk conferred by smoking or diabetes. The mortality hazard from the lowest to the elite group differed by more than 5-fold. The additional benefit for women over men in METS was seen for each performance group. These results were adjusted for potential confounding variables.

In 2022, Kokkinos and colleagues published CRF exercise treadmill data for over 750,000 veterans aged 30 to 95 years, with a mean follow-up of 10.2 years. The analysis was based on 6 categories of MET performance, but the hazard ratios were similar to the Cleveland Clinic data (e.g., extremely fit vs least fit: about 4-fold in this study, 5-fold in the prior one).

Here is a good summary graph of that study. In both studies there was no increased mortality risk at the highest fitness strata—in fact risk was consistently lower for each age group.

Datasets for VO2 Max

There are limited data relating directly measured VO2 max to outcomes. The 2001 Kuopio study from Finland of 1,294 men with 10.7-year follow-up did measure VO2 max directly once at baseline during a symptom-limited exercise tolerance test on a bicycle ergometer. The relationship of VO2 max (by quartiles) to all-cause mortality is shown below.

In a 2024 meta-analysis of 42 studies with VO2 max (categorized as “objectively measured CRF”) and estimated CRF for prediction of all-cause and cardiovascular mortality, the results were remarkably similar (cardiovascular mortality, 14% reduction, graph below), but notably there were 234-fold more participants with exercise-based CRF than with VO2 max measurements; that is, >99% of the data are derived from METS. Put simply, nearly all the data we have linking fitness to outcomes comes from CRF, not VO2 max.

Reference standards for VO2 max have been published by age group. For more detail, such as breakdowns by sex, please check the link.
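For orientation between the two metrics: by standard physiologic convention, 1 MET corresponds to an oxygen uptake of about 3.5 ml O2/kg/min at rest, so MET and VO2 max values are interconvertible. A minimal sketch (my own, for illustration):

```python
ML_O2_PER_MET = 3.5  # standard convention: 1 MET ~ 3.5 ml O2/kg/min

def vo2max_to_mets(vo2max_ml_kg_min: float) -> float:
    """Convert a VO2 max value (ml/kg/min) to its MET equivalent."""
    return vo2max_ml_kg_min / ML_O2_PER_MET

# e.g. a VO2 max of 35 ml/kg/min corresponds to 10 METs
print(vo2max_to_mets(35.0))
```

This conversion is what makes it possible to compare treadmill MET performance with published VO2 max reference standards, even though the two are measured very differently.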

There are other specific studies in heart failure, chronic obstructive pulmonary disease, pulmonary hypertension, and pre-operative evaluation showing that VO2 max can help guide risk assessment or treatment.


Conflation and the VO2 Max Craze

The leading proponent of using VO2 max in recent years has been Peter Attia, through his podcast The Drive and his book Outlive. He has consistently asserted “VO2 max is the singular most powerful marker for longevity.” But the problem is conflation. He cites the studies of CRF, which did not measure VO2 max, and extrapolates to a VO2 max result (see the side-by-side Kokkinos study Table and Outlive Figure footnote below), and does so throughout his discussion of exercise in Chapter 11 of Outlive. For example, he writes: “this number [VO2 max] turns out to be highly correlated with longevity,” citing studies that did not measure VO2 max.

In a recent YouTube video by Joseph Everett and Nick Norwitz entitled “Hidden Data: How the Top Longevity Doctor Tricked Us All” there is a segment about VO2 max and this significant issue of conflation, discussed by Chris Masterjohn. Below is the relevant 6-minute clip within the longer video. It includes a bit of the 60 Minutes segment with CBS correspondent Norah O’Donnell doing a VO2 max test and Peter’s assertion: “We don’t have a single metric of humans that we can measure that better predicts how long they will live than how high their VO2 max is.”

As Masterjohn aptly points out, the fixation on VO2 max, which is not actually supported by the data, also misses our ideal goal of diversity of exercise, including strength and balance training. Indeed, Kim et al, analyzing over 70,000 UK Biobank participants for both CRF (submaximal bicycle test) and grip strength in relation to all-cause mortality, concluded: “Improving both CRF and muscle strength, as opposed to either of the two alone, may be the most effective behavioral strategy to reduce all-cause and cardiovascular mortality risk.”

Bottom Line

I’ve never done a VO2 max test and see no reason to do one, given the cost, inconvenience, and pain. As Attia correctly states about going to maximal exhaustion to get a VO2 max: “If you’ve ever had this test done, you will know just how unpleasant it is.” For Spartan, Olympic, and other high-performance athletes in high-intensity training, or in patients with heart failure or pulmonary hypertension, there may be a place for serial VO2 max measurements, providing highly objective “gold standard” physiologic metrics. Otherwise, there are no supportive data for people going out and getting a VO2 max test and making this the focus of their exercise training. That is the reason I didn’t even mention VO2 max in my Super Agers book.

Nearly all of the relevant outcome data are based on exercise on a treadmill or bicycle with METS as the index of cardiorespiratory fitness. We should not be placing much value on our smartwatch data. My Apple Watch gave me encouragingly high VO2 max readings over 6 months, suggesting my CRF is well above others in my age group (70+, see reference standards above), but I know the data are woefully unreliable.

The problem now, with so much misplaced hype on VO2 max, is that most people are using their smartwatch output to gauge their cardiorespiratory fitness, like the 2 patients I mentioned at the top of the post. It’s free to calculate your METS! And that is the real basis of the relationship to all-cause and cardiovascular mortality that has been firmly established in the peer-reviewed literature.

This problem surfaced recently with the introduction of ChatGPT Health. Geoffrey Fowler, the tech journalist at the Washington Post, submitted all his Apple Health data and asked for an overall assessment of his health (actual prompt: “give me a single score (A-F) for my cardiovascular health over the last decade including component scores and an overall evaluation of my longevity”). It gave him an “F.”

Then he gave ChatGPT Health access to his electronic medical record (portal) and asked the same question again. It gave him a “D” and attributed that to his VO2 max of 34 ml/kg/min in the past year, below the reference range for a 45-50 year old male! He also entered his Apple Watch data into Claude Health, which gave him a D+ overall; specifically, it gave him a C- because his VO2 max had declined from 41 to 32 ml/kg/min from 2016-2026. Yet he had averaged over 7,500 steps/day throughout the decade.

These outputs are indicative of the problem—the unreliable wearable VO2 max data have become unduly emphasized by current AI platforms using smartwatch data! That will only make the problem worse, adding to the confusion, conflation, and emphasis on the wrong metric.

I hope this post helps to sort out what we know: the datasets for cardiorespiratory fitness, representing real-world physical activity—not VO2 max—are the basis of the link to improved survival and freedom from cardiovascular mortality.

If we’re going to focus on a metric it ought to be METS, not VO2 max. Not only is it free, simple, and universally available, but it is the one best studied for health outcomes. And perhaps the better strategy is to be as physically active as possible and not worry about any metric!

NB: No AI was used in any way to write this post. As mentioned in the caption, I got help from Gemini-3 to produce the first Figure. I have no involvement with, and no COI related to, any company working with cardiopulmonary fitness or VO2 max.

********************************************************************

Thanks to Ground Truths subscribers (> 200,000) from every US state and 210 countries. Your subscription to these free essays and podcasts makes my work in putting them together worthwhile. Please join!

If you found this interesting PLEASE share it!


Paid subscriptions are voluntary and all proceeds from them go to support Scripps Research. They do allow for posting comments and questions, which I do my best to respond to. Please don’t hesitate to post comments and give me feedback. Let me know topics that you would like to see covered.


Many thanks to those who have contributed—they have greatly helped fund our summer internship programs for the past two years. It enabled us to accept and support 47 summer interns in 2025! We aim to accept even more of the several thousand who will apply for summer 2026.


Say Goodbye to the Undersea Cable That Made the Global Internet Possible

1 Share
The first fiber-optic cable ever laid across an ocean -- TAT-8, a nearly 6,000-kilometer line between the United States, United Kingdom, and France that carried its first traffic on December 14, 1988 -- is now being pulled off the Atlantic seabed after more than two decades of sitting dormant, bound for recycling in South Africa. Subsea Environmental Services, one of only three companies in the world whose entire business is cable recovery and recycling, began the operation last year using its new diesel-electric vessel, the MV Maasvliet, and had already brought 1,012 kilometers of the cable to the Portuguese port of Leixoes by August. TAT-8, short for Trans-Atlantic Telephone 8, was built by AT&T, British Telecom, and France Telecom, and hit full capacity within just 18 months of going live. A fault too expensive to repair took it out of service in 2002. The recovered cable is being shipped to Mertech Marine in South Africa, where it will be broken down into steel, copper, and two types of polyethylene -- all commercially valuable, especially the high-quality copper at a time when the International Energy Agency projects global shortages within a decade.

Read more of this story at Slashdot.


New Gene Discovery Could Postpone the Bananapocalypse

1 Share

Have you ever wondered why you don’t have to spit out seeds after snacking on a banana? It’s because the Cavendish, the most widely used commercial cultivar, has three copies of chromosomes and can’t produce fertile seeds. Instead, the Cavendish is propagated by cloning, which is convenient for maintaining consistent banana quality, but leaves the plant vulnerable to disease. 


It’s a fate that befell the Cavendish’s predecessor, the Gros Michel (French for “Big Mike”). This more flavorful cultivar was the most widely available banana for decades until it fell victim to the wilting fungus Fusarium, known by the colloquial name “Panama disease.” By the 1950s, Gros Michel bananas had all but disappeared, and the Cavendish became the dominant cultivar, accounting for 99 percent of banana exports today.

For years, the Cavendish was thought to be resistant to the soil-borne Panama disease, until the 2010s when a virulent strain of the fungus, Tropical Race 4, started spreading. With bananas under threat worldwide, the race to protect the Cavendish from suffering the Gros Michel’s fate began. Now, agricultural scientists are making headway with new research published in Horticulture Research.


Read more: “What’s Wrong with Bananas”

Andrew Chen and Elizabeth Aitken from the University of Queensland recently pinpointed the location of a gene in the wild banana plant—Calcutta 4—that confers resistance to Fusarium wilt, Sub Tropical Race 4 (STR4).

“We’ve located the source of STR4 resistance in Calcutta 4 which is a highly fertile wild diploid banana by crossing it with susceptible bananas from a different subspecies of the diploid banana group,” Chen explained in a statement. “This is a very significant finding; it is the first genetic dissection of Race 4 resistance from this wild subspecies.”


Because the banana plant crosses had to grow for a year before they could be checked for disease-resistance, locating the gene took five years. You may be wondering why we can’t just make the switch from the Cavendish to Calcutta 4 the way we swapped the Cavendish for the Gros Michel. 

“While Calcutta 4 provides crucial genetic resistance, it isn’t suitable as a commercial cultivar because it doesn’t produce fruit which are good to eat,” Chen said.

Going forward, the researchers are developing molecular markers for the gene so banana producers can more efficiently identify and plant resistant seedlings. “This will speed up selection, reduce costs and hopefully ultimately lead to a banana that is good to eat, easy to farm, and naturally protected from Fusarium wilt through its genetics,” Chen said.


That is, until the fungus adapts once more.


Lead image: Steve Hopson / Wikimedia Commons


How Can Infinity Come in Many Sizes?

1 Share
Intuition breaks down once we’re dealing with the endless. To begin with: Some infinities are bigger than others.

The post How Can Infinity Come in Many Sizes? first appeared on Quanta Magazine




Living in the Upside Down

1 Share

As you progress in your UI design career, you learn that there are quite a few unsolvable challenges:

  • do you write My Items or Your Items in UI?
  • do you put hand cursors over buttons?
  • for a boolean item (especially in the menu), do you talk about the present state or the future state?
  • do you try to solve for change blindness or change aversion?

I was reminded of one of those today: how do you sort the items in the bottom-aligned menu?

One school of thought is to keep it in the same order as you would a regular top-aligned menu:

On the positive side, this allows you to build a consistent understanding of how menus are structured: the most important thing is at the top, Quit is always at the bottom. But the downsides are obvious, too – now the most important item is furthest away from where your cursor started, and you have to awkwardly cross all the other items on the way to it.

iOS’s springboard went, literally, the other way:

Here, the bottom aligned menu reverses its item order. This tripped me up today. The dock in macOS was actually more defensible upside down because there, every menu was always going the same way. Here, the inconsistency starts rearing its ugly head.

Of course, the best way to not face an impossible choice is to avoid it altogether. Not sure how one could accomplish it here, though. Placing the menus consistently below would make some of them scrollable, or basically invisible for bottommost icons. You could also slide the entire screen up to make room for the menu, but that would probably feel disorienting.

So, I can’t say this is a wrong solution. The inconsistency might only bother people who use this often, and maybe no one uses this often? Or, perhaps, it was really important to allow resizing widgets and make that item as easy to tap as possible? But still, I think I would have done it the other way – align as needed, but items always in the same order.


The Origins of Agar

1 Share
Ella Watkins-Dulaney for Asimov Press.

This essay will appear in our forthcoming book, “Making the Modern Laboratory,” to be published later this year.

In 1942, at the height of British industrial war mobilization, an unlikely cohort scavenged the nation’s coastline for a precious substance. Among them were researchers, lighthouse keepers, members of the Royal Air Force and the Junior Red Cross, plant collectors from the County Herb Committee, Scouts and Sea Scouts, schoolteachers and students. They were looking for fronds and tufts of seaweed containing agar, a complex polysaccharide that forms the rigid cell walls of certain red algae.

The British weren’t alone in their hunt. Chileans, New Zealanders, and South Africans, among others, were also scrambling to source this strategic substance. A few months after the Pearl Harbor attack, the U.S. War Production Board restricted American civilian use of agar in jellies, desserts, and laxatives so that the military could source a larger supply; it considered agar a “critical war material” alongside copper, nickel, and rubber.1 Only Nazi Germany could rest easy, relying on stocks from its ally Japan, where agar seaweed grew in abundance, shipped through the Indian Ocean by submarine.2

Without agar, countries could not produce vaccines or the “miracle drug” penicillin, especially critical in wartime. In fact, they risked a “breakdown of [the] public health service” that would have had “far-reaching and serious results,” according to Lieutenant-General Ernest Bradfield. Extracted from marine algae and solidified into a jelly-like substrate, agar provides the surface on which scientists grow colonies of microbes for vaccine production and antibiotic testing. “The most important service that agar renders to mankind, in war or in peace, is as a bacteriological culture medium,” wrote oceanographer C.K. Tseng in a 1944 essay titled “A Seaweed Goes to War.”3

Agar was first introduced into the laboratory in 1881. Since then, microbiologists have depended on agar to create strong jellies. When microorganisms are streaked or plated onto this jellied surface and incubated, individual cells multiply into distinct colonies that scientists can easily observe, select, and propagate for further experiments. Many of the most important findings in biological research of the last 150 years or so — including the discovery of the CRISPR/Cas9 gene-editing tool — have been enabled by agar.4 Agarose, a derivative of agar, is also essential in molecular biology techniques like gel electrophoresis, where its porous gel matrix separates DNA fragments by size, enabling researchers to analyze and isolate specific genetic sequences.

Agar plates with E. coli growth on various concoctions, including MacConkey, Mueller-Hinton, and Brain Heart Infusion. Credit: HansN.
An agarose gel. Credit: Kadina Almhjell

Agar is so critical that since WWII, scientists have tried to find alternatives in the event of a supply chain breakdown, especially as recent shortages have caused similar alarm. But while other colloid jellies have emerged, agar remains integral to laboratory protocols because no alternatives can yet compete on performance, cost, and ease of use.


From Sea to Table

Microbiologists have been growing microbes on agar plates for nearly 150 years, but agar’s discovery dates back to a happy accident in a mid-17th-century kitchen. Legend has it that on a cold winter day, a Japanese innkeeper cooked tokoroten soup, a Chinese agar seaweed recipe known in Japan for centuries. After the meal, the innkeeper discarded the leftovers outside and noticed the next morning that the sun had turned the defrosting jelly into a porous mass. Intrigued, the innkeeper was said to have boiled the substance again, reconstituting the jelly. Since this discovery, agar has become a staple in many Japanese desserts, from yokan to anmitsu.

Industrial production of kanten (the Japanese name for agar, which translates as “cold weather” or “frozen sky”) began in Japan in the mid-19th century by natural freeze drying, a technique that simultaneously dehydrates and purifies the agar. Seaweed is first washed and boiled to extract the agar, after which the solution is filtered and placed in boxes or trays at room temperature to congeal. The jelly is then cut into slabs called namaten, which can be further processed into noodle-like strips by pushing the slabs through a press. These noodles are finally spread out in layers onto reed mats and exposed to the sun and freezing temperatures for several weeks to yield purified agar. Although this traditional way of producing kanten is disappearing, even today’s industrial-scale manufacturing of agar relies on repeated cycles of boiling, freezing, and thawing.

Because of its capacity to be freeze-dried and reconstituted, agar is considered a “physical jelly” (that is, a jelly that sets and melts with temperature changes without needing any additives). This property makes dry agar easy to ship and preserve for long periods of time.5

Anmitsu. Credit: Ocdp

Over the years, agar found its way around the world into many cuisines, including those of China (where it’s called “unicorn vegetable” or “frozen powder”), France (sometimes called gélose), India (called “China grass”), Indonesia (called agar-agar, which translates simply as “jelly”), Mexico (called dulce de agar, or agar sweets), and the Philippines (known as gulaman).

Agar is prized among chefs for its ability to form firm, heat-stable gels at remarkably low concentrations — typically just 0.5-2 percent by weight. Culinary agar is available as powder, flakes, strips, or blocks, and makes up about 90 percent of the global use of agar. Unlike gelatine, which melts at body temperature, agar gels remain solid up to about 185°F (85°C), making it ideal for setting dishes served at room temperature or warmer. It is also flavorless and odorless, vegan and halal, and can create both delicate jellies and firm aspics. Yet, while increasingly employed in kitchens worldwide, agar had not yet entered the laboratory.

Before agar, microbiologists had experimented with other foodstuffs as microbial media. They turned to substances rich in the starches, proteins, sugars, fats, and minerals that organisms need for growth, testing with broths, bread, potatoes, polenta, egg whites, coagulated blood serums, and gelatine. However, none worked particularly well: all were easily broken down by heat and microbial enzymes, and their surface, once colonized, became mushy and unsuitable for isolating microbes.

A bundle of kanten, from the Encylopedia of Food (1923).

This was especially vexing to physician and bacteriologist Robert Koch, who, in seeking to culture his bacteria, “bent all his power to attain the desired result by a simple and consistently successful method,” wrote bacteriologist and historian William Bulloch in his 1938 book, The History of Bacteriology. “He attempted to obtain a good medium which was at once sterile, transparent, and solid” and got some results with gelatine.6 But gelatine is easily digested by many microbes and melts at precisely the temperatures at which the disease-causing microbes Koch wanted to study grow best.

The woman who ultimately discovered the superior features of agar as a growth medium and brought it to Koch’s attention was Fanny Angelina Hesse. Her foundational contribution to the nascent field of microbiology is often omitted from textbooks. In other cases, she is unflatteringly referred to as a “German housewife” or as “Frau Hesse,” or dismissed as an unnamed technician.

From Plate to Petri Dish

Fanny Angelina (née Eilshemius, from a Dutch father) grew up in Kearny, New Jersey. During her childhood, her family learned from a Dutch friend or neighbor about agar-agar, a common ingredient in jelly desserts in Java (Indonesia), then a Dutch colony. Her mother and, later, Fanny Angelina herself, began to cook with it.

In 1874, Fanny Angelina married physician and bacteriologist Walther Hesse, an investigator of air quality and, specifically, air-borne microbes. In the Winter of 1880-81, Hesse became a research student with Koch in Berlin and experienced firsthand the difficulty of growing microbes on gelatine and the other growth media used at the time.

While raising three children and taking care of the household, Fanny Angelina Hesse supported, documented, and archived her husband’s work, creating stunning scientific illustrations of bacterial and fungal colonies. During the hot Summer of 1881, she watched as Hesse struggled with gelatine-based growth media. Fanny Angelina, recalling the stability of her agar-based desserts, suggested that they try that instead. Hesse wrote a letter to Koch informing him about the switch, and Koch mentioned agar for the first time in his 1882 groundbreaking paper on the discovery of the tuberculosis bacillus.

Image from a graphic novel about agar and Fanny Angelina Hesse, called "The Dessert that Changed the World." Story by Corrado Nai and artwork by “SHog.” Support on Patreon.

The change to agar was a marked improvement. The jelly is so effective that it is still an invariable ingredient in what is known today as the “Koch’s plating technique” or the “culture plating method.” As Koch himself noted in 1909: “These new methods proved so helpful…that one could regard them as the keys for the further investigation of microorganisms…Discoveries fell into our laps like ripe fruits.”

Once Koch established the methods to grow pure cultures of bacteria like tuberculosis and anthrax, he demonstrated for the first time that microbes can cause diseases, a feat that earned him the 1905 Nobel Prize in Physiology or Medicine.

However, Koch never credited the Hesses for their discovery of bacteriological agar, perhaps because, at the time, he failed to recognize its importance. Even after he received the insight about agar from the Hesses, Koch stuck with gelatine for years. In 1883-84, during his first medical expedition to Egypt and India to investigate cholera, he tried and failed to grow the cholera bacterium on gelatine media in the hot climate of Cairo (despite using a half-open fridge for incubation), only succeeding in the colder winter of Calcutta.

Fanny Angelina Hesse, 1883.

It is difficult to know exactly when the shift from gelatine to agar occurred. As often happens for scientific breakthroughs, agar was likely adopted incrementally alongside the use of other growth media. In 1913, for example, the first diagnosis of Serratia marcescens as a human pathogen was made by growing it on agar as well as on potatoes.

Nevertheless, by 1905, a report on the seaweed industries in Japan noted the “very important use [of pure-grade agar] as a culture medium in bacteriological work.” It’s safe to say that, around the turn of the 20th century, agar had moved from an inconspicuous kitchen jelly to an indispensable scientific substance.


A Strategic Substance

Several properties of agar render it a superior jelly. Agar isn’t broken down by microbial enzymes apart from a few species (including bacteria living in marine and freshwater habitats), and it dissolves well in boiling water, making it easy to sterilize. The jelly doesn’t react with the ingredients of a broth, whose composition can be adjusted to meet the nutritional requirements of different microbes, and sets to a firm gel without the need for refrigeration.

Agar’s low viscosity also makes it easy to pour into Petri dishes, and its transparency permits observation of microbes growing on its surface.7 Also aiding in this is its low syneresis (extrusion of water from the gel), guaranteeing less surface “sweating”: Once a plate is inoculated, bacterial colonies stay in place and do not mix.

The jelly is chemically inert since no additives are needed for gelation. This allows chemicals dissolved in the jelly’s aqueous phase to diffuse well, a prerequisite for testing if certain species or strains are resistant to antibiotics or antifungals. In these simple assays, zones of growth inhibition of bacteria or fungi (or their absence) point to the effectiveness of (or resistance towards) antibiotics or antifungals.

But agar’s superior qualities come with complex chemistry. “To speak of agar as a single substance of certain (if known) chemical structure is probably a mistake,” wrote phycologist Harold Humm in a 1947 article. According to the Food and Agriculture Organization of the United Nations, agar is merely recognized as “a hydrophilic colloid extracted from certain seaweeds of the Rhodophyceae class.” In terms of its actual composition, agar is mostly a combination of two polysaccharides, agaropectin and agarose, both of which are complex, poorly characterized polymers built mostly (but not exclusively) from the simple sugar galactose.8

Agar comes from multiple sources, as many red seaweeds are “agarophytes” (that is, seaweeds containing agar in their cell walls). Species of Gelidium are the most important source of bacteriological (lab-grade) agar. Other main agarophytes, largely used for culinary agar, include red seaweeds from the genera Gracilaria, Pterocladia, Ahnfeltia, and others. Species from the genera Eucheuma, Gigartina, and Chondrus have been used as agarophytes in research during agar shortages.9

Sketches of Japanese algae, by Kintaro Okamura (1913).

One striking characteristic of Gelidium is that it must be wild-harvested rather than farmed. Unlike Gracilaria for culinary agar production, Gelidium grows slowly and thrives only in cold, turbulent waters over rocky seabeds, conditions nearly impossible to replicate in aquaculture. This dependence on wild harvesting explains the need for seaweed collectors during WWII, and continues to make Gelidium a strategically critical resource.

While Gelidium seaweeds can be collected by gathering fragments washed ashore, mass production of agar requires steady, large quantities.10 Harvesters in New Zealand during WWII had to “walk beside a boat, waist to armpit deep in water and feel for the weed with their feet.” Handling large volumes of wet seaweed (which yields less than five percent agar) was challenging. Then as now, when Gelidium is harvested by scuba divers from rocky seabeds, collectors have to understand the life cycle of the algae, find the most likely locations for its growth, and prevent overharvesting to safeguard future yields.

Given its advantageous position on the Atlantic coastline, Morocco has been the main source of Gelidium for at least two decades, and demand for bacteriological agar continues to grow. Yearly global consumption increased from 250 tons to 700 tons between 1993 and 2018, and is currently estimated at around 1,200-1,800 tons per year, according to Pelayo Cobos, commercial director of Europe’s largest producer of agar, Roko Agar.

The Future of Agar

Amid such rising demand, it’s understandable that researchers worried when Morocco reduced exports of agarophytes in 2015. This shortage — due to a combination of overharvesting, climate warming, and an economic shift to internal manufacturing in the North African country — not only caused alarm but also a three-fold increase in the price of wholesale bacteriological agar, which reached $35-45 per kilogram. (At the time of writing this in late 2025, factory agar prices are sitting at about $30 per kilogram, according to Cobos.)

A few years later, in 2024, researchers in multiple labs were horrified to discover toxic batches of agar, for reasons that remain unclear. After observing a worrying lack of microbial growth (which impeded their ability to carry out basic experiments), they switched to different agar suppliers, and their results improved.

This was not the first time that microbiologists experienced problems with agar. A phenomenon called “The Great Plate Count Anomaly” baffled researchers in the early 20th century when they observed that the number of cells seen under a microscope didn’t match the number of colonies growing on an agar plate. Investigating this discrepancy, researchers found agar itself to be the culprit: when nutrient broths are heated together with agar, harmful byproducts (hydroperoxides) can form through the reaction of agar with phosphates contained in the media. Researchers can avoid this by autoclaving agar separately from the nutrient broth, or by reducing the amount of agar used.

This anomaly is indicative of the larger challenge of culturing various microbial species, referred to as microbial “unculturability.” This cannot be explained by the use of agar alone or by the substitution of an alternative gelling agent, but rather by the difficulties in consistently recreating on an agar plate the multi-variable environment in which microbes grow naturally. Given such challenges, the risk of shortages, and the vulnerabilities of the agar supply chain, why is it so difficult to find suitable alternatives?

It is not for lack of trying. In some cases, microbiologists have ditched the Petri dish altogether, using microfluidics for manipulating and growing cells. However, these approaches aren’t likely to be adopted at scale as they require less common, less practical, and more expensive devices. So, what about other growth media?

A microfluidics chip enables researchers to manipulate and study individual cells, without the use of agar at all. Credit: Brouzes E. et al. PNAS (2009).

By WWII, scientists had already begun looking at alternative gelling substances for routine use in bacteriology, but concluded that agar was still better as it is both firmer and easier to handle. Today, some specialized microbiology applications use the colloid carrageenan (extracted from the red seaweed Chondrus crispus, or “Irish Moss”), a more transparent and less auto-fluorescent alternative to agar (agar emits its own background fluorescence when excited by light). However, for routine bacteriological use, carrageenan is more difficult to dissolve, requires higher concentrations, can degrade at high temperatures, and forms weaker gels whose surface can be punctured during the plating of cells.

In some cases, alternative gelling agents might provide faster results. Researchers observed that bacterial cellulose and another bacterial polysaccharide, Eladium, support roughly 50 percent faster growth of various bacteria and yeasts (compared to their growth on agar), including higher biomass yields and faster detectable biofilm formation. However, neither substance is yet as cheap and readily available as agar.

Guar gum, a plant colloid, costs less than agar and is better suited for growing thermophilic bacteria, but it is also more difficult to handle, being more viscous and less transparent. The bacterial polysaccharide xanthan is cheaper as well but forms weaker jellies whose surface, as with carrageenan, can be punctured during plating. Other colloids, like alginate (from brown seaweed) and gellan gum (from a bacterium), don’t set based on temperature alone and require additives for gelation. These additives might interfere with microbial growth and make the preparation of those jellies less convenient than agar plates.

Thus, despite much effort, no gelling agent has yet been discovered that possesses all the properties and benefits of agar. Agar continues to be the best all-rounder: versatile, cheap, and established. And, if Gelidium agar should ever run out, and another colloid is not at hand, microbiologists could revert to culinary agar, which, although not as pure and transparent, offers a low-cost alternative to lab-grade agar.

It’s also worth noting that even if alternatives superior to agar were found, scientists are reluctant to abandon established protocols (even when microbiologists do use other jellies, they often still add agar to the mix, for example, to increase the gel strength of the solid media). As agar has been the standard gelling agent in microbiology for around 150 years, an enormous infrastructure of standardized methods, reference values, and quality control procedures has emerged around its specific properties. Switching to a different medium (even a superior one) means results may not be directly comparable to decades of published literature or to other laboratories’ findings.

So it is that agar continues to be the jelly of choice in laboratories around the world. As Humm wrote in 1947: “Today, the most important product obtained from seaweeds is agar, a widely-used commodity but one that is not well known to the general public.” Almost 80 years later, it might be better known, but its importance hasn’t dwindled.



Corrado Nai has a Ph.D. in microbiology and is a science writer with bylines in New Scientist, Smithsonian Magazine, Small Things Considered, Asimov Press, and many more. He is currently writing a graphic novel about Fanny Angelina Hesse and the introduction of agar in the lab called The Dessert that Changed the World, which can be followed and supported on Patreon.

Thanks to Steven Forsythe for sharing a report on the use of agar seaweed in Britain during WWII, Barbara Buchberger at the Robert Koch Institute for pointing out Koch’s use of gelatine for the identification of cholera, and the surviving relative of Fanny Angelina Hesse for sharing a trove of unpublished material.

Cite: Nai, C. “The Origins of Agar.” Asimov Press (2026). DOI: 10.62211/12pq-97ht

1

A full list of these materials can be found at (psfa0134, pg. 9).

2

Japan halted exports to other countries for fear that agar would support the development of biowarfare weapons. A few years before, Nazi Germany had allegedly tested the efficacy of biowarfare attacks with another curious microbe, Serratia marcescens, dubbed “the miracle bacterium.” According to a much-talked-about report by investigative journalist Henry Wickham Steed, titled “Aerial Warfare: Secret German Plans,” members of a secret Luft-Gas-Angriff (Air Gas Attack) department spread S. marcescens in the subterranean train networks of Paris and London and measured its reach armed with Petri dishes and agar plates.

3

It wasn’t the first time that nations at war turned to seaweed. During the First World War, the U.S. relied on the giant kelp seaweed (Macrocystis) to boost production of potash (a fertilizer produced in Germany), gunpowder, and acetone.

4

In 2007, Barrangou et al. demonstrated for the first time the function of CRISPR-Cas as a defensive mechanism of bacteria against bacteriophage attacks using a technique called “plaquing,” which builds upon the technique of “plating” bacteria on agar. Plaques are areas on an agar plate without bacterial growth due to viral attack.

5

The same properties also contributed to Nazi Germany’s strategy against agar’s scarcity, which — besides being supplied from Japan by submarine — relied on large pre-war stocks and on recovery methods to reuse bacteriological agar by autoclaving (heating to around 121°C, 250°F, in a pressurized container for 30 to 60 minutes), thus liquefying and sterilizing the jelly, before purifying it again.

6

Koch borrowed the idea of using gelatine from mycologist Oscar Brefeld, who had used it to grow fungi. Interestingly, Brefeld also employed carrageenan, another seaweed-derived jelly. Because fungi generally favor growing at ambient temperatures, Brefeld might have been less plagued by the melting of growth media than Koch.

7

Julius Petri once wrote: “These shallow dishes are particularly recommended for agar plates…Counting the grown colonies is also easy.” (Translated by Corrado Nai from the original, 1887 German manuscript.)

8

Agarose is used in electrophoresis, chromatography, immunodiffusion assays, cell and tissue culturing, and other applications. It is the electrically neutral, non-sulphated, gelling component of agar. While its market is smaller, it is fundamental for specialized biochemical and analytical protocols.

9

Gigartina stella and Chondrus crispus, for example, were used as the main agarophytes in Britain during WWII, alongside a different colloid, carrageenan (see main text).

10

Washing and drying the bulk raw material to prevent spoilage also isn’t easy. During WWII, volunteers in Britain occasionally dammed natural streams to wash the seaweeds and used hot air from a bakery to dry them. Praising the concerted efforts of volunteers, the UK Ministry of Supply concluded that “all belligerent countries should have a local source” of agar.


