
posted by martyb on Sunday March 14 2021, @12:29AM   Printer-friendly

Global heating pushes tropical regions towards limits of human livability:

Humans’ ability to regulate their body heat is dependent upon the temperature and humidity of the surrounding air. We have a core body temperature that stays relatively stable at 37C (98.6F), while our skin is cooler to allow heat to flow away from the inner body. But should the wet-bulb temperature – a measure of air temperature and humidity – pass 35C, high skin temperature means the body is unable to cool itself, with potentially deadly consequences.
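The wet-bulb threshold described above can be estimated from ordinary weather-station readings. Here is a minimal sketch using Stull's empirical approximation (accurate to roughly 1C for relative humidities of about 5–99%; the function name and sample values are illustrative, not from the study):

```python
import math

def wet_bulb_stull(temp_c: float, rh_pct: float) -> float:
    """Approximate wet-bulb temperature (deg C) from air temperature
    (deg C) and relative humidity (%) using Stull's empirical fit."""
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(temp_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# A 40C day at 75% relative humidity already crosses the 35C limit.
print(round(wet_bulb_stull(40.0, 75.0), 1))
```

Note how strongly humidity matters: 40C air at 75% humidity lands just above the 35C survivability limit, while the same air temperature at low humidity stays well below it.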

“If it is too humid our bodies can’t cool off by evaporating sweat – this is why humidity is important when we consider livability in a hot place,” said Yi Zhang, a Princeton University researcher who led the new study, published in Nature Geoscience. “High body core temperatures are dangerous or even lethal.”

The research team looked at various historical data and simulations to determine how wet-bulb temperature extremes will change as the planet continues to heat up, discovering that these extremes in the tropics increase at around the same rate as the tropical mean temperature.

[...] Dangerous conditions in the tropics will unfold even before the 1.5C threshold, however, with the paper warning that 1C of extreme wet-bulb temperature increase “could have adverse health impact equivalent to that of several degrees of temperature increase”. The world has already warmed by around 1.1C on average due to human activity and although governments vowed in the Paris climate agreement to hold temperatures to 1.5C, scientists have warned this limit could be breached within a decade.

This has potentially dire implications for a huge swathe of humanity. Around 40% of the world’s population currently lives in tropical countries, a proportion set to expand to half of the global population by 2050 due to the large share of young people in the region. The Princeton research was centered on the latitudes between 20 degrees north, a line that cuts through Mexico, Libya and India, and 20 degrees south, which passes through Brazil, Madagascar and the northern reaches of Australia.

Journal Reference:
Yi Zhang, Isaac Held, Stephan Fueglistaler. Projections of tropical heat stress constrained by atmospheric dynamics, Nature Geoscience (DOI: 10.1038/s41561-021-00695-3)


Original Submission

 
  • (Score: 1) by khallow (3766) Subscriber Badge on Monday March 15 2021, @07:44PM (#1124551) Journal
    The flaws remain glaring:

    To take one example, Hausfather points to a famous 1988 model overseen by then–NASA scientist James Hansen. The model predicted that if climate pollution kept rising at an even pace, average global temperatures today would be approximately 0.3°C warmer than they actually are. That has helped make Hansen’s work a popular target for critics of climate science.

    Hausfather found that most of this overshoot was caused not by a flaw in the model’s basic physics, however. Instead, it arose because pollution levels changed in ways Hansen didn’t predict. For example, the model overestimated the amount of methane—a potent greenhouse gas—that would go into the atmosphere in future years. It also didn’t foresee a precipitous drop in planet-warming refrigerants like some Freon compounds after international regulations from the Montreal Protocol became effective in 1989.

    When Hausfather’s team set pollution inputs in Hansen’s model to correspond to actual historical levels, its projected temperature increases lined up with observed temperatures.

    In other words, when we take into account the actual sinks of greenhouse gases (what they term "pollution levels changed in ways Hansen didn’t predict"), the models are relatively accurate. So when are these model-makers going to put accurate "pollution levels" into their models? Keep in mind that these models claimed a given level of emissions would result in a given level of warming. Now they're walking back those claims by redoing the calculation with existing CO2 (and other greenhouse gas) concentrations rather than existing CO2 emissions.

    This game gets played over and over again, with claimed warming consistently overshooting actual warming. The physics is solid, but there are a lot of inputs (like pollution levels) that can be and are gamed.

    Why this matters: substantial greenhouse gas sinks mean substantial negative feedbacks are likely being ignored in climate models, and that results in excessively high climate sensitivity estimates.

    Moving on, let's consider the second link you posted:

    Climate models have to be tested to find out if they work. We can’t wait for 30 years to see if a model is any good or not; models are tested against the past, against what we know happened. If a model can correctly predict trends from a starting point somewhere in the past, we could expect it to predict with reasonable certainty what might happen in the future.

    So all models are first tested in a process called Hindcasting. The models used to predict future global warming can accurately map past climate changes. If they get the past right, there is no reason to think their predictions would be wrong. Testing models against the existing instrumental record suggested CO2 must cause global warming, because the models could not simulate what had already happened unless the extra CO2 was added to the model. All other known forcings are adequate in explaining temperature variations prior to the rise in temperature over the last thirty years, while none of them are capable of explaining the rise in the past thirty years. CO2 does explain that rise, and explains it completely without any need for additional, as yet unknown forcings.

    Notice the sentence "If they get the past right, there is no reason to think their predictions would be wrong." There is plenty of reason to think a model that gets the past perfectly right can spectacularly self-destruct on the future, because it's not hard to make that happen (a classic example is approximating nearly constant data, say a short span of human population, with a polynomial: it fails as soon as you leave the range of approximation). Extrapolation is notoriously hard, and this glib dismissal of that difficulty should raise red flags. Second, notice that they talk about the past 30 years and the next 30 years. That's barely the scale of climate, which is typically defined on time scales of 30 years or more.
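    The polynomial pitfall is easy to demonstrate numerically (a toy sketch with made-up data; the seed, degree, and noise level are arbitrary choices for illustration):

```python
import numpy as np

# Nearly constant "historical" data, like a short span of human population.
rng = np.random.default_rng(0)
x_fit = np.linspace(-1.0, 1.0, 30)             # 30 points of "history"
y_fit = 100.0 + 0.5 * rng.standard_normal(30)  # constant plus small noise

# A high-degree polynomial hindcasts the training window almost perfectly...
coeffs = np.polyfit(x_fit, y_fit, deg=9)
in_sample_err = np.max(np.abs(np.polyval(coeffs, x_fit) - y_fit))

# ...but the same fit diverges wildly once evaluated outside that window.
extrapolated = np.polyval(coeffs, 3.0)
print(in_sample_err, extrapolated)
```

    The in-sample error stays at the noise level while the extrapolated value lands nowhere near 100: a model can hindcast well and still be useless for prediction.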

    The elephant in this room is the amazing disconnect [soylentnews.org] between human emissions of CO2, which have gone up substantially, and the concentration of CO2 in Earth's atmosphere. TL;DR: the apologists are glossing over greenhouse gas sinks. It doesn't matter how shiny and solid your radiative models are when the problem is that you're greatly overestimating greenhouse gas concentrations.
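    The size of that disconnect can be ballparked with round public figures (the numbers below are approximate, order-of-magnitude values chosen for illustration, not a dataset):

```python
# Rough annual CO2 budget, illustrative round numbers (circa late 2010s):
emissions_gtc_per_yr = 10.0   # fossil + land-use emissions, GtC/yr (approx.)
observed_rise_ppm_yr = 2.4    # measured atmospheric CO2 growth, ppm/yr (approx.)

GTC_PER_PPM = 2.124           # gigatonnes of carbon per 1 ppm of atmospheric CO2

rise_gtc_per_yr = observed_rise_ppm_yr * GTC_PER_PPM
airborne_fraction = rise_gtc_per_yr / emissions_gtc_per_yr
print(f"airborne fraction ~ {airborne_fraction:.2f}")
```

    Only about half of emitted carbon shows up as an atmospheric concentration increase; the balance is absorbed by ocean and land sinks, which is exactly the gap the comment points at.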