Today's heat waves feel a lot hotter than the heat index implies:
If you looked at the heat index during this summer's sticky heat waves and thought, "It sure feels hotter!" you may be right.
An analysis by climate scientists at the University of California, Berkeley, finds that the apparent temperature, or heat index, calculated by meteorologists and the National Weather Service (NWS) to indicate how hot it feels — taking into account the humidity — underestimates the perceived temperature for the most sweltering days we're now experiencing, sometimes by more than 20 degrees Fahrenheit.
[...] The finding has implications for those who suffer through these heat waves, since the heat index is a measure of how the body deals with heat when the humidity is high, and sweating becomes less effective at cooling us down. Sweating and flushing, where blood is diverted to capillaries close to the skin to dissipate heat, plus shedding clothes, are the main ways humans adapt to hot temperatures.
[...] The heat index was devised in 1979 by a textile physicist, Robert Steadman, who created simple equations to calculate what he called the relative "sultriness" of warm and humid, as well as hot and arid, conditions during the summer. He saw it as a complement to the wind chill factor commonly used in the winter to estimate how cold it feels.
His model took into account how humans regulate their internal temperature to achieve thermal comfort under different external conditions of temperature and humidity — by consciously changing the thickness of clothing or unconsciously adjusting respiration, perspiration and blood flow from the body's core to the skin.
[...] The heat index has since been adopted widely in the United States, including by the NWS, as a useful indicator of people's comfort. But Steadman left the index undefined for many conditions that are now becoming increasingly common. For example, for a relative humidity of 80%, the heat index is not defined for temperatures above 88 F or below 59 F. Today, temperatures routinely rise above 90 F for weeks at a time in some areas, including the Midwest and Southeast.
To fill these gaps in Steadman's chart, meteorologists extrapolated into these regions to get numbers that, Romps said, are correct most of the time but are not based on any understanding of human physiology.
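The extrapolation in common use is the NWS's Rothfusz regression, a polynomial fit to Steadman's original table. As a minimal sketch, here is that regression in Python using the published NWS coefficients; note this omits the NWS's separate low-humidity and low-temperature adjustment terms, so it is only a rough illustration for hot, humid conditions:

```python
def heat_index(temp_f: float, rh_pct: float) -> float:
    """Approximate apparent temperature (deg F) from air temperature
    (deg F) and relative humidity (percent), via the NWS Rothfusz
    regression. Intended for heat index values of roughly 80 F and up;
    the NWS adjustment terms for edge cases are omitted here."""
    T, R = temp_f, rh_pct
    return (-42.379
            + 2.04901523 * T
            + 10.14333127 * R
            - 0.22475541 * T * R
            - 6.83783e-3 * T * T
            - 5.481717e-2 * R * R
            + 1.22874e-3 * T * T * R
            + 8.5282e-4 * T * R * R
            - 1.99e-6 * T * T * R * R)

# A point from the region left undefined by Steadman's table:
# 90 F at 80% relative humidity, above the 88 F ceiling of his
# 80%-RH column.
print(round(heat_index(90, 80)))  # ~113 F
```

This is the sense in which the extrapolated numbers are "correct most of the time": the polynomial reproduces Steadman's chart where it is defined, but outside that range it is a curve fit with no physiological model behind it.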
[...] That and a few other tweaks to Steadman's equations yielded an extended heat index that agrees with the old heat index 99.99% of the time, Romps said, but also accurately represents the apparent temperature in regimes outside those Steadman originally calculated. When Steadman originally published his apparent temperature scale, he considered these regimes too rare to worry about, but high temperatures and humidities are becoming increasingly common because of climate change.
(Score: 0) by Anonymous Coward on Sunday August 21 2022, @12:16AM
It's odd because I grew up in the mid-Atlantic and you almost never got high 90s temperatures with high 90s humidity. You can look up the historical climate data yourself, and of course you'd have temporary excursions (heat waves), but I'm afraid your memory might be failing you if you think the combination of high 90s temperature and humidity was anything but an unusual outlier event. Take a city on the mid-coast with a lot of weather stations (Washington DC) [weather-us.com] and it has average humidity year round in the 70s. Go down to somewhere like Savannah, GA and you'll find the seasonal averages are not much different; in fact, as you go further south, the summers are a tad drier humidity-wise than the late winter or early spring. Washington historically averaged less than 40 days above 90F a year and only one or two above 100F, but that has been increasing these last 10 years.
I'll agree one difference today is that if you claimed it was routinely 95F to 100F with 95% to 100% humidity back in the day, it might not be right, but it would get you on Fox News.