Jeremy P. Shapiro, a professor of psychology at Case Western Reserve University, has an article on The Conversation about one of the main cognitive errors at the root of science denial: dichotomous thinking, where entire spectra of possibilities are collapsed into dichotomies, and the division is usually highly skewed. Either something is perfect or it is a complete failure; either we have perfect knowledge of something or we know nothing.
Currently, there are three important issues on which there is scientific consensus but controversy among laypeople: climate change, biological evolution and childhood vaccination. On all three issues, prominent members of the Trump administration, including the president, have lined up against the conclusions of research.
This widespread rejection of scientific findings presents a perplexing puzzle to those of us who value an evidence-based approach to knowledge and policy.
Yet many science deniers do cite empirical evidence. The problem is that they do so in invalid, misleading ways. Psychological research illuminates these ways.
[...] In my view, science deniers misapply the concept of “proof.”
Proof exists in mathematics and logic but not in science. Research builds knowledge in progressive increments. As empirical evidence accumulates, there are more and more accurate approximations of ultimate truth but no final end point to the process. Deniers exploit the distinction between proof and compelling evidence by categorizing empirically well-supported ideas as “unproven.” Such statements are technically correct but extremely misleading, because there are no proven ideas in science, and evidence-based ideas are the best guides for action we have.
I have observed deniers use a three-step strategy to mislead the scientifically unsophisticated. First, they cite areas of uncertainty or controversy, no matter how minor, within the body of research that invalidates their desired course of action. Second, they categorize the overall scientific status of that body of research as uncertain and controversial. Finally, deniers advocate proceeding as if the research did not exist.
Dr. David "Orac" Gorski has further commentary on the article. Basically, science denialism works by exploiting the very human need for absolute certainty, which science can never truly provide.
(Score: 0) by Anonymous Coward on Thursday November 14 2019, @08:09PM (6 children)
Rather than try to read the plot, why not look at the full 1990 report, chapter 1 [www.ipcc.ch]. On page 7, they report the then-current CO2 concentration as 353 ppm with an annual increase of 1.8 ppm. (The scan is surely missing the decimal point.) An unchanged linear trend would result in about 407 ppm in 2020.
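Quick sanity check of that extrapolation (my own sketch, using only the two figures quoted above):

```python
# Linear extrapolation of atmospheric CO2 from the 1990 IPCC figures
# (353 ppm baseline, +1.8 ppm/yr, from page 7 of chapter 1).
BASE_YEAR, BASE_PPM, RATE = 1990, 353.0, 1.8

def co2_linear(year):
    """Concentration in `year` if the 1990 growth rate never changes."""
    return BASE_PPM + (year - BASE_YEAR) * RATE

print(co2_linear(2020))  # -> 407.0
```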
Also: a full treatment should consider all greenhouse gases. The article I referenced specifically cited methane emissions as being different from the default 1990 scenario.
(Score: 0) by Anonymous Coward on Friday November 15 2019, @06:13AM (5 children)
Thanks for the data! I think this section from the paper is critical (leaving the white space for legibility):
And it doesn't stop there. They also chose to try a similar 'trick' to the one the site you linked uses, to argue for using a reduced-emissions model, in spite of the fact that emissions have increased dramatically since 1990! So let's get back to that 'trick.' The graph you mentioned is not using the same data. The graph I mentioned, table 4 (page XVII/17) in the paper [archive.ipcc.ch], specifically assumes constant emissions, not increasing emissions. It was used to demonstrate their assumption that even emissions fixed at 1990 levels would result in an increasing trend of atmospheric CO2 levels. The caption for their graph states:
The IPCC undoubtedly believed that a given level of emissions would result in much higher atmospheric CO2 levels than actually occurred.
Ultimately it's self-evident that after extensive "massaging" and arbitrary widening of ranges you can squeeze reality into pretty much any prediction. I'll wrap up here with one interesting thing emphasizing how much that paper distorted the values. After all of their 'modifications', they observe that "if anthropogenic forcings had been held at 1989 levels over the past two decades the resulting [trend would be] 0.10–0.48C". So the bottom end of their trend would be 0.05C/decade. This poses a major red flag. The IPCC paper notes (page XII/12) that "mean surface air temperature has increased by 0.3C to 0.6C over the last 100 years". Notably, 0.05C/decade amounts to only 0.5C/century, which falls within that observed range. In other words, the paper "tweaked" the predictions so hard that fixed 1989 emission levels would predict warming within the range observed since the 1800s, in spite of a hundred years of sharply increasing emissions in the interim. Doesn't that strike you as dodgy?
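For what it's worth, the arithmetic in that comparison is easy to check (my own sketch, using only the numbers quoted above):

```python
# Adjusted model: 0.10-0.48 C of warming over two decades at fixed 1989 forcings.
trend_low_2dec, trend_high_2dec = 0.10, 0.48
per_decade_low = trend_low_2dec / 2     # low end: 0.05 C/decade
per_century_low = per_decade_low * 10   # equivalent: 0.5 C/century

# IPCC 1990 (page XII/12): 0.3-0.6 C of observed warming over the previous century.
observed_low, observed_high = 0.3, 0.6

# The low end of the adjusted prediction sits inside the range already
# observed over a century of sharply increasing emissions.
print(per_century_low, observed_low <= per_century_low <= observed_high)  # -> 0.5 True
```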
(Score: 0) by Anonymous Coward on Friday November 15 2019, @09:30AM (4 children)
Furthermore, the article I gave you specifically cites methane and not CO2. Have you tried comparing the methane scenarios (page 13/xix of the policymaker's summary) with what actually happened (page 20/232 of this article [springer.com])?
Note: science is hard. Fully understanding a prediction from 1990 would require understanding everything that goes into it. However, it sometimes takes less than fifteen minutes to verify specific claims. To reiterate, this is the one (from the article assessing the 1990 predictions) I checked:
About the reassessment article: I should emphasize the key points:
It would be interesting to repeat this analysis up to 2020, but I have neither the time nor the expertise.
(Score: 0) by Anonymous Coward on Friday November 15 2019, @02:40PM (3 children)
UNIFICATION! :)
We should definitely make sure to double-check each other's math. In double-checking both graphs (as you reasonably proposed), I just noticed a mistake you made. You stated that,
The base date was 1990, but you accidentally set it as 2000. The actual atmospheric concentration it would give if we continued at 1990 levels is almost exactly what I ballparked: 353 + (2019-1990) * 1.8 = 405.2ppm!
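The base-year mix-up is easy to reproduce (a sketch of the same linear extrapolation; the 2000 base year is the accidental setting described above):

```python
def co2_linear(year, base_year):
    """Linear extrapolation from the 1990 report: 353 ppm + 1.8 ppm/yr."""
    return 353.0 + (year - base_year) * 1.8

print(round(co2_linear(2019, base_year=1990), 1))  # -> 405.2 (intended calculation)
print(round(co2_linear(2019, base_year=2000), 1))  # -> 387.2 (result of the offset error)
```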
So I think this resolves the 'unification' of our two different graphs! I think it also clarifies beyond doubt that the IPCC was indeed substantially overestimating what increasing emissions would do to the atmospheric concentration of CO2: we have dramatically increased our emissions since 1990, yet atmospheric CO2 is only about 407 ppm. The net result is that they, now I think without doubt, substantially underestimated our CO2 emissions under the "business as usual" scenario, while simultaneously substantially overestimating the impact of those increased emissions, believing that a much smaller amount of CO2 would result in a much higher atmospheric concentration. So apparently two wrongs do make a right sometimes!
So yeah, now to methane. I suppose the question is: should we even start? In particular, would our having higher/lower methane emissions justify swapping to a more carbon-friendly model (than the business-as-usual one), given the above conclusion (which I think/hope we can now agree on)?
(Score: 0) by Anonymous Coward on Friday November 15 2019, @07:01PM (2 children)
But I think you're still making considerable logical leaps, inferring properties of the modelling that could instead be checked. And I think you're assuming that one thing being large implies other things must be large, without properly establishing the connection. For instance, suppose we take the crude model that increases in the atmospheric CO2 concentration are proportional to total human emissions. This implies that a significant increase in emissions will produce a significant increase in the growth rate of the CO2 concentration. But it takes time for this increased growth rate to become visible in the CO2 level itself. One can see curvature in the Keeling curve [wikipedia.org], but the effect over thirty years is not huge. (Of course, the extrapolation can be scary.)
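A toy illustration of that lag (entirely my own sketch, not the IPCC's model): treat each year's rise in concentration as proportional to that year's emissions, and compare emissions frozen at 1990 levels with emissions growing ~1.8%/yr.

```python
def trajectory(annual_growth, years=30, c0=353.0, first_step=1.8):
    """Concentration path when the yearly increment grows by `annual_growth`."""
    c, step, path = c0, first_step, []
    for _ in range(years):
        c += step
        step *= 1 + annual_growth
        path.append(round(c, 1))
    return path

flat = trajectory(0.0)      # emissions frozen at 1990 levels
rising = trajectory(0.018)  # emissions growing ~1.8%/yr
# After 30 years the two paths differ by only ~17 ppm, even though the
# final-year emission rates differ by ~70% -- the curvature shows up slowly.
print(flat[-1], rising[-1])  # -> 407.0 423.8
```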
(Score: 0) by Anonymous Coward on Friday November 15 2019, @08:45PM (1 child)
It's all good. In the process I feel I've learned a great amount, and isn't that what this is all about? Though I have to say I don't think I'm making any meaningfully complex inferences here. Let me bring things down to the fundamentals to see if/where we disagree:
1) Our current atmospheric CO2 levels are around 407ppm.
2) We have increased CO2 emissions by around 60% since 1990.
3) The IPCC offered data predicting what atmospheric CO2 levels would look like if CO2 emissions remained permanently at 1990 levels.
4) This resulted in a predicted atmospheric CO2 level of 405 ppm.
Assertion A) The IPCC substantially overestimated the impact of human emissions on overall atmospheric CO2 levels. That a 60% increase in emissions makes nearly no relative difference can only be explained by an error in their model.
5) The example emissions in the business-as-usual scenario indicated a doubling in CO2 emissions from ~1985 to ~2040.
6) Our actual CO2 emissions have already increased by more than 82% since 1985.
7) Our emissions will double (relative to 1985 levels) long before 2040.
Assertion B) The IPCC underestimated the amount of CO2 we would produce under the business as usual scenario.
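Points 5-7 can be checked with the same back-of-the-envelope approach (a sketch; the 82% figure is from point 6, and I'm assuming roughly smooth exponential growth):

```python
import math

# Point 6: emissions up ~82% between 1985 and 2019 (34 years).
years = 2019 - 1985
annual_factor = 1.82 ** (1 / years)               # average growth per year
doubling_time = math.log(2) / math.log(annual_factor)

print(round(doubling_time, 1))       # -> 39.4 years from 1985 to a doubling
print(1985 + round(doubling_time))   # -> 2024, well before the scenario's ~2040
```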
---
Writing things out like this makes it all so much clearer. Makes one wonder if longform language is really the right way to present research! Formalize it enough and you could even have automated logical validation of papers.
Getting back to the point (and longform...), my one and only argument here is that the 1990 IPCC predictions do not match reality. The way we got into the more complex discussion was in debating whether it would be appropriate to change their predictions to make them more closely fit reality. The primary justification given for this suggestion is the claim that actual emission levels were lower than the IPCC predicted, and so it would be appropriate to use a lower-emissions scenario. However, that claim does not seem to be valid (Assertion B). The other, more refined claim is that because our emissions had a smaller atmospheric impact than expected, we should use a lower emissions tier. This argument may be more refined but is, in my opinion, even less well supported: it tries to take a major error in the predictions (Assertion A) and spin it into a positive. In any case, the emissions scenarios were clearly defined in terms of emissions, not atmospheric concentrations.
(Score: 0) by Anonymous Coward on Sunday November 17 2019, @06:01PM
This concerned a discrepancy between an observed 30-year warming of 0.36°C — which you then accepted would be better represented by the difference of decadal averages (being less affected by year-to-year fluctuations), 0.54°C — and the predicted range of 0.6–1.5°C. I agree that this is slightly outside the predicted range, but I pointed you to an article from a few years ago where a researcher did care about this and investigated the reasons for the discrepancy. This was attributed in part to different-than-predicted levels of forcing from greenhouse gases. As the IPCC can't predict things like economic crashes or attempts to control emissions, their predictions about warming surely must be contingent on the level of anthropogenic emissions. In addition, on short timescales there can be significant effects from unpredictable natural events such as the eruption of Mount Pinatubo.
Now, if one wants to go rejecting a particular hypothesis, it is usually best to have an alternative. I think the anthropogenic global warming hypothesis is holding up much better than the null hypothesis of random temperature fluctuations. Furthermore, science is an iterative process. The IPCC has issued several reports, and a proper analysis of whether they are junk science would test, for instance, whether their predictions are improving (which would happen as a result of better understanding and better data) or not (which could happen if the report were purely politically motivated).
Coming back to the issue of emissions: if you accept that measured CO2 concentrations have been below the 1990 business-as-usual scenario, then this supports the explanation that the prediction of warming was too high because it assumed higher atmospheric levels of greenhouse gases than actually occurred. Now, perhaps the predicted relationship between CO2 concentrations and CO2 emissions was off. But it appears this is an area of active research: in particular, ocean uptake of CO2 is a significant effect and there is a significant effort (e.g. this team [noaa.gov]) to understand it. I think it's a bit hyperbolic to say "nobody seems to care about this". Unfortunately I don't have the time, but it would be interesting to see how the IPCC description of the carbon cycle has evolved since the first report.
Finally, I want to emphasize that climate scientists do serious work and it's a bit ugly to poke and prod at their data at a very superficial level. I would never reject an entire field without hearing the response from scientists in that field. And they've had to deal with an enormous level of politicization, thanks to the trillions of dollars' worth of assets whose value is threatened by the notion that using those assets is harmful and should be eliminated. (It's been pointed out [thenation.com] that this could be compared in scale to abolitionists asking the U.S. South to give up its slaves, which led to the U.S. Civil War.) Despite the enormously wealthy interests working against climate scientists, I think mainstream climate science has held up and the critiques seem either obviously wrong or concern relatively minor details. Actually, that's the subject of the article we are supposed to be discussing: