posted by martyb on Monday February 09 2015, @09:45PM   Printer-friendly
from the heated-discussion dept.

The Telegraph reports "The fiddling with temperature data is the biggest science scandal ever"

From the article:

"When future generations look back on the global-warming scare of the past 30 years, nothing will shock them more than the extent to which the official temperature records – on which the entire panic ultimately rested – were systematically “adjusted” to show the Earth as having warmed much more than the actual data justified."

Cherry-picking results may well be more common in science than we would like, but the story points to evidence that some climate data may have been falsified to fit the theory.

Sure, it's clickbait, but we've recently discussed cases where science and scientific consensus have gotten it so very wrong. Can we trust the science if we can't trust the data?

  • (Score: 5, Informative) by hubie on Tuesday February 10 2015, @12:40AM

    by hubie (1068) Subscriber Badge on Tuesday February 10 2015, @12:40AM (#142941) Journal

    You've directly addressed my point. When I look at the output of my thermocouple, it gives me something in volts. No matter how many thermocouples I purchase, they're going to keep giving me volts. I have to do something to convert that to a temperature; I have to apply some mathematical formula to do the conversion. Now, if I go to the hardware store and buy a multimeter that reads thermocouples, it will tell me a temperature converted to my favorite degree scale. However, if I plug in a different thermocouple, I'll find it gives me a slightly different temperature reading, because my multimeter has some standard conversion programmed into it that doesn't accurately represent the thermocouple I want to use. If you can get by with the accuracy of a reading out of a multimeter, then the standard conversion is plenty good. If you are trying to make very accurate measurements, then you'll apply a proper calibration curve to the instrument you have.
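A minimal sketch of the voltage-to-temperature step described above, assuming a type-K thermocouple's roughly linear sensitivity of about 41 µV/°C near room temperature. A real conversion would use the NIST inverse polynomials for the thermocouple type; the sensitivity and offset parameters here stand in for a proper per-instrument calibration curve.

```python
def volts_to_celsius(v_volts, sensitivity_uv_per_c=41.0, offset_c=0.0):
    """Convert a measured thermocouple EMF to degrees Celsius.

    sensitivity_uv_per_c and offset_c are placeholders for a proper
    per-instrument calibration; ~41 uV/degC is only a first-order
    approximation for type K near room temperature.
    """
    microvolts = v_volts * 1e6
    return microvolts / sensitivity_uv_per_c + offset_c

# 1.025 mV at ~41 uV/degC corresponds to roughly 25 degC
print(round(volts_to_celsius(1.025e-3), 1))
```

Swapping in a per-instrument calibration means replacing the default parameters (or the whole linear formula) with coefficients fitted against a reference thermometer.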

  • (Score: 2, Redundant) by The Mighty Buzzard on Tuesday February 10 2015, @01:43AM

    by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday February 10 2015, @01:43AM (#142954) Homepage Journal

    Which is all fine and good for calibrating your thermocouple reader. Myself, I'd expect this to be done on site so the readings were accurate. I have never heard of leaving one uncalibrated and adjusting the numbers later. Not in any of the fields I've worked in, and it has come up in several of them. It is not, however, fine and good to come along after accurate readings have been taken for a station and adjust them in any way whatsoever, which is what was alleged.

    --
    My rights don't end where your fear begins.
    • (Score: 5, Insightful) by c0lo on Tuesday February 10 2015, @02:32AM

      by c0lo (156) Subscriber Badge on Tuesday February 10 2015, @02:32AM (#142966) Journal

      I have never heard of leaving one uncalibrated and adjusting the numbers later.

      I guess you didn't live in South America during the '60s either. Nor did you go out to change the thermometer in the Arctic, in a submarine-infested North Sea. What's more, I don't think anyone used thermocouples and digital multimeters to read the temperature in those days.
      The most you can do now is ask that those measurements be discarded.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 3, Informative) by moondrake on Tuesday February 10 2015, @10:09AM

      by moondrake (2658) on Tuesday February 10 2015, @10:09AM (#143065)

      The adjustments are needed in many cases. It depends on what you are doing with the numbers. For example, suppose you want to calculate a global average, but your temperature measurements are taken at 10:00 AM in some countries and at 15:00 in others, and, depending on daylight saving time or whatever, the observation time could even change over the years. So you need to somehow correct those numbers. You could estimate the temperature at noon based on an estimate of the temperature fluctuations at that station. Or you could estimate the maximum daytime temperature for all stations.
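The time-of-observation correction described above can be sketched with a toy diurnal model. This assumes a simple sinusoidal daily cycle peaking at 15:00; a real adjustment would use an empirical diurnal climatology for each station, so the amplitude and peak hour here are illustrative assumptions.

```python
import math

def adjust_to_noon(temp_c, hour_observed, diurnal_amplitude_c=5.0,
                   peak_hour=15.0):
    """Estimate the noon temperature from a reading taken at another hour,
    assuming a sinusoidal diurnal cycle (toy model, not a real climatology)."""
    def cycle(hour):
        # Deviation from the daily mean at a given hour.
        return diurnal_amplitude_c * math.cos(
            2 * math.pi * (hour - peak_hour) / 24.0)
    daily_mean = temp_c - cycle(hour_observed)
    return daily_mean + cycle(12.0)

# A 20 degC reading taken at 10:00 maps to a somewhat warmer noon estimate,
# since noon is closer to the assumed afternoon peak.
print(round(adjust_to_noon(20.0, 10.0), 2))
```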

      Then there are location differences. Perhaps those guys in Paraguay put their thermometer at 200 m above sea level, but in Peru they were at 2000 m. See what I am getting at?
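The elevation point can be illustrated with the standard atmospheric lapse rate of roughly 6.5 °C per 1000 m. Real homogenization would use a locally fitted lapse rate, but even this crude sketch shows why raw readings from a 200 m station and a 2000 m station are not directly comparable.

```python
# Standard atmospheric lapse rate: roughly 6.5 degC per 1000 m of elevation.
LAPSE_RATE_C_PER_M = 6.5 / 1000.0

def to_sea_level(temp_c, elevation_m):
    """Reduce a station temperature to an equivalent sea-level value
    using the standard lapse rate (a simplification for illustration)."""
    return temp_c + LAPSE_RATE_C_PER_M * elevation_m

# 25 degC at 200 m and 13 degC at 2000 m reduce to nearly the same
# sea-level temperature, so the stations agree better than the raw
# numbers suggest.
print(round(to_sea_level(25.0, 200.0), 1))   # 26.3
print(round(to_sea_level(13.0, 2000.0), 1))  # 26.0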

      • (Score: 1, Redundant) by The Mighty Buzzard on Tuesday February 10 2015, @12:18PM

        by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday February 10 2015, @12:18PM (#143098) Homepage Journal

        Yep, but that's an excuse, not a good reason. You don't guess at numbers in science beyond estimating one significant decimal place if possible. If you do, it's no longer science.

        --
        My rights don't end where your fear begins.
        • (Score: 5, Informative) by moondrake on Tuesday February 10 2015, @02:08PM

          by moondrake (2658) on Tuesday February 10 2015, @02:08PM (#143125)

          I am sorry, but I think that is a rather short-sighted comment. Calibration is quite prevalent in science, and though I would agree it is often a nuisance and a source of errors, it is also impossible to do many things without it.

          Contrary to what you may believe, most instruments (including thermometers) actually have problems taking measurements in standardized ways, and there are many sources of drift, variation, etc., that can and need to be accounted for.

          The example I quickly gave was not even about correcting for known issues with a device, but simply a consequence of the need for standardization when you are pooling data. It has nothing at all to do with guessing, so I am rather annoyed that you would suggest something so silly in this discussion.

          If you just treated all numbers as-is, the situation would often be worse, as you would introduce variation (and possibly bias) into your data. Accounting for this kind of thing, either by taking the measurement in a standardized way or by correcting for deviations from standard conditions, is exactly what distinguishes a scientist from a person who just reads the thermometer output. I am the first to admit that correcting afterwards is always worse than doing the measurement better in the first place, but in the real world, that is not always possible.
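The bias point above can be made concrete with a toy example: two stations observe the same true temperature, but one instrument has a documented calibration offset. Pooling the raw readings biases the average; applying the known correction removes it. The station names and offset values here are invented for illustration.

```python
# Two stations observing the same true temperature of 20 degC.
true_temp = 20.0
readings = {"station_a": true_temp + 0.0,   # well-calibrated instrument
            "station_b": true_temp + 1.5}   # documented +1.5 degC offset
known_offsets = {"station_a": 0.0, "station_b": 1.5}

# Averaging the raw readings is biased high by half the offset.
raw_mean = sum(readings.values()) / len(readings)

# Subtracting each instrument's documented offset recovers the truth.
corrected_mean = sum(v - known_offsets[k]
                     for k, v in readings.items()) / len(readings)

print(raw_mean)        # 20.75 -- biased high
print(corrected_mean)  # 20.0  -- offset removed
```

The point is that the "adjustment" here is not guesswork: it applies a known, documented property of the instrument, which is exactly what calibration records exist for.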

          And, to be frank, this is also why society needs scientists who do their work correctly: data are rarely 100% made up; it is exactly in these kinds of calibrations that a scientist might stray from the "right path". We are still figuring out protocols and codes of conduct to prevent the latter (it is far more of a problem now than in the past, especially in politically sensitive fields). But mindlessly claiming that all science that calibrates data is wrong is simply unreasonable.

          I am sorry if this does not fit your personal view of how science should be.