posted by martyb on Thursday November 15 2018, @06:49AM
from the frog-in-a-pot dept.

The San Diego Union-Tribune is one of a few sources reporting: https://www.sandiegouniontribune.com/news/environment/sd-me-climate-study-error-20181113-story.html

Researchers with UC San Diego's Scripps Institution of Oceanography and Princeton University recently walked back scientific findings published last month that showed oceans have been heating up dramatically faster than previously thought as a result of climate change.

The original paper indicated that oceans were warming 60 percent more than outlined by the IPCC, and it was widely reported and remarked upon. The significantly higher warming estimate was quickly challenged by an English mathematician who examined the methodologies used.
The authors promptly confirmed the issue, thanked him for pointing it out, redid their calculations, and submitted corrections to the journal Nature. Per one of the authors, after reviewing and correcting:

"Our error margins are too big now to really weigh in on the precise amount of warming that's going on in the ocean," Keeling said. "We really muffed the error margins."

The article continues:

While papers are peer reviewed before they're published, new findings must always be reproduced before gaining widespread acceptance throughout the scientific community, said Gerald Meehl, a climate scientist at the National Center for Atmospheric Research in Boulder, Colorado.

"This is how the process works," he said. "Every paper that comes out is not bulletproof or infallible. If it doesn't stand up under scrutiny, you review the findings."

The same author indicates that "the ocean is still likely warmer than the estimate used by the IPCC."


Original Submission

 
  • (Score: 2) by maxwell demon on Thursday November 15 2018, @07:13PM (1 child)


    This is not true in general. It is only true if the internal state of the generator is bounded (i.e., for finite state machines only). Most practical RNGs meet this condition.

    Not most. All. Simply because all machines we've ever built are physically finite. And a physically finite machine has only finitely many reliably distinguishable states.
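    To illustrate the pigeonhole argument (my sketch, not part of the comment): a generator with a bounded state, say a toy 8-bit one with a made-up update rule, must revisit a state, and hence start cycling, within at most 256 steps.

```haskell
import Data.List (elemIndex)

-- Toy update rule, invented for illustration: an 8-bit linear
-- congruential step, so there are only 256 possible states.
step :: Int -> Int
step s = (5 * s + 1) `mod` 256

-- Count the distinct states visited before the first repeat.
statesBeforeRepeat :: Int -> Int
statesBeforeRepeat s0 = go [] s0
  where
    go seen s = case elemIndex s seen of
      Just _  -> length seen
      Nothing -> go (seen ++ [s]) (step s)

main :: IO ()
main = print (statesBeforeRepeat 7)  -- can never exceed 256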

    --
    The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 0) by Anonymous Coward on Thursday November 15 2018, @08:36PM


    Not most. All. Simply because all machines we've ever built are physically finite. And a physically finite machine has only finitely many reliably distinguishable states.

    The generators described previously have unbounded state, as in the storage requirements increase without bound as the generators continue to crank out more and more numbers. However, this is different from infinite: the amount of storage is always finite at any given moment, and in practice the algorithms described can be implemented on real computers. When your disk fills up, you just add another disk and crank out more numbers until that one fills up, and so on.

    Both generators are easy to program in a language with unbounded integers (basically anything modern). In both examples given, the storage and time requirements grow in proportion to the logarithm of the number of iterations so far. In practice that means the generators are fast and consume little memory relative to the amount of output they produce, because the logarithm function grows very slowly.
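    To make the logarithmic claim concrete (my illustration, not the commenter's): if the entire state is a counter n, then the storage consumed after n iterations is just the bit length of n.

```haskell
-- Bit length of a non-negative integer: the storage needed for a
-- counter that has reached n. Grows as O(log n).
bitLength :: Integer -> Int
bitLength 0 = 0
bitLength n = 1 + bitLength (n `div` 2)

main :: IO ()
main = mapM_ report [10 ^ 3, 10 ^ 6, 10 ^ 9]
  where
    report n = putStrLn (show n ++ " iterations: "
                         ++ show (bitLength n) ++ " bits of state")
```

    A billion iterations still needs only 30 bits of counter state.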

    Just for fun, I hacked together an implementation of the second one in Haskell.


    -- thue_morse n computes the nth element of the Thue-Morse sequence.
    thue_morse :: Integral a => a -> Bool
    thue_morse 0 = False
    thue_morse n
            | n < 0 = error "domain error"
            | otherwise = odd n /= thue_morse (n `div` 2)

    -- Generate "random" bits by following the Thue Morse sequence.
    generator :: Integral a => Integer -> (Integer, a)
    generator n = (n+1, fromIntegral . fromEnum $ thue_morse n)

    -- Generate "random" Int values on the interval [0, 2^29-1]
    generate_int :: Integer -> (Integer, Int)
    generate_int s0 = (next, val) where
            gen = take 29 $ eval_machine generator s0
            next = fst (last gen)
            val = sum $ zipWith (*) (map snd gen) (map (2 ^) [0..])

    -- Evaluate a state machine for a given initial state by returning a list of
    -- (state, value) pairs, where each entry in the list is the result of the
    -- machine applied to the previous state. The undefined in the seed pair is
    -- discarded by tail and never forced, so it is safe.
    eval_machine :: ( a -> (a, b) ) -> a -> [(a, b)]
    eval_machine m s0 = tail $ iterate (m . fst) (s0, undefined)

    -- Print the sequence of Int values produced by generate_int. The resulting
    -- sequence has no period.
    main :: IO ()
    main = mapM_ (print . snd) $ eval_machine generate_int 0
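    A quick sanity check (my addition): the Thue-Morse sequence is known to begin 0,1,1,0,1,0,0,1, and the definition above reproduces that. The function is repeated here so the snippet runs standalone.

```haskell
-- thue_morse, copied from the implementation above so this check
-- compiles on its own.
thue_morse :: Integral a => a -> Bool
thue_morse 0 = False
thue_morse n
    | n < 0 = error "domain error"
    | otherwise = odd n /= thue_morse (n `div` 2)

main :: IO ()
main = print (map (fromEnum . thue_morse) [0 .. 7])
-- prints [0,1,1,0,1,0,0,1]
```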