SoylentNews is people

posted by Fnord666 on Tuesday May 29 2018, @06:07AM   Printer-friendly
from the flood-insurance-FTW dept.

Common Dreams reports

A Maryland city was devastated [May 27] after 6 inches of heavy rain caused a downtown flash flood. Major damage is reported and many cars have been swept away.

Ellicott City was still recovering from a flash flood two years ago that killed two and forced the historic city to rebuild much of its Main Street. Residents said Sunday's flood seemed even worse than the storm in July 2016--which was called an extremely rare "one-in-1,000 year event" and cost the city tens of millions of dollars in damages.

Additional information at:
The Baltimore Sun
The Washington Post
USA Today


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by istartedi on Tuesday May 29 2018, @08:27PM (5 children)

    by istartedi (123) on Tuesday May 29 2018, @08:27PM (#685851) Journal

    The town appears to have had many floods [wikipedia.org]:

    "Ellicott City has had major devastating floods in 1817, 1837, 1868, 1901, 1917, 1923, 1938, 1942, 1952, 1956, 1972 (Hurricane Agnes), 1975 (Hurricane Eloise), 1989, 2011, 2016, and 2018"

    --
    Appended to the end of comments you post. Max: 120 chars.
  • (Score: 2, Informative) by Anonymous Coward on Tuesday May 29 2018, @09:15PM (4 children)

    by Anonymous Coward on Tuesday May 29 2018, @09:15PM (#685899)

    From the other post [soylentnews.org]:

    At time durations of 5 minutes to 3 hours, the observed rainfall at the Ellicott City gauge has a probability of occurrence of less than or equal to 1/1000. This does not mean this extreme rainfall will only occur once in a thousand years. However, it is a rare and unlikely event. Based on statistical analysis, there is a 0.1% chance or less of this rainfall occurring in these time durations and location in any given year.

    So you can see that (1) it is about rainfall, and (2) they distinguish between "extreme events happening once in a thousand years" and "the event having a 1/1000 chance of happening in any given year based on their statistical analysis". I think they can't tell us the probability of it happening once, twice, or three times in a thousand years because those calculations were not part of the original design, and their numbers come out of a byzantine procedure that nobody fully understands. However, the simplest analysis would be (in R):

    n = 1000                                    # years
    x = 1:5                                     # number of occurrences
    y = dbinom(x, n, 1/n)                       # binomial: P(exactly x events in n years)
    data.frame(Ntimes = x, Prob = round(y, 4))

    Resulting in:

      Ntimes   Prob
    1      1 0.3681
    2      2 0.1840
    3      3 0.0613
    4      4 0.0153
    5      5 0.0030

    So there is a 36% chance it happens exactly once, an 18% chance it happens exactly twice in a thousand years, etc. If you sum over all the nonzero counts you get the probability of it happening at least once in the thousand years: 1 - (1 - 1/1000)^1000 ~ 0.63 (the binomial complement of zero occurrences).
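    A quick sanity check on that ~0.63 figure (plain R, same binomial assumptions as the snippet above; the 1 - 1/e comparison is just the large-n limit):

```r
# P(at least one 1/1000 event in 1000 years) = 1 - P(zero events)
p_zero <- dbinom(0, 1000, 1/1000)     # (1 - 1/1000)^1000
1 - p_zero                            # ~0.6323
sum(dbinom(1:1000, 1000, 1/1000))     # same thing, summed term by term
1 - exp(-1)                           # large-n limit, ~0.6321
```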

    In short, this whole "1,000 year flood" concept is going to be very confusing to the uninitiated even in the simplest case, and the real case is literally too complicated for anyone alive to determine.

    • (Score: 1) by istartedi on Wednesday May 30 2018, @08:01AM (3 children)

      by istartedi (123) on Wednesday May 30 2018, @08:01AM (#686159) Journal

      I get that "1000 year flood" means a 1/1000 chance each year, and your statistical approach makes sense. But we only have 200 years of observation, during which 16 floods occurred. Their 1/1000 is for the rain, not the flood itself; but let's set that aside and assume you can't have the flood without rain that meets the criteria.

      Given 16 floods in 200 years, let's also extrapolate out and assume that it was flooding like that before the town was built. Perhaps that's a bad assumption too, but we just don't have the data; it's all I've got. It makes me want to plug N=80 into your program. The result would be non-zero, but very small. In fact, yes, the nature of statistics is that the odds of this could be *even lower* than 1/1000 and we could still get a cluster of 16 floods in the last 200 years; but it seems rather disingenuous to call it a "1000 year event" in the media when it happens that often.

      Maybe it's all lies. Maybe it's damned lies. Maybe it's... well, you know.

      • (Score: 0) by Anonymous Coward on Wednesday May 30 2018, @04:15PM (2 children)

        by Anonymous Coward on Wednesday May 30 2018, @04:15PM (#686322)

        Not sure what you mean. If you want to assume that the rate of these floods is 80 per 1000 years it would be this:

        n = 1000                      # years
        x = 0:120                     # possible flood counts
        y = dbinom(x, n, 80/n)        # P(exactly x floods) at a rate of 80 per 1000 years

        dat = data.frame(Ntimes = x, Prob = round(y, 4))

        plot(dat$Ntimes, dat$Prob)    # pmf of Binomial(1000, 0.08)

        It'll look like a normal curve peaking at 80.
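        If you don't want to eyeball the plot, the peak can be read off directly; a self-contained check (the mode of a binomial is floor((n+1)p), which works out to 80 here):

```r
n <- 1000
x <- 0:120
y <- dbinom(x, n, 80/n)      # P(exactly x floods) at a rate of 80 per 1000 years
x[which.max(y)]              # 80, the most likely count
sqrt(n * 0.08 * 0.92)        # sd ~8.6, the width of that "normal-looking" curve
```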

        • (Score: 2) by istartedi on Wednesday May 30 2018, @05:23PM (1 child)

          by istartedi (123) on Wednesday May 30 2018, @05:23PM (#686360) Journal

          No, what I'm saying is that if we start with the premise that it's a 1/1000 event, it's still possible to have one every year; it just becomes vanishingly improbable as the years add up. At some point, the original assumption of 1/1000 starts to look suspect.

          For a real-world example, there are cases where people have won multi-million dollar lottery prizes *twice*. This is obviously within the realm of probability. It's even within the realm of probability for them to win 10 multi-million dollar lottery prizes in one year. At some point though, you start realizing that the odds of that are so long, that the fix must be in.

          In the case of this flood it's obviously not "fixed"; but I'm saying that the underlying assumption of 1/1000 has to be flawed somehow when you have 16/200 from actual data. OK, the 1/1000 is for the rain not the flood; but who cares about the rain? It's the flood that impacts people's lives. They're using the 1/1000 to determine what gets rebuilt, when they should be using the 16/200. They're encouraging people to rebuild, when they should be coming up with a relocation plan.

          • (Score: 0) by Anonymous Coward on Wednesday May 30 2018, @06:08PM

            by Anonymous Coward on Wednesday May 30 2018, @06:08PM (#686385)

            I understand now. Here it is for 16 out of 200 if the probability is 1/1000 (this is in R):

            > dbinom(16, 200, 1/1000)
            [1] 1.407112e-25

            Here it is for at least 16:

            > sum(dbinom(16:200, 200, 1/1000))
            [1] 1.422514e-25

            So yeah, you could probably come up with a better model than binomial with p = 1/1000 with little effort. Perhaps even p = 1/1000 is correct but some other assumption behind the model (independence from year to year, say) is wrong. Who knows?
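            For what it's worth, base R's binom.test will quantify just how badly p = 1/1000 fits 16-in-200 (still assuming independent years, which may itself be the wrong assumption):

```r
# Exact binomial test of p = 1/1000 against 16 floods in 200 years
bt <- binom.test(16, 200, p = 1/1000)
bt$estimate    # 0.08, the empirical annual rate
bt$conf.int    # 95% Clopper-Pearson interval, roughly 0.05 to 0.13
bt$p.value     # vanishingly small: p = 1/1000 is wildly inconsistent with the data
```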