Common Dreams reports
A Maryland city was devastated [May 27] after 6 inches of heavy rain caused a downtown flash flood. Major damage is reported, and many cars have been swept away.
Ellicott City was still recovering from a flash flood two years ago that killed two people and forced the historic city to rebuild much of its Main Street. Residents said Sunday's flood seemed even worse than the July 2016 storm, which was called an extremely rare "one-in-1,000-year event" and cost the city tens of millions of dollars in damages.
Additional information at:
The Baltimore Sun
The Washington Post
USA Today
(Score: 1) by istartedi on Wednesday May 30 2018, @08:01AM (3 children)
I get that "1000 year flood" means 1/1000 chance each year and your statistical approach makes sense. We only have 200 years of observation, during which 16 floods occurred. Their 1/1000 is for the rain, not the flood itself; but let's set that aside and assume you can't have the flood without rain that meets the criteria. Given that we have 16 floods in 200 years, let's also extrapolate out and assume that it was flooding like that before the town was built. Perhaps that's a bad assumption too; but we just don't have the data. It's all I've got though. It makes me want to plug N=80 into your program. It would be non-zero, but very small. In fact yes, the nature of statistics is that the odds of this could be *even lower* than 1/1000, and we could still get a cluster of 16 floods in the last 200 years; but it seems rather disingenuous to call it a "1000 year event" in the media when it happens that often.
Maybe it's all lies. Maybe it's damned lies. Maybe it's... well, you know.
(Score: 0) by Anonymous Coward on Wednesday May 30 2018, @04:15PM (2 children)
Not sure what you mean. If you want to assume that the rate of these floods is 80 per 1000 years, it would be this:
It'll look like a normal curve peaking at 80.
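The AC's code and plot didn't survive in the thread, but the claim is easy to check: under a binomial model with n = 1000 years and a per-year probability of 0.08 (80 per 1000 years), the distribution of flood counts peaks at 80 and looks roughly normal. A rough Python stand-in for the missing R snippet (the `dbinom` helper here is a hand-rolled binomial pmf, not the original code):

```python
from math import comb

def dbinom(k, n, p):
    """Binomial pmf, like R's dbinom: P(exactly k successes in n trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Flood counts over 1000 years at a rate of 80 per 1000 years (p = 0.08/year).
n, p = 1000, 0.08
pmf = {k: dbinom(k, n, p) for k in range(40, 121)}

# The mode of Binomial(1000, 0.08) is floor((n + 1) * p) = 80.
mode = max(pmf, key=pmf.get)
print(mode)  # → 80
```

Plotting `pmf` gives the bell shape the AC described, centered on 80 with a standard deviation of about sqrt(1000 * 0.08 * 0.92) ≈ 8.6.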
(Score: 2) by istartedi on Wednesday May 30 2018, @05:23PM (1 child)
No, what I'm saying is that if we start with the premise that it's a 1/1000 event, it's possible to have one every year; it's just very improbable. At some point, the original assumption of 1/1000 starts to look suspect.
For a real-world example, there are cases where people have won multi-million dollar lottery prizes *twice*. This is obviously within the realm of probability. It's even within the realm of probability for them to win 10 multi-million dollar lottery prizes in one year. At some point though, you start realizing that the odds of that are so long, that the fix must be in.
In the case of this flood it's obviously not "fixed"; but I'm saying that the underlying assumption of 1/1000 has to be flawed somehow when you have 16/200 from actual data. OK, the 1/1000 is for the rain, not the flood; but who cares about the rain? It's the flood that impacts people's lives. They're using the 1/1000 to determine what gets rebuilt, when they should be using the 16/200. They're encouraging people to rebuild, when they should be coming up with a relocation plan.
(Score: 0) by Anonymous Coward on Wednesday May 30 2018, @06:08PM
I understand now. Here it is for 16 out of 200 if the probability is 1/1000 (this is in R):
Here it is for at least 16:
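The R one-liners didn't survive the copy either; presumably they were along the lines of `dbinom(16, 200, 1/1000)` and `1 - pbinom(15, 200, 1/1000)`. A Python stand-in (an assumed equivalent, not the original snippet):

```python
from math import comb

def dbinom(k, n, p):
    """Binomial pmf, like R's dbinom: P(exactly k successes in n trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 200, 1 / 1000  # 200 years of observation, 1-in-1000 chance per year

# P(exactly 16 floods in 200 years):
p_exact = dbinom(16, n, p)

# P(at least 16 floods in 200 years), i.e. the upper tail 1 - pbinom(15, n, p):
p_at_least = sum(dbinom(k, n, p) for k in range(16, n + 1))

print(p_exact)     # on the order of 1e-25
print(p_at_least)  # barely larger; the k = 16 term dominates the tail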
So yeah, you could probably come up with a better model than a binomial with p = 1/1000 with little effort. Or perhaps p = 1/1000 is correct but some other assumption behind the model is wrong. Who knows?