
posted by Fnord666 on Tuesday May 29 2018, @06:07AM   Printer-friendly
from the flood-insurance-FTW dept.

Common Dreams reports

A Maryland city was devastated [May 27] after 6 inches of heavy rain caused a downtown flash flood. Major damage is reported and many cars have been swept away.

Ellicott City was still recovering from a flash flood two years ago that killed two and forced the historic city to rebuild much of its Main Street. Residents said Sunday's flood seemed even worse than the July 2016 storm, which was called an extremely rare "one-in-1,000 year event" and cost the city tens of millions of dollars in damage.

Additional information at:
The Baltimore Sun
The Washington Post
USA Today


Original Submission

 
  • (Score: 4, Informative) by Anonymous Coward on Tuesday May 29 2018, @04:06PM (2 children)

    by Anonymous Coward on Tuesday May 29 2018, @04:06PM (#685690)

    First of all, the 1/1000 years was referring to the "rainfall", not the flood:

    The National Weather Service's Hydrometeorological Design Studies Center (HDSC) has completed an exceedance probability analysis for this rainfall event, based on the rainfall estimates and observed data shown above. A link to their analysis can be found here. At time durations of 5 minutes to 3 hours, the observed rainfall at the Ellicott City gauge has a probability of occurrence of less than or equal to 1/1000. This does not mean this extreme rainfall will only occur once in a thousand years. However, it is a rare and unlikely event. Based on statistical analysis, there is a 0.1% chance or less of this rainfall occurring in these time durations and location in any given year. For more details on the calculations, see the full report [noaa.gov].

    https://www.weather.gov/lwx/EllicottCityFlood2016 [weather.gov]

    From the full report (linked above) we see this was specifically regarding having "~6 inches of rainfall in 3 hours at the ELYM2 gauge":

    Figure 1 shows how the maximum observed rainfall amounts compared to corresponding rainfall frequency estimates for AEPs up to 1/1000 (0.1%) for durations from 5 minutes to 6 hours for a rain gauge in Maryland - ELYM2 Ellicott City (39.27333°N, 76.80444°W). The rain gauge is part of the Hydrometeorological Automated Data System (HADS). The AEPs are estimates from NOAA Atlas 14 Volume 2. As can be seen from Figure 1, observed rainfall amounts have probabilities of less or equal to 1/1000 for durations up to 3 hours.

    Also, what are these AEPs:

    ANNUAL EXCEEDANCE PROBABILITY (AEP) - The probability associated with exceeding a given amount in any given year once or more than once; the inverse of AEP provides a measure of the average time between years (and not events) in which a particular value is exceeded at least once; the term is associated with analysis of annual maximum series (see also AVERAGE RECURRENCE INTERVAL).

    http://www.nws.noaa.gov/oh/hdsc/glossary.html [noaa.gov]

    So it's a "probability associated with rainfall of at least a certain amount within a certain timeframe occurring at least once per year." I don't like that weasel phrase "associated with". I'm having trouble finding an actual example calculation here.
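    For what it's worth, the arithmetic around an AEP is simple once you accept the fitted number; the hard part (as the report shows) is producing the fit. A minimal sketch, taking the 1/1000 figure at face value:

```python
# Sketch of the AEP arithmetic, taking NOAA's 1/1000 figure as given.
# AEP = probability that a given rainfall depth is equaled or exceeded
# at least once in any given year.

import math

aep = 0.001  # the "1-in-1,000" figure for ~6 inches in 3 hours at ELYM2

# Chance of at least one exceedance over a span of years:
# P(>=1 exceedance in n years) = 1 - (1 - AEP)^n
for n in (1, 30, 100, 1000):
    p_at_least_once = 1 - (1 - aep) ** n
    print(f"{n:>4} years: {p_at_least_once:.3%}")

# Note the 1000-year figure comes out near 63%, not 100% -- a
# "1-in-1,000-year" event is not guaranteed to occur once per millennium.

# Relation to the average recurrence interval (ARI) for annual maximum
# series: ARI = -1 / ln(1 - AEP), which is roughly 1/AEP for small AEP.
ari = -1 / math.log(1 - aep)
print(f"ARI = {ari:.1f} years")
```

    This is also why the quoted NWS text is careful to say the rainfall "does not mean this extreme rainfall will only occur once in a thousand years."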

  • (Score: 0) by Anonymous Coward on Tuesday May 29 2018, @04:44PM (1 child)

    by Anonymous Coward on Tuesday May 29 2018, @04:44PM (#685709)

    Ok, well figuring out where exactly these numbers came from looks like it is not possible in a reasonable time frame (and maybe not possible at all):

    3.3 Approach

    The approach used in this project largely follows the regional frequency analysis using the method of L-moments described in Hosking and Wallis (1997). This section provides an overview of the approach. Greater detail on the approach is provided in Section 4.2.

    NOAA Atlas 14 introduces a change from past NWS publications by its use of regional frequency analysis using L-moments for selecting and parameterizing probability distributions. Both annual maximum series and partial duration series were extracted at each observing station from quality controlled data sets. Because of the greater reliability of the analysis of annual maximum series, an average ratio of partial duration series to annual maximum series precipitation frequency estimates (quantiles) was computed and then applied to the annual maximum series quantiles to obtain the final equivalent partial duration series quantiles.

    Quality control was performed on the initial observed data sets (see Section 4.3) and it continued throughout the process as an inherent result of the performance parameters of intermediate steps.

    To support the regional approach, potential regions were initially determined based on climatology. They were then tested statistically for homogeneity. Individual stations in each region were also tested statistically for discordancy. Adjustments were made in the definition of regions based on underlying climatology in cases where homogeneity and discordancy criteria were not met.

    A variety of probability distributions were examined and the most appropriate distribution for each region and duration was selected using several different performance measures. The final determination of the appropriate distributions for each region and duration was made based on sensitivity tests and a desire for a relatively smooth transition between distributions from region to region. Probability distributions selected for annual maximum series were not necessarily the same as those selected for partial duration series.

    Quantiles at each station were determined based on the mean of the data series at the station and the regionally determined higher order moments of the selected probability distribution. There were a number of stations where the regional approach did not provide the most effective choice of probability distribution. In these cases the most appropriate probability distribution was chosen and parameterized based solely on data at that station.

    [...]

    Hosking and Wallis (1997) describe regional frequency analysis using the method of L-moments. This approach, which stems from work in the early 1970s but which only began seeing full implementation in the 1990s, is now accepted as the state of the practice. The National Weather Service has used Hosking and Wallis, 1997, as its primary reference for the statistical method for this Atlas.

    The method of L-moments (or linear combinations of probability weighted moments) provides great utility in choosing the most appropriate probability distribution to describe the precipitation frequency estimates. The method provides tools for estimating the shape of the distribution and the uncertainty associated with the estimates, as well as tools for assessing whether the data are likely to belong to a homogeneous region (e.g., climatic regime).

    The regional approach employs data from many stations in a region to estimate frequency distribution curves for the underlying population at each station. The approach assumes that the frequency distributions of the data from many stations in a homogeneous region are identical apart from a site-specific scaling factor. This assumption allows estimation of shape parameters from the combination of data from all stations in a homogeneous region rather than from each station individually, vastly increasing the amount of information used to produce the estimate, and thereby increasing the accuracy. Weighted averages that are proportional to the number of data years at each station in the region are used in the analysis.

    The regional frequency analysis using the method of L-moments assists in selecting the appropriate probability distribution and the shape of the distribution, but precipitation frequency estimates (quantiles) are estimated uniquely at each individual station by using a scaling factor, which, in this project, is the mean of the annual maximum series, at each station. The resulting quantiles are more reliable than estimates obtained based on single at-site analysis (Hosking and Wallis, 1997).

    http://www.nws.noaa.gov/oh/hdsc/PF_documents/Atlas14_Volume2.pdf [noaa.gov]

    It goes on about the various interpolations, adjustments, etc. like this for a long time. It sounds like they even resort to reassigning stations from one region to another (shuffling their most accurate info around?) to make the regions pass the homogeneity test:

    adjustments of regions, such as moving stations from one region to another or subdividing a region, were made to reduce heterogeneity.
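    The core "index-flood" idea in the quoted text (identical distributions per region up to a site-specific scale factor, with quantiles scaled by each station's mean) can at least be sketched. This toy version uses made-up stations and plain empirical quantiles, nothing like the actual Atlas 14 fitting machinery:

```python
# Toy sketch of the index-flood idea behind the Atlas 14 approach.
# Station names and data are invented; the real method fits regional
# probability distributions via L-moments (Hosking & Wallis 1997).

import random

def sample_l_moments(xs):
    """First two sample L-moments via probability-weighted moments."""
    xs = sorted(xs)
    n = len(xs)
    b0 = sum(xs) / n
    b1 = sum((j / (n - 1)) * x for j, x in enumerate(xs)) / n
    l1 = b0           # L-location (the mean)
    l2 = 2 * b1 - b0  # L-scale
    return l1, l2

random.seed(42)

# Fake annual-maximum series at three "stations" in one homogeneous
# region: identical shape, different site-specific scale factors.
stations = {name: [scale * random.expovariate(1.0) for _ in range(50)]
            for name, scale in [("A", 2.0), ("B", 3.5), ("C", 5.0)]}

# Crude homogeneity check: L-CV (= l2/l1) should be similar across
# stations if they share one distribution up to a scaling factor.
for name, xs in stations.items():
    l1, l2 = sample_l_moments(xs)
    print(f"station {name}: mean={l1:.2f}  L-CV={l2 / l1:.3f}")

# Index-flood quantile estimate: pool the data after dividing each
# station's series by its own mean, read a "growth factor" off the
# pooled sample, then rescale by each station's mean.
pooled = sorted(x / sample_l_moments(xs)[0]
                for xs in stations.values() for x in xs)
growth_99 = pooled[int(0.99 * len(pooled))]   # rough 1% AEP growth factor
for name, xs in stations.items():
    site_mean = sample_l_moments(xs)[0]
    print(f"station {name}: ~1% AEP estimate = {site_mean * growth_99:.2f}")
```

    The point of pooling is the one the report makes: three stations' worth of data go into estimating the shared shape, so the tail estimate at each station is steadier than what 50 years of at-site data alone could give.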

    • (Score: 0) by Anonymous Coward on Tuesday May 29 2018, @04:54PM

      by Anonymous Coward on Tuesday May 29 2018, @04:54PM (#685712)

      Same AC. Another general takeaway here is that these estimates come from models that are purely statistical. There is no physics going on here at all, which I didn't expect. I really expected them to take into account elements of the local water cycle, etc.

      It looks more like plugging data into an ML algorithm and having it optimize for homogeneity (i.e., physically closer stations should show more similar patterns than distant ones), then letting that feed back into tuning the data prep until the results looked "realistic" to an ensemble of humans.
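      The "test for homogeneity" step does have a concrete statistical shape, though: compare the spread of a statistic (like L-CV) across stations with the spread you'd expect by chance if the region really were homogeneous. Hosking & Wallis's H statistic works along these lines; the version below is a heavily simplified toy with fake data, an assumed exponential parent, and unweighted statistics:

```python
# Toy Monte Carlo homogeneity check, loosely in the spirit of the
# Hosking & Wallis H statistic (fake data, exponential parent assumed,
# unweighted statistics -- nothing like the production calculation).

import random
import statistics

def l_cv(xs):
    """Sample L-CV (L-scale / L-location) via probability-weighted moments."""
    xs = sorted(xs)
    n = len(xs)
    b0 = sum(xs) / n
    b1 = sum((j / (n - 1)) * x for j, x in enumerate(xs)) / n
    return (2 * b1 - b0) / b0

def region_spread(series_list):
    """Dispersion of L-CV across the stations in a candidate region."""
    return statistics.stdev(l_cv(xs) for xs in series_list)

random.seed(0)
n_years, n_stations = 50, 5

# Observed spread of L-CV across (fake) stations.
observed = region_spread(
    [[random.expovariate(1.0) for _ in range(n_years)]
     for _ in range(n_stations)])

# Null distribution of that spread under a truly homogeneous region.
sims = [region_spread(
            [[random.expovariate(1.0) for _ in range(n_years)]
             for _ in range(n_stations)])
        for _ in range(500)]
mu, sigma = statistics.mean(sims), statistics.stdev(sims)
h = (observed - mu) / sigma
print(f"H-like score: {h:.2f}  (large positive => region looks heterogeneous)")
```

      In that light, "moving stations from one region to another ... to reduce heterogeneity" is regrouping stations until each group passes this kind of test, not physically relocating gauges.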