Flood Watch: Predicting the Storm

May 29, 2015

Editor’s note: This article originally appeared in Stormwater Magazine, January/February 2005 issue. With the recent flooding events in Texas, we realized this topic is extremely timely and important for storm control professionals, and the general population. In the coming days, we will be publishing additional stories from our archives relevant to flooding, flood prevention, stormwater quality monitoring, and related topics.

Anyone who has seen the effects of a major flood knows the importance of municipalities employing the latest technologies in weather monitoring systems. Throughout the United States, many cities, counties, and flood control districts are engaged in both short-term weather monitoring—in an effort to identify when a big storm is approaching and to anticipate areas where flooding is likely to be a problem—and long-term tracking activities to better predict how much rainfall is expected in a given season.

In the long run, communities are able to establish new development in such a way that residential and commercial properties will not be as severely affected by flooding: Stormwater facilities can be planned not only for water-quality treatment, but also for flood control considerations; hydraulic models can be created to determine flood-prone areas; and federal and state agencies can work hand-in-hand to create floodplain maps for the Federal Emergency Management Agency (FEMA), insurance companies, and private developers.

Fort Collins, CO

Throughout its history, Fort Collins, CO, has faced serious flooding threats. In 1864, the Camp Collins military post was destroyed by a massive flood that swept down the Poudre River. The settlement was rebuilt on higher ground and renamed Fort Collins.

Seven years ago, five people were swept away and killed in a massive flash flood that occurred over a two-day period in July when back-to-back heavy storms dumped more than 16 inches of rain in some parts of the region. The flash floods caused an estimated $200 million in damage. Fourteen storms from 1904 through 1997 resulted in fatalities.

In June 2004, Fort Collins updated its drainage basin master plan and is now in the process of revising its floodplain regulations through remapping. Those who developed the master plan determined that during a 100-year storm event, some 2,745 structures would be damaged at a cost of nearly $140 million. If nothing is done to reduce damages, over the next 50 years they could amount to $350 million.

The city has experienced extensive road flooding as well, which restricts emergency access and puts drivers at risk.

The master plan includes many provisions by which to stem potential problems: guidance to enhance riparian habitat along stream corridors to improve water quality and stabilize streams where necessary; cost-effective projects to remove properties from floodplains, reducing risk and street flooding; and guidance for new development in the basins to prevent problems from intensifying.

In that vein, improvements over the next 25 years will entail the removal of more than 2,300 structures from the 100-year floodplain, reducing potential costs from flood damages by more than $289 million.

When addressing the floodplain regulations, city staff discovered that floodways or conveyance areas had been an important element missing from the mapping of the basins. By delineating a floodway—considered the most hazardous section of a floodplain because it has the greatest depths and velocities—the city can direct development along the less dangerous fringe of the floodplain, rather than in the section through which the flood typically passes.

Marsha Hilmes-Robinson, the floodplain administrator for Fort Collins, says much has been learned from the city’s past floods. “I think the biggest thing is that the ’97 flood was the first real major flooding we have had,” Hilmes-Robinson says. “It was greater than a 500-year event. We had smaller floods throughout our history since the founding of Fort Collins, but the one thing we found that was really missing from our overall stormwater system was a flood warning system.

“We had spent a lot of money on acquisition projects and general stormwater improvements—upsizing culverts, bridges, and channels—but the flood warning piece was missing.”

The city has since added the system. Presently, Fort Collins is using a software program from OneRain Inc. to help in monitoring efforts.

“It’s taking all of the data coming in from the gauges and showing it in a user-friendly environment on a PC,” explains Hilmes-Robinson. “It’s very graphic with maps as well as hydrographs, and it also has paging capabilities so that we can set alarm thresholds, which is a real key for us,” she says, adding that there are low, medium, and high settings for each gauge and the system pages city officials when particular thresholds are met.

The system uses a combination of OneRain’s data acquisition software and flood forecasting/floodplain mapping software from David Ford Consulting. OneRain’s DIADvisor software package receives data from rainfall and stream gauges in the Fort Collins area, and the rain gauges report every millimeter (0.04 inch) of rainfall.

When a millimeter of rain is measured, a battery-operated radio transmitter sends a message to the Fort Collins operations center, where a radio receiver gets the message and decodes it into two parts: a gauge identification number to tell where the data came from and the rain data themselves.

This information is fed into a computer running the DIADvisor software, which checks the data for quality and files them into a database. The system also performs ongoing monitoring of accumulating rainfall and rising water levels. When levels exceed preset thresholds, alarms sound and key responders are notified. Users can access the data using DIADvisor, display the information on a variety of maps, chart the data, and run a variety of statistical reports.
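The decode-and-alarm sequence described above can be sketched in a few lines of Python. The actual DIADvisor message format, database, and paging mechanism are proprietary; the "GAUGE_ID:count" layout, threshold values, and `page()` stub below are illustrative assumptions only.

```python
# Illustrative sketch of the gauge-message flow: decode a radio message
# into its two parts, accumulate rainfall, and page when thresholds are
# met. Message format and threshold values are assumptions, not the
# real DIADvisor protocol.

TIP_MM = 1.0  # each report represents one millimeter (0.04 in.) of rain

# assumed per-gauge alarm thresholds in accumulated mm: (low, medium, high)
THRESHOLDS = {"G101": (10.0, 25.0, 50.0)}

accumulated = {}  # gauge id -> total rainfall (mm) so far this event

def page(gauge_id, level, total_mm):
    """Stand-in for the paging step; a real system would notify staff."""
    print(f"ALARM {level} at {gauge_id}: {total_mm:.0f} mm accumulated")

def handle_message(raw):
    """Decode a message into its two parts and check alarm thresholds."""
    gauge_id, count = raw.split(":")   # part 1: which gauge it came from
    rain_mm = int(count) * TIP_MM      # part 2: the rain data themselves
    total = accumulated.get(gauge_id, 0.0) + rain_mm
    accumulated[gauge_id] = total
    low, med, high = THRESHOLDS[gauge_id]
    for level, limit in (("high", high), ("medium", med), ("low", low)):
        if total >= limit:
            page(gauge_id, level, total)  # highest threshold met wins
            break
    return gauge_id, total
```

A real deployment would also run the quality checks and database filing the article describes; this sketch covers only the decode-accumulate-alarm path.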

During a storm event, the flood forecast software queries the DIADvisor database to obtain rainfall data, which are applied to a flood forecast model of key Fort Collins watersheds. Flow rates are computed and then converted to water elevations at key points and used to draw maps showing which areas will be inundated by the flood waters. All of this is done automatically and in real time.
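The forecast chain (rainfall in, flow computed, flow converted to a water-surface elevation at a key point) can be illustrated with a deliberately simplified sketch. The runoff coefficient, rating-curve parameters, and bank elevation below are assumed values for illustration, not numbers from the Fort Collins model.

```python
# Simplified sketch of the forecast chain: rainfall -> flow -> stage ->
# inundation flag. All coefficients here are illustrative assumptions.

def peak_flow_cfs(rain_in_hr, area_ac, c=0.6):
    """Rational-method-style estimate: Q = C * i * A
    (i in inches/hour, A in acres, Q roughly in cfs)."""
    return c * rain_in_hr * area_ac

def stage_ft(q_cfs, a=0.35, b=0.5):
    """Simple power-law rating curve: stage = a * Q**b."""
    return a * q_cfs ** b

def inundated(q_cfs, bank_elev_ft, bed_elev_ft=0.0):
    """Flag a key point as flooded when the water surface tops the bank."""
    return bed_elev_ft + stage_ft(q_cfs) > bank_elev_ft
```

The production system does this over whole watersheds and draws inundation maps; the sketch shows only the point-by-point logic.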

As a companion to the software program, the city has created an emergency operation plan. A “to-do” list affiliated with each of the alarm thresholds gives information to the stormwater and office of emergency management staffs.

“Each one is customized for that particular site—what the tasks are that should be done,” notes Hilmes-Robinson, adding that actions may include warning a particular neighborhood about a flood.

Fort Collins also has a water-quality monitoring program. For instance, one gauge was originally set up as a water-quality gauge and the city later tapped into it to use as a flood warning gauge.

“Several of our gauges in our real urban old town area have very, very low thresholds set—those that are paged to a Colorado State University graduate student who does water-quality sampling to get that first flush of the storm—and they are using that to acquire long-term data for this area,” she says.

The city has installed several large storm sewer projects over the past few years in an effort to relieve some of the flooding problems at the outfalls of some of the large storm sewers. Ponds have been created to treat the water before it goes into the Poudre River. Crews are monitoring the outflow from the stormwater pipes.

All map updates have been completed now, Hilmes-Robinson says. “We are now working with FEMA to adopt those new maps for the FEMA basins,” she says. “We have both city-designated floodplains and FEMA-designated floodplains. There are some basins that are real small drainages that FEMA has not mapped, but the city still has felt it is important to map the flood hazard in those areas. All the maps are adopted and are being used on a day-to-day basis.” The work left to be done involves working through FEMA to update old FEMA maps with the newer data.

Fort Collins now has a more restrictive stance on new development. “We’ve thought about the regulations; in the floodway you are going to have more restrictive regulations than you do on the flood fringe,” Hilmes-Robinson says. “That’s pretty standard, but we are going more restrictive than even the FEMA minimum standard in those areas.”

For instance, residential development will have more restrictive regulations than non-residential development, and regulations for new development will be more restrictive than those for existing structures. Instances in which the city will be less restrictive relate to areas such as a substantial improvement or addition to an existing house.

“With new development, there is much more regulation in place, really keying on the high hazards of mobile homes, so there will be no new mobile home parks and no replacement of individual mobile homes that may be out there—only those that are in existing parks they bring in and replace.”

In the long term, Hilmes-Robinson says, Fort Collins officials want to get a good set of data to be able to have a better look at trends across the city. “We are not close to that yet,” she says. “We have only had the system in place for about five years. Once we get up into the 10- or 15-year range, we will feel like we have a better data set to work with, and will be able to help answer people’s questions a bit more.

“We’ve had some folks say, ‘Oh, it rains harder on the west side of town,’ or ‘It rains more on the west side of town.’ Well, in some cases maybe it does rain harder or more, but maybe not both. That means different things for planning purposes. We’ve just not had that sort of detailed data to be able to make any kind of accurate conclusions about that. Hopefully, with this long-term record we can, and it will be useful for calibrating our models and refining things more as time goes on.”

For now, she says, the most significant benefit of the weather system is that it has helped the city improve its overall preparedness when storms occur. “We are able to have accurate information, to be able to say how hard it really is raining. We’ve been fooled ourselves a number of times. We think, ‘It’s really coming down out there,’ and then we look at the intensity and see that it really isn’t that much. Even though we deal with it all the time, your eyes can fool you, and this really helps us put some sort of a number to it and helps us be a lot more organized in our response to events, and in providing information to the emergency responders. They are the ones who have to make the ultimate decisions: Do we evacuate? Do we close the schools? Do we open shelters? This has been a great tool for that.”

North Carolina

John Dorman is the mapping administrator and director of the floodplain mapping program for North Carolina. “After Hurricane Floyd [in 1999], it was determined that 80% of the homes that were either damaged or destroyed were not shown accurately in the floodplain,” Dorman says. “At the time, the floodplain mapping program hadn’t been created. In the state planning office where I worked, the administrator was asked by the general assembly to come up with a plan.”

The planners came up with a two-fold strategy. One step was to develop a statewide program that would collect new and accurate elevation data and use those data to develop modeling for floodplain mapping. New digital and hard-copy flood insurance rate maps would follow.

“The second hand-in-glove strategy was a FEMA program called the Cooperating Technical Partners Program. We sign an agreement with them saying we are going to share all of our data to have the best available floodplain delineated,” he explains. North Carolina signed on as the first cooperating technical state.

“FEMA gave North Carolina the primary responsibility for all the new maps—for updating and maintaining them,” Dorman says. The program started in the fall of 2000 and uses a three-phase approach in what is one of the largest, most comprehensive floodplain projects in the United States. The first stage focuses on six eastern river basins that had been damaged earliest or suffered the most damage during Hurricane Floyd. The second stage encompasses six areas in the northern Piedmont basin, and the third stage takes in five western basins.

“We are doing hydrologic and hydraulic modeling on a basin-by-basin study and then doing what I call ‘cookie cutting’ out on a county-by-county basis to map,” Dorman says. Three levels of floodplain mapping include detail studies, redelineation of existing flood-hazard data, and limited-detail studies.

The state is using a laser sensor and has collected data from approximately 80% of the state. “We’ve had an accuracy rating on that at about 25 centimeters to bare earth in all of the counties,” Dorman says. “We have a very good set of elevation data, and at this point, we’ve studied five of the six stage-one basins.”

The state has done about 11,000 stream miles of study and has put out 40 sets of maps. “Of those 40, 12 counties have ‘gone effective,’ which means the maps can be used now for insurance purposes,” says Dorman. “They can, when they get it in preliminary form, start using it for floodplain management.”

The Cape Fear River Basin, the last of the six in phase one, is expected to be completed by March 2005. The state has received money to move into phase two and is expecting additional monies for the third phase. The planned project completion date is 2007.

The limited-detail study uses Watershed Concepts technologies in a streamlined process that begins with the collection of base data, including survey information for all hydraulic structures, and proceeds through delineating basins, drawing cross sections, extracting elevation data for cross sections and stream lines, building a HEC-RAS (Hydrologic Engineering Center’s River Analysis System) model for each stream, calibrating the model and mapping the boundaries, and determining Base Flood Elevations. Structure information is imported into Watershed Concepts software for use in modeling.

By using NASA-developed light detection and ranging (LiDAR) and the latest geographic information system (GIS) and global positioning system technologies, the state is able to depict the watersheds and know where the water flow will go, Dorman says.

Going forward, the program is expected to save the state money. “It was determined through a cost-benefit analysis back in 2000 done by the United States Geological Survey [USGS] that if we had up-to-date flood maps, there would be a $56-million-a-year cost avoided from flood damage for the state as a whole,” Dorman says. “That definitely sells the program. The LiDAR data in many cases can be used by communities and by the state for preliminary design for road construction. It can also be a supplemental set of information to deal with stormwater management.” Dorman says the information can also help with wetland delineation and assigning buffers for the Clean Water Management Trust Fund and water quality.

The state plans to have a website for the Tar-Pamlico River Basin, where a pilot program is being conducted that allows officials to know the geometry of the stream, the flow of the water, and where it will inundate. “We’re putting new stream gauges out through USGS in their network. As the stream gauge goes up every half-foot, that information goes to the satellite and comes back to our system,” says Dorman.

“We are developing libraries of maps for each of those gauges that will show in real time where the water is,” he adds. “We will be able to show which roads, bridges, communities, and farms are being flooded at that point in time, and with the National Weather Service looking at the gauge and developing forecasts, we will be able to tell you where it will be inundating over the next 72 hours.”
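The “library of maps” concept amounts to a lookup keyed by gauge stage: as a reading rises in half-foot steps, the system retrieves the precomputed inundation map for that stage. The half-foot increments and map names below are hypothetical stand-ins, not North Carolina’s actual library.

```python
# Sketch of a stage-keyed map library: pick the map for the highest
# precomputed stage not above the current gauge reading. Stage steps
# and map names are hypothetical.

def map_for_stage(stage_ft, library):
    """library: dict mapping stage (ft, half-foot steps) -> map name.
    Returns None when the reading is below every precomputed stage."""
    eligible = [s for s in library if s <= stage_ft]
    return library[max(eligible)] if eligible else None
```

Precomputing the maps moves the expensive hydraulic modeling offline, so the real-time step is only a lookup.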

While forecasting is still under development, Dorman expects this program will have a statewide application, extending to the Neuse River Basin and the Lumber River Basin. “We believe this will be a very strong component for evacuation plans and for the North Carolina Department of Transportation (DOT) to close roads and bridges,” Dorman says, pointing out that 40 people died during Hurricane Floyd while attempting to drive across flooded roads.

“We are not just doing a mapping component,” says Dorman. “We are also doing an alert component for each of the gauges. If a gauge gets to a point above sea level and we know that is where it will start impacting the public safety, that can trigger an alert that can go out to whoever needs to know it, either by phone or page or the weather service.”

Ventura County, CA

Darla Wise, water-quality manager for the Watershed Protection District in Ventura County, CA, points out different approaches the county uses to monitor storm events. One is through a contract service with a weather-monitoring consultant, Fox Weather Services. Fox monitors weather parameters and offers a forecast of probability of rain events throughout different areas of the county.

“Because we are a coastal region, we tend to have real extreme rain patterns that fall within Ventura County. If you are on the coastal plain where it is very flat, you can often see the weather event move in onshore and just skip right over that flat coastal plain,” says Wise. “It often doesn’t build until it hits the coastal mountain range, which is 10 to 15 miles inland. At that point, all the clouds bump up against the mountain. They condense and then release their moisture in the mountain range.”

Those in the foothills may see up to 5 inches of rain, while those on the coastal plain will experience perhaps a half-inch, she notes.

“[Fox] will give us a probability of a rain event throughout the region, and [it] will also give us periodic updates—a morning update and an afternoon update—on those probabilities as the rain event approaches,” Wise says. If potential shifts in rain patterns occur, bringing more or less rain or a shorter event or longer event, that is noted as well.

“We’ve been working now with [Fox] for quite a long time, and we’ve learned to look at the data [it] provides us, and we couple that with what I hear on the news and what I get through the National Weather Service for projections,” says Wise. Based on that information, Ventura County has 60-year rain event tables that help provide estimations on how a rain event will influence the hydrograph at the river systems.

“These tables of hydrograph influence are based on this long-range record of rainfall data,” Wise says. “If we are anticipating a 2-inch rain event in 24 hours, these hydrograph volume tables we have can tell us what the volumetric flow in the river will be. Based on that, we will program our samplers.

“We have a hydrology division of the Watershed Protection District that has alert flood-warning systems throughout Ventura County,” continues Wise. “That’s an extensive monitoring system for flood conditions, which includes rain volumes and precipitation amounts.” River stage and flow measurements come from 60 monitoring stations throughout the county. “As the rain event actually takes place, we also call up the hydrographers’ alert system to give us the real-time data on the actual rain volumes that we got during the rain event.”
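The hydrograph-volume lookup Wise describes amounts to interpolating a rainfall-to-flow table built from the long-range rainfall record. The table values below are invented for illustration; they are not Ventura County’s tables.

```python
# Sketch of a hydrograph-volume table lookup: forecast 24-hour rainfall
# in, expected volumetric flow out, by linear interpolation. The table
# values are hypothetical.
import bisect

# assumed table: 24-hour rainfall (in.) -> expected flow (cfs)
RAIN_IN = [0.5, 1.0, 2.0, 4.0]
FLOW_CFS = [150, 450, 1400, 4200]

def expected_flow(rain_in):
    """Linearly interpolate the table; clamp outside its range."""
    i = bisect.bisect_left(RAIN_IN, rain_in)
    if i == 0:
        return FLOW_CFS[0]
    if i == len(RAIN_IN):
        return FLOW_CFS[-1]
    x0, x1 = RAIN_IN[i - 1], RAIN_IN[i]
    y0, y1 = FLOW_CFS[i - 1], FLOW_CFS[i]
    return y0 + (y1 - y0) * (rain_in - x0) / (x1 - x0)
```

The interpolated flow is what the county would then use to program its samplers ahead of the event.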

The approach not only offers rainfall predictions and forecasts, but also provides the county with the information needed to program its equipment to capture the storm events. Ventura County is under a National Pollutant Discharge Elimination System stormwater permit that mandates the county do mass emissions monitoring for the major river systems within the county.

“We are looking at the actual load, in pounds, of pollutants in stormwater runoff to the receiving waters. That is an effort to identify pollutant loading to the river system and also to understand the water-quality characterization of the surface-water systems within Ventura County,” says Wise. “We are looking at current conditions as well as evaluating trends and changes in water-quality conditions as time goes on. We continue to build our database.”

At the monitoring stations, Ventura County is using a variety of Teledyne Isco samplers and flow meters and implementing state-of-the-art water-quality monitoring as part of its stormwater program. “We’re using automated equipment to help us sample flow-proportional samples that represent water quality throughout the hydrograph of these storm events,” says Wise. “That allows us to calculate a mass loading of pollutants to the waters associated with these particular storm events.”
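The mass-loading calculation behind flow-proportional sampling is a sum of concentration times flow times sampling interval across the hydrograph. The unit conversions below are standard; the sample data in the test are invented, not Ventura County measurements.

```python
# Sketch of an event mass-load calculation from flow-proportional
# samples: load = sum(concentration * flow * interval). Constants are
# standard unit conversions.

L_PER_CF = 28.3168        # liters per cubic foot
MG_PER_LB = 453_592.37    # milligrams per pound

def event_load_lb(samples):
    """samples: iterable of (concentration mg/L, flow cfs, interval s).
    Returns the total pollutant load for the event in pounds."""
    mg = sum(c * q * L_PER_CF * dt for c, q, dt in samples)
    return mg / MG_PER_LB
```

In practice the sampler paces itself on flow so each aliquot represents an equal runoff volume; the arithmetic above is the loading step that follows.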

Scott Holder is the interim senior hydrologist for Ventura County, which uses real-time gauges with telemetry. “We follow the guidance of the National Weather Service, but we also get guidance from a private meteorologist,” he says. “He gives us a quantitative precipitation forecast, so down to the one-hour increment we know what kind of rainfall we can get.” That information is used in a couple of models that offer the timing and the magnitude of the peaks that the county may get, Holder says. “We relay that information back to Darla so they know when they need to be out sampling or have other people out sampling,” he adds.

Another aspect of the program is the ability, through telemetry, to monitor what’s going on during a storm using real-time rainfall and stream flow gauges out in the field. “We also get data from the US Geological Survey, the California Department of Water Resources, and from neighboring counties,” says Holder. “We get a big picture of what is going on in rainfall amounts and on what the streams are doing, and we compare that to the model.”

Holder says the consequences of storms in Ventura County are similar to those in Phoenix and Las Vegas. “Theirs are more from monsoonal thunderstorm-type storms. But we have the same issues, which are rapid rise in the water and flash flooding,” he says. “The longest period we have for one of our rivers to respond is probably about 10 hours after peak rainfall. That’s not that long. Most of them are three hours or less.

“Basically, as the storm is happening—and especially since we got hit by the fires—that significantly increased our runoff and debris from the streams,” says Holder. “We get an added impact from that.” Thousands of acres in southern California were burned by wildfires in the fall of 2003 and throughout 2004.

Holder says one of the most notable flooding events in Ventura County happened in 1992, and was broadcast on CNN. “Our Ventura River actually went into an RV park that was in the floodway,” he says. “We had no control over the RV park being put there; the City of Ventura put it in. The RVs were supposed to be there only temporarily. Some of them were there for years. They didn’t even run, or their hookups were rusted, so all of a sudden the river overflowed and there were some shots from the helicopter that showed these RVs being pulled into the river and smashed against a bridge. It was a real famous video, but unfortunately not one of our best days. Luckily, nobody from the RV park was injured or killed. However, there were some homeless people who unfortunately lost their lives in the storm,” says Holder.

The county now has a website, www.vcwatershed.org, which features a link to the latest weather and rainfall information. “This is from our real-time system, which Darla and the people in her group use quite frequently when we get storms,” says Holder. “They can see what is going on—all they need is an Internet connection. When they need to look at more detail, they call their stations to get that data.”

South Carolina

Bob Steele is the senior hydrologist for the LPA Group Inc. in Columbia, SC. His company works for the state’s DOT, performing flood studies on DOT bridges. His company uses Haestad Methods’ StormCAD technology for stormwater modeling, as well as CulvertMaster, a culvert design and analysis program; FlowMaster, a utility program; and PondPack, a program for detention pond design and urban hydrology modeling. Steele is also beta-testing CivilStorm Dynamic—a program for simulating the operation of storm sewer systems, inlets, channels, and other structures—by applying the program to existing studies to ascertain whether he gets the same results and if not, why.

“We do a lot of floodplain studies,” says Steele. “Many times, they are associated with already-existing FEMA-designated areas, so we work in conjunction with FEMA, providing new calculations and new changes to mapping because of flood intervention.”

Steele says as a result of newer data and criteria, the floodplain map over the years has changed. “With water quality, we are touching the hem of the garment,” he says. “We are probably getting into water quality on each project with regard to sediment and erosion control. Sometimes we are beginning to address a little more to where we are looking for additional water-quality runoff coming from paved areas, and how we try to handle that and capture that.”

Steele uses a variety of programs for riverine flood analysis, including HEC-2 and HEC-RAS, which he uses when working for FEMA. “These programs are used for flood studies as experienced in developed areas, for example with localized flooding in residential neighborhoods,” says Steele, adding that modeling of the existing storm drainage system is done with XP-SWMM, developed by XP Software. He says it is akin to CivilStorm Dynamic.

“This is a hydrodynamic program in which pipes, channels, ditches, ponds, pumps, and so on can be modeled in one program,” he explains. “It can also be used for riverine flood analysis and is approved by FEMA. If we get a complaint from a community that we have a flooding problem, we will use that tool to help us analyze and verify that there is a problem and then use the same tool to fix the problem,” says Steele.

Though Columbia does not experience severe flooding, Steele says his company has been working on a project for a town 100 miles south, Bamberg, which was “literally built in a bowl,” he says. “They had a horrible problem trying to get the water out of this city. You go 2 miles north or 2 miles south to one of the rivers or swamps at a very flat grade.” LPA is designing improvements to the town’s drainage system.

LPA also finished a study in the town of Cayce, which has experienced flooding problems. “We are looking at putting in a system to try to alleviate that and get the water out,” says Steele. “Many times we get called by the DOT to look at certain areas. It’s an ongoing study in the problems we are always addressing.”

Steele says his firm addresses issues from a roadway standpoint but also considers adjacent properties. “We may do a roadway project to improve the drainage system, but really the bottom line is that we are eliminating a drainage problem for a multiblock area within the town,” he says.

Perhaps the weakest link in all of the work that Steele’s firm conducts is its monitoring mechanisms, he says. “We still use the same rainfall data we’ve used for many, many years. Now newer data is becoming available, and as it does, we will utilize that. The client, like the Department of Transportation, does not have the funds to start to implement monitoring to get rainfall data.

“If the client doesn’t have it, we don’t have it, so we still have to refer back to our NOAA [National Oceanic and Atmospheric Administration] charts we’ve used for many years that are 15 and 16 years old, and to our local weather reporting stations,” says Steele.

South Carolina recently implemented what Steele calls a “good stormwater management” program that puts the responsibility on developers and the DOT for new roadways. New projects must consider preconstruction runoff versus postconstruction runoff and provide some type of detention device so the project will not cause any increases in runoff. However, the scenario is based on a 10-year storm, Steele says.

“You’ve got your 25-year storms, your 50-year storms, and your 100-year storms,” he says. “Some municipalities go as far as to say you have to deal with the monitoring all the way up to the 100-year; many do not. Here in South Carolina, we basically work with the 10-year design storms.

“So, if you have the 25-year, the 50-year, and the 100-year, what are you going to do? The roadway drainage system is not designed to pick up that water, so the roadway system would flood and stay flooded until the water could get into the system.” Once the stormwater gets in, it would go through the pond it was designed for, Steele says. “You could have a situation where the water gets there by a secondary routing. The system can’t carry the [runoff from a 25- or 50-year storm], but overland flow, when it floods, is still going to get to the pond.”

Some runoff will be passed downstream, however, because the design storm for the ponds is also a 10-year event. Therefore, most likely there will be some downstream flooding. He notes that one county adjacent to the Columbia area required a 25-year storm design.

Every new development that takes place ends up with many ponds controlling a particular storm, Steele says. “All they are doing is detaining it until it can pass at no higher rate than what the preconstruction rate would have been,” he says. “When that hits the streams and begins to accumulate and go down, now the peak-rate factor in the streams has changed, which may be detrimental instead of beneficial. Therefore, there could possibly be flooding where there would not have been before.”
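The preconstruction-versus-postconstruction comparison at the heart of the program can be sketched with the rational method, Q = CiA. The runoff coefficients and the 10-year intensity below are assumed illustrative values, not South Carolina design criteria.

```python
# Illustrative pre- vs. post-construction peak-flow check using the
# rational method. Coefficients and intensity are assumptions.

def peak_q_cfs(c, i_in_hr, area_ac):
    """Rational method: Q (cfs) ~= C * i (in./hr) * A (acres)."""
    return c * i_in_hr * area_ac

def needs_detention(area_ac, i10_in_hr, c_pre=0.3, c_post=0.7):
    """True when development raises the 10-year peak above the
    preconstruction rate, triggering a detention requirement."""
    pre = peak_q_cfs(c_pre, i10_in_hr, area_ac)
    post = peak_q_cfs(c_post, i10_in_hr, area_ac)
    return post > pre
```

Note that the check says nothing about larger storms; as Steele points out, a pond sized for the 10-year event still passes excess runoff downstream in a 25- or 50-year storm.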

Presently, a plan in which developers are required to study the whole watershed is under way. Developers have to make assurances that what they do will not affect the area 2 miles downstream.

“The methodology continues to be enhanced, but there is not going to be a handle on it until we start looking at what we call the regional or ‘total watershed’-type monitoring basis,” says Steele.

He praises contemporary technology for giving stormwater monitoring a strength it has not had until now. A dynamic modeling tool enables those tracking storms to monitor what is happening from beginning to end.

“It automatically takes into account the amount of storage volume in pipes and channels, or whatever reduction we have to artificially try to simulate that,” says Steele. “These types of tools are more intense and more expensive and are the better tools to date. We still rely heavily on our StormCAD-type products to design the local drainage system along a city street or parking lot, or an airport system where we may have a multiple catch-basin pipe system. We still rely upon something to help us design that system.”

Boulder, CO

Chad Kudym is a GIS coordinator and a certified floodplain manager for HDR Inc. in Denver. The City of Boulder is the firm’s primary client. In South Boulder Creek, HDR is using DHI’s MIKE FLOOD software to simulate the effect of rain of varying degrees. Boulder is performing a flood-inundation study from rainfall that flows into South Boulder Creek, and part of the project encompasses weather alert systems at the top of the mountain. Kudym says it is an extensive project.

“In Boulder, citizens tend to get pretty involved with the projects going on that affect them in terms of flood insurance rate maps, and also with the level of infrastructure that is going to go in there,” he says. “A lot of the people are opposed to any kind of mitigation structures.”

Kudym notes that previous studies, which produced flow rates using existing models, suggested possible mitigation options. One of these was a large flood control reservoir just upstream.

“The people aren’t very fond of that,” he says. “There also is the issue of CU Boulder—the University of Colorado—which has a portion of property adjacent to the floodplain, but protected by a levee that is not FEMA-certified at the present time.”

A group of citizens is opposed to the development. FEMA is pushing to have the floodplain remapped, because when CU Boulder purchased the property, the university had to do some modeling to demonstrate it was out of the floodplain.

“That modeling showed there were some flow splits where a lot of the flood waters would leave the main channel in South Boulder Creek and enter an area called the West Valley overflow,” Kudym explains. “The previous study indicated approximately 1,000 structures that should be in the floodplain and currently are not designated” as such. He says this is because a 1986 flood insurance study cut off the study area at a certain interchange of US 36.

HDR is reviewing and identifying what the flood hazard is and which structures should be designated as being in the 100-year floodplain. Kudym notes there has been an extensive public involvement process, including several public meetings.

In the weather aspect of the project, HDR studied the climatology of the area to determine the largest floods that have occurred in the area and tried to reconstruct as far as possible what the rainfall amounts were in different areas during those floods. Then the firm created a rainfall grid based on those observations to use in the modeling effort.
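The article does not say how HDR built its rainfall grid from the reconstructed point observations. One common approach for turning sparse gauge totals into a gridded rainfall field is inverse-distance weighting, sketched below with entirely hypothetical gauge locations and storm totals.

```python
import numpy as np

def idw_rainfall_grid(gauge_xy, gauge_rain, grid_x, grid_y, power=2.0):
    """Interpolate point rain-gauge totals onto a grid by inverse-distance
    weighting: each cell is a weighted average of the gauges, with weights
    that fall off as 1/distance**power."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    grid = np.zeros_like(gx, dtype=float)
    weight_sum = np.zeros_like(gx, dtype=float)
    for (x, y), rain in zip(gauge_xy, gauge_rain):
        d = np.hypot(gx - x, gy - y)
        d = np.maximum(d, 1e-6)          # avoid division by zero at a gauge
        w = 1.0 / d**power
        grid += w * rain
        weight_sum += w
    return grid / weight_sum

# Three hypothetical gauges (km coordinates) with storm totals in inches
gauges = [(0.0, 0.0), (5.0, 2.0), (2.0, 6.0)]
totals = [3.2, 4.8, 2.5]
grid = idw_rainfall_grid(gauges, totals, np.linspace(0, 6, 7), np.linspace(0, 6, 7))
```

Because the result is a weighted average, every cell stays between the smallest and largest gauge totals, and cells at a gauge location reproduce that gauge's value.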

“We’ve looked at storms from 1938, from 1969, and two smaller events just for calibration purposes from 1998 and 1999. We had some radar data available that we could use, rather than just point observations,” Kudym says. “We also looked at the NOAA Atlas.” The atlas provides guidelines as to what constitutes a 100-year rainfall event. These data were used in the modeling efforts. “It’s been 30 years since that study was done, so they wanted to update the frequency and check to see how much that has changed with 30 more years of record.” It wasn’t significantly different, so HDR went with the currently published value, which is about 5 inches of rain in a 24-hour period, Kudym says.
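The frequency update Kudym describes, re-estimating the 100-year rainfall with 30 more years of record, is typically done by fitting an extreme-value distribution to the annual-maximum rainfall series and reading off the desired quantile. A minimal sketch using a method-of-moments Gumbel (EV1) fit, with hypothetical data:

```python
import math

def gumbel_quantile(annual_maxima, return_period_years):
    """Estimate the T-year rainfall depth by fitting a Gumbel (EV1)
    distribution to an annual-maximum series via method of moments."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in annual_maxima) / (n - 1))
    beta = std * math.sqrt(6) / math.pi        # scale parameter
    mu = mean - 0.5772 * beta                  # location (Euler-Mascheroni constant)
    p = 1.0 - 1.0 / return_period_years        # annual non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# Hypothetical annual-maximum 24-hour rainfall depths (inches)
series = [1.8, 2.4, 2.1, 3.0, 1.6, 2.7, 2.2, 3.4, 1.9, 2.5]
depth_100yr = gumbel_quantile(series, 100)
```

Note that the 100-year estimate can (and usually does) exceed anything in a short observed record, which is why the published atlas values rest on much longer regional records than this toy series.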

A USGS employee has been doing several studies along the Colorado Front Range and has provided input on where the heaviest rains have occurred in the basin. “We’ve used some of his guidance to figure out, if a thunderstorm were going to happen in the basin, where it would most likely occur,” says Kudym. “We also studied similar types of events along the Colorado Front Range from the Wyoming border south to about Colorado Springs.” Those rainfall events, from 1994 to 2000, measured between 3.5 and 7 inches.

“We used that as well to figure out what the intensity, duration, and placement of the storm should be,” says Kudym. “That was a ‘design thunderstorm.’ We also had a ‘design general storm,’ which is more of a prolonged storm rather than intense rain over a short period of time.

“We’ve used the 1969 rainfall event to help guide both the temporal distribution and the space distribution of that rainfall amount. Then we did some statistics to figure out what the 100-year, 72-hour rainfall event would be in terms of precipitation, and that’s about 7 inches of rain over a three-day period.”

Kudym’s firm has used both design storms to examine the risks associated with those types of rainfall events and the runoff that results from them, using the MIKE FLOOD software for the modeling.

Looking ahead, Kudym says HDR’s intent is to define the problem but not engage in the politics of solving the problem in terms of infrastructure. “That’s one of the things that really caused problems on the last project,” he says. “People felt they hadn’t completely identified what the problem was, yet they were trying to solve it. We are providing them with all the information we think they are going to need in order to move forward with that next step. They will hire someone else to come in and analyze what we’ve done and come up with some potential solutions in terms of structural or nonstructural mitigation.”

Burlington, VT

The University of Vermont is engaged in two research efforts that are part of a larger project, “Redesigning the American Neighborhood,” funded by an EPA grant the university received in late 2003.

Alex Hackman is a research assistant working under the direction of Dr. William Breck Bowden at the Rubenstein School of Environment and Natural Resources at the university in Burlington. “We are running two separate but related research projects,” says Hackman. “The first involves storm-event sampling, using an Isco autosampler, and discharge monitoring in a small stream running through a residential neighborhood. We have an Onset HOBO data-logging rain gauge there to help us relate local precipitation to changes we observe in the stream flow.”

The second project involves an assessment of fundamental ecological functions in three stormwater-impaired streams and three streams matched for size, substrate, canopy cover, drainage area, and other factors that are in attainment condition under the state biocriteria standards.

“We’re running continuous whole-stream metabolism experiments at each site and using a HOBO sensor to monitor the photosynthetically active radiation [PAR],” says Hackman. The university also has two additional rain gauges, which provide an estimate of area-wide rainfall distribution.

“We’re doing continuous discharge monitoring at each of these six ‘functional assessment’ sites, and that rainfall data allows us to better understand the rainfall-runoff-stream flow dynamics,” adds Hackman. “We are also running solute injection experiments at those six sites to evaluate nutrient spiraling.”

In the residential project, the university has set up two monitoring stations: one at the top of the neighborhood where the tributary enters, and one at the bottom of the tributary where it leaves the neighborhood.

“The intent is to calculate pollutant loading associated with the neighborhood,” Hackman says. “We look at what is coming in and what is going out that’s associated with the neighborhood itself to establish a comprehensive assessment of baseline conditions at this location so we can track changes in the future.”

The Onset data-logging rain gauge allows the university to accurately relate local rainfall patterns to changes in stream flow observed at the site. “We do get a lot of rain here and the nearest rain gauge we are able to access is a couple of miles away at the Burlington International Airport,” so having the Onset gauge at its present location is helpful, Hackman says.

The university is working with local residents, the city, and a few agencies in a collaborative effort to discuss its findings and map out plans for better stormwater management. There are flooding problems in the area, though they are minor.

“Some folks have basement drains that go into this small tributary,” Hackman says. “When the water level gets high enough, we have backups in the basements. This is a fairly degraded, deeply incised, very small tributary, so we haven’t seen it flow over its banks and cause any area-wide flooding. The flooding is limited to people’s basements at this point. But it’s obviously important to the people and has caused some damage, so it’s something we talk about to the residents and we’re trying to help with that.”

In the second part of the project, the university originally planned to do two functional assessment experiments within the small tributary, involving whole-stream metabolism and an evaluation of nutrient uptake.

“What we found was the tributary was basically too small to run those experiments,” says Hackman. “At the same time, a lot of work has been going on in Vermont on developing new stormwater regulations. My advisor has been involved in working with the state in an advisory capacity to look at the new stormwater regulations, and one of the things they’ve been looking at is comparing water bodies that are in attainment or reference condition.

“We got the idea of expanding the functional assessment piece of our water-quality program into a broader geographic scope, and instead of six streams, we look at three larger streams that are on the state’s list of impaired waters for urban runoff, matching those three with three other streams that are in attainment condition for biocriteria standards.”

Hackman and his associates visited every stream within a 40-mile radius of Burlington looking for appropriate streams.

“It’s critical we have several variable controls for it,” he says. “We have tried to have all of our streams have similar canopy cover because sunlight is critical to the primary productivity studies we are running.”

For instance, the substrate type is critical, as are the drainage basin, stream size, and anticipated discharge.

“We couldn’t have any tributaries coming into our streams because of our solute injection experiment requirements,” adds Hackman. “After a month of hard looking, we found six streams we feel are very appropriately matched. At those six sites, we are doing a lot of continuous monitoring of the oxygen, temperature, conductivity, stage height, and also photosynthetically active radiation. It’s a very roundabout way of using the Onset equipment, but the PAR is a critical component of the whole-stream metabolism experiment.”

In the whole-stream metabolism experiment, the researchers are looking at the dissolved oxygen budget of each stream reach.

Another experiment determines the amount of oxygen being transferred to the water column from the atmosphere; researchers can then estimate any groundwater input that may affect the oxygen budget.
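The oxygen-budget idea described above can be illustrated with a simple single-station calculation: the observed rate of DO change is the sum of a biological term (photosynthesis minus respiration) and a reaeration term proportional to the saturation deficit, so subtracting an estimated reaeration flux isolates the biological signal. The sketch below uses hypothetical readings and a hypothetical reaeration coefficient; it is not the researchers' actual method.

```python
def metabolism_rates(do_series, do_sat, k_reaeration, dt_hours, depth_m):
    """Single-station open-channel estimate: at each time step, the net
    biological DO rate equals the observed rate of change minus the
    reaeration flux (k times the saturation deficit). Positive values
    indicate net production (photosynthesis exceeding respiration)."""
    rates = []
    for i in range(1, len(do_series)):
        d_do = (do_series[i] - do_series[i - 1]) / dt_hours  # mg/L per hour
        deficit = do_sat - do_series[i]                      # mg/L
        reaeration = k_reaeration * deficit                  # mg/L per hour
        net_bio = d_do - reaeration                          # biology only
        rates.append(net_bio * depth_m)                      # g O2/m^2/hr
    return rates

# Hypothetical hourly DO readings (mg/L) over part of a day
do = [8.0, 8.3, 8.9, 9.4, 9.6, 9.3, 8.8]
rates = metabolism_rates(do, do_sat=9.0, k_reaeration=0.2, dt_hours=1.0, depth_m=0.4)
```

Integrating the daytime rates gives gross primary production and the nighttime rates give respiration, which is why the PAR record matters: it fixes which hours count as daylight.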

“The PAR monitoring allows us to set the day/night cycle more accurately and also estimate the intensity of light at the various sites,” says Hackman.

With two rain gauges at two of the six sites, combined with the gauges at the neighborhood and airport sites, researchers have four rain gauges spread over a 30-mile radius.

“In that way, we are able to look at area-wide rainfall distribution,” says Hackman. “Because we are continuously monitoring these streams, we can also relate rainfall patterns to observed changes in stream flow.”

From both of the experiments, the university hopes to compare ecological processing behaviors and productivity of the matched streams.

“Additionally, at these six sites we’re doing rapid geomorphic assessments using the Vermont standard procedures and biological assessments—macroinvertebrate sampling,” says Hackman. “Our intent is to compare how these structural metrics of stream condition compare and contrast to the functional metrics.”

About the Author

Carol Brzozowski

Carol Brzozowski specializes in topics related to resource management and technology.