Read the Original Article at http://www.informationweek.com/news/showArticle.jhtml?articleID=231500603
The next time the Guadalupe River in Texas floods, IBM researchers may be able to warn area residents and officials beforehand.
Disaster recovery will be front and center for IT teams this week, given the earthquake that struck the East Coast Tuesday, plus the impending arrival of Hurricane Irene. (See our related coverage of how NYSE IT held up during the earthquake.)
IBM, working in conjunction with researchers at the University of Texas at Austin, has demonstrated the ability to predict river behavior 100 times faster than real-time. In theory, this should allow warnings about impending floods several days before rivers actually overflow.
Such flood prediction, like any weather modeling, requires significant calculating and analytical power. IBM's approach is particularly calculation-intensive because it includes river tributaries in its number-crunching. Traditional approaches to flood prediction focus on main river stems and overlook tributaries. Hence, the project's reliance on IBM Power 7 systems.
IBM has posted a video on YouTube that demonstrates its flood prediction technology.
"The challenge here is to be able to model the flow of water in very large scale river networks, like the Mississippi River Basin or even the whole continental United States," said David R. Maidment, professor of civil engineering at the University of Texas, Austin, in a phone interview. "And that capacity has existed for the weather community for a long time. Even in the oceans community...we can model how [hurricanes] move through the Gulf of Mexico. But there's never been any capability to model similar scale things in hydrology, especially when it comes to solving the real equations of river flow on a proper GIS representation of the rivers. That's what IBM has succeeded in doing."
Maidment says that models exist for the main stems of rivers, but not for the flows coming in through the tributaries. He notes that it was the tributaries that overflowed in the recent Mississippi flooding. "The flooding that happened down around Vicksburg, Miss., was because the tributaries couldn't get into the Mississippi," he said.
Maidment is working with IBM not only to improve flood modeling capabilities, but also to develop flood modeling as a service. Such services, he suggests, could be useful to keep emergency responders aware of moving flood boundaries and to warn residents of flooded areas when necessary.
Flood modeling as a service is being made possible by the development and adoption of a specification called WaterML, an XML-based markup language for communicating hydrology data. In recent years, the USGS and the National Weather Service have adopted WaterML and are publishing their water-related data in that format.
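To give a flavor of the format, the fragment below sketches how a WaterML-style document might convey a gauge reading as a time series. The element names, site code, and values here are simplified illustrations, not an exact copy of the WaterML schema.

```xml
<!-- Illustrative, simplified WaterML-style fragment (not the exact schema):
     one river gauge reporting a time series of water-level values. -->
<timeSeries>
  <sourceInfo>
    <siteName>Guadalupe River (hypothetical gauge)</siteName>
    <siteCode>00000000</siteCode>
  </sourceInfo>
  <variable>
    <variableName>Gauge height</variableName>
    <unit>ft</unit>
  </variable>
  <values>
    <value dateTime="2011-08-24T12:00:00">4.31</value>
    <value dateTime="2011-08-24T12:15:00">4.28</value>
  </values>
</timeSeries>
```

Because agencies such as the USGS and the National Weather Service publish in a shared structure like this, a modeling service can ingest gauge data from many sources without per-agency parsing code.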
"What's gradually happening is the water information of the country, particularly that related to flooding, is coming together in a common language," said Maidment. "And what we're doing with IBM is to develop what we're calling virtual observation points all along the river, where the values of the water level and the flow rate are being calculated rather than being measured."
Present flood-warning systems are based on only a handful of physical data collection points. The goal is to supplement those with widespread virtual observation points, giving responders far better situational awareness during floods.
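The idea of virtual observation points can be sketched in a few lines. The function below is a deliberately simplified stand-in: it estimates water levels at unmeasured points along a reach by linear interpolation between two physical gauges. The real system described in the article instead solves the physical equations of river flow on a GIS representation of the river network; the function name and parameters here are hypothetical.

```python
# Simplified illustration of "virtual observation points": estimate the
# water level at intermediate river locations where no gauge exists.
# A production model solves river-flow equations on a GIS river network;
# here we simply interpolate linearly between two measured gauges.

def virtual_levels(gauge_a_km, level_a_m, gauge_b_km, level_b_m, points_km):
    """Estimate water level (m) at each river-km in points_km, given
    measured levels at two gauges bracketing the reach."""
    span = gauge_b_km - gauge_a_km
    levels = []
    for km in points_km:
        frac = (km - gauge_a_km) / span  # fractional distance along reach
        levels.append(level_a_m + frac * (level_b_m - level_a_m))
    return levels

# Example: gauges at km 0 and km 10 read 2.0 m and 4.0 m; estimate
# levels at three virtual points in between.
print(virtual_levels(0.0, 2.0, 10.0, 4.0, [2.5, 5.0, 7.5]))
# → [2.5, 3.0, 3.5]
```

Even this toy version shows the payoff: a handful of physical gauges can drive estimates at arbitrarily many points, which is what lets a flood map track a moving boundary rather than a few dots.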