Sunday, January 23, 2011

HAZUS-MH CDMS Training

The Indiana Department of Homeland Security is sponsoring a free three-day L317 Comprehensive Data Management for Hazus-MH course from Tuesday, May 24 to Thursday, May 26, 2011, on the Valparaiso University campus in northern Indiana. It is unclear how, or whether, IDHS will support non-Indiana participants who want to attend this course.

This course focuses on updating the HAZUS-MH inventory. It is recommended for GIS analysts, database administrators, and others who will be responsible for migrating local data into the HAZUS-MH database structure.

The course provides an overview of the methodologies that were used to develop and compile the HAZUS-MH provided inventory. However, emphasis is placed on the use of the Comprehensive Data Management System (CDMS) and other tools developed by FEMA for HAZUS-MH users interested in improving the accuracy of hazard loss estimations by integrating local building, infrastructure, and population related data into the HAZUS-MH analysis process. Numerous hands-on exercises will empower participants with the skills to use CDMS as well as other tools to update both HAZUS-MH state geodatabases and study regions. The course will also help participants effectively prepare for their own data updating projects by identifying those inventory elements that have the most impact on the estimation of losses for flood, earthquake, and hurricane events.



Saturday, January 15, 2011


The USGS Multi Hazards Demonstration Project (MHDP)’s second full scenario, called ARkStorm, addresses massive U.S. West Coast storms analogous to those that devastated California in 1861–62.

The ARkStorm is patterned after the 1861–1862 historical events but uses modern modeling methods and data from large storms in 1969 and 1986. The ARkStorm draws heat and moisture from the tropical Pacific, forming a series of Atmospheric Rivers (ARs) that approach the ferocity of hurricanes and then slam into the U.S. West Coast over several weeks.

HAZUS-MH was used extensively for the ARkStorm analysis.

MORE INFO: USGS Multi-Hazard West Coast Winter Storm Project

MORE INFO: Overview of the ARkStorm Scenario

The overview report summarizes a winter storm scenario called ARkStorm (for Atmospheric River 1,000). Experts have designed a large, scientifically realistic meteorological event, followed by an examination of the secondary hazards (for example, landslides and flooding), physical damage to the built environment, and social and economic consequences. The hypothetical storm depicted here would strike the U.S. West Coast and be similar to the intense California winter storms of 1861 and 1862 that left the Central Valley of California impassable. The storm is estimated to produce precipitation that in many places exceeds levels experienced on average only once every 500 to 1,000 years.
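Those "once every 500 to 1,000 years" figures describe annual exceedance probabilities, not a guaranteed interval between events. As a hedged illustration (this calculation is not part of the USGS report; it is the standard return-period arithmetic, assuming independent years), the chance of seeing at least one such event within a given planning horizon works out as follows:

```python
# Probability of at least one exceedance of a T-year return-period event
# within an N-year horizon, assuming each year is independent (a common
# simplification, not a claim from the ARkStorm report itself).
def exceedance_probability(return_period_years: float, horizon_years: int) -> float:
    annual_p = 1.0 / return_period_years          # e.g. 1/500 = 0.002 per year
    return 1.0 - (1.0 - annual_p) ** horizon_years

# Even a "500-year" precipitation event has a non-trivial chance of
# occurring within a typical 30-year mortgage or planning window:
print(round(exceedance_probability(500, 30), 3))   # ~0.058
print(round(exceedance_probability(1000, 30), 3))  # ~0.030
```

In other words, a 1-in-500-year event has roughly a 6% chance of happening at least once over any 30-year span, which is why such scenarios matter for planning despite their rarity.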

Report: Overview of the ARkStorm Scenario (.pdf)...


Thursday, January 13, 2011

DHS Adopting Integrated Risk Management Approach

A recent article in the Public Entity Risk Institute newsletter discusses DHS's evolving policy of working with its partners to use Integrated Risk Management as an approach to address the uncertainty inherent in its complex mission space, and to help make the tough decisions necessary to keep the nation resilient and secure with limited resources. The policy is based on the premise that partnerships enable the most effective risk management. READ MORE ...


Monday, January 10, 2011

In the News: Devils Lake Threat to Minnewaukan, ND

Christina Cummings, a graduate student in UND’s Department of Geography, has spent the past two years writing her master’s thesis, which uses HAZUS-MH to assess the flooding threat Devils Lake poses to Minnewaukan, ND. When completed, her work will become part of the Devils Lake risk assessment study. READ MORE


Monday, January 3, 2011

First Release of OpenQuake Engine

Version 0.2 of OpenQuake has just been released; it is the first public release in a series leading up to a comprehensive computational engine that will power seismic hazard and risk assessment applications. The current version is an alpha aimed at developers, and new versions will be released every three months with new or expanded features and functionality. Initial user interfaces are planned for the v0.3 release in March 2011, to better enable the user community to participate in the OpenQuake project.

OpenQuake was created as part of the global collaborative effort GEM (Global Earthquake Model), as the engine for GEM’s risk assessment platform, OpenGEM. By 2013, OpenGEM aims to serve a full spectrum of users in assessing and modeling earthquake risk, and in communicating it through maps and other shareable outputs. OpenQuake’s initial development is driven by requirements emerging from the models, databases, and standards being developed by scientists and practitioners around the world within the scope of GEM.


Global Earthquake Model (GEM)



Can Hurricanes Trigger Earthquakes?

Scientists from the University of Miami presented a theory at the American Geophysical Union meeting last month linking the devastating Haiti earthquake in January 2010 to strong tropical storm systems that struck the region in 2008.

The hypothesis put forth is that the mass of sediment removed from Haiti’s uplands (and deposited in adjacent lowlands) influenced the stresses on the Léogâne Fault zone enough to cause it to rupture, resulting in the devastating Jan. 12, 2010 earthquake. The cause of such rapid erosion, according to the abstract, was the combined effects of two hurricanes and two tropical storms in 2008 on a severely deforested landscape.



Natural disasters 'killed 295,000 in 2010'

In its annual report, reinsurer Munich Re says the Haiti earthquake and floods in Pakistan and China helped make 2010 an exceptionally severe year for natural disasters, killing 295,000 people and causing $130 billion in losses.

The last time so many people died in natural disasters was in 1983, when 300,000 people died, mainly due to famine in Ethiopia.



Report Summary