BlueGene/L Supercomputer in Livermore, Calif. Reuters

As Hurricane Sandy barrels her way up the Eastern Seaboard, supercomputers running high-speed analysis software are preparing models of the potential damage and its cost.

While nobody can know ahead of time just how much destruction a Category 1 hurricane like Sandy will do, companies that specialize in catastrophic risk analysis are armed with historical data, as well as links to national computer centers that track the storm in real time.

Last year's Tropical Storm Irene caused $4.3 billion in damages, the Insurance Information Institute concluded. Damage from Hurricane Sandy, which hasn't even made landfall, could be higher, especially because its storm surge is predicted to be much bigger.

CoreLogic, a private forecaster, said as many as 284,000 New York-area properties could see as much as $88 billion in damage from a high Sandy storm surge. Eqecat, a private forecaster in Oakland, Calif., estimates damage from Sandy will range between $10 billion and $20 billion.

“That storm wreaked havoc with people's lives,” said New York Gov. Andrew Cuomo on Monday, referring to Irene. “People need to take warnings about Sandy seriously.”

Indeed, if the hurricane makes landfall in New Jersey, a surge could damage refineries owned by major oil companies like Exxon Mobil Corp. (NYSE: XOM) that sit close to the shoreline, hurting U.S. energy supplies.

The U.S. government itself, through the National Weather Service and the National Oceanic and Atmospheric Administration, has access to massive computing centers, which also crunch data from low-orbiting satellites, buoys floating in the Atlantic and Caribbean, and aircraft flying over the storm.

The Weather Service's dire predictions for Sandy are what triggered last week's warnings, as well as the weekend's evacuation of coastal areas along the Eastern Seaboard, the shutdown of mass transit in Washington, D.C., Philadelphia and New York, the closing of the New York Stock Exchange and Nasdaq, and other fallout.

There's a huge market for the kind of data that relates to storms and damage, said a spokesman for the Cornell Theory Center, a national supercomputer center at Cornell University.

A handful of private catastrophe risk analysis companies, such as Eqecat, Risk Management Solutions and AIR Worldwide, specialize in the sector. Their top clients are insurance companies, along with some government agencies.

Indeed, ahead of Sandy, leading casualty insurance companies like Allstate (NYSE: ALL), Hartford Financial Services (NYSE: HIG) and private Fireman's Fund are writing contracts with reinsurance brokers and companies based on the latest findings.

With reports putting Sandy's winds at 75 miles per hour, insurance companies are seeking to lay off as much of their risk as possible. That provides new business for the reinsurers, which are mainly based in Bermuda.

That may be hard for them to do if Sandy's force escalates and she makes a direct hit on places like New York City or Long Island's Nassau and Suffolk counties, which were badly damaged in 1938 when far fewer people lived there.

Insurance and reinsurance companies are keeping a close eye on this event, a meteorologist with Eqecat in New Jersey said in an interview. Insurers had seen a soft market for four years until last year's Midwest floods, and then Irene brought a flood of new claims.

Lines are buzzing right now among modelers like Eqecat, insurance companies, insurance brokers and reinsurance companies, representatives said.

Eqecat and its competitors are able to tap into publicly available data furnished by the National Weather Service and NOAA, Eqecat representatives added.

The public data, generated by supercomputers from makers like IBM (NYSE: IBM), Cray (Nasdaq: CRAY), Silicon Graphics International (Nasdaq: SGI) and NEC (Tokyo: 6701), is then fed into proprietary models. The software that crunches it comes from the manufacturers, as well as from Oracle's (Nasdaq: ORCL) Sun Microsystems unit.

Then, using internal analysis, the providers run stochastic models of how Sandy might cause damage, drawing on historical data from prior Eastern Seaboard hurricanes.

“We look at what has happened and what will happen,” an Eqecat representative said. Using stochastic data, or data that measures random variables, experts can try to predict how Sandy will move and what kind of damage she could inflict on the coast of North Carolina or New Jersey, then pass the findings on to clients.
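In broad strokes, that kind of stochastic analysis resembles a Monte Carlo simulation: draw many random storm scenarios from distributions fitted to historical hurricanes, apply a damage function to each, and read the range of likely losses off the results. The sketch below is purely illustrative; the distributions, damage curve and every coefficient are invented for the example and do not reflect Eqecat's proprietary models.

```python
import random
import statistics

# Purely illustrative: every distribution and coefficient below is invented
# for the example and does not come from any real catastrophe model.
N_SCENARIOS = 100_000
random.seed(1)

def simulate_loss_bn():
    """Draw one random storm scenario and return its loss in billions of dollars."""
    wind_mph = random.gauss(75, 15)           # landfall wind speed (random variable)
    surge_ft = max(0.0, random.gauss(8, 3))   # storm-surge height (random variable)
    # Toy damage function: losses grow steeply with both wind and surge.
    return 0.002 * max(0.0, wind_mph - 40) ** 2 + 0.15 * surge_ft ** 2

losses = sorted(simulate_loss_bn() for _ in range(N_SCENARIOS))
print(f"median simulated loss: ${statistics.median(losses):.1f}B")
print(f"90th-percentile loss:  ${losses[int(0.9 * N_SCENARIOS)]:.1f}B")
```

Real models replace these toy draws with hazard, vulnerability and exposure modules calibrated to decades of storm and claims data, but the principle, running thousands of randomized scenarios rather than one forecast, is the same.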

A spokesman for the Property Casualty Insurers Association of America said the industry group can't issue predictions, only data collected after an event, such as the August 2011 magnitude-5.8 Virginia earthquake or Hurricane Katrina.

The catastrophe modelers also maintain proprietary software for other events, such as earthquakes, fires, tornadoes, thunderstorms and terrorism. That's why real-time computing and software are crucial to predictions, and why storm forecasts are far more accurate than in the past.

Michael Kistler, a civil engineer with Risk Management Solutions, said his firm specializes in estimating the probability of an event occurring and its intensity, but can't necessarily predict exact numbers.
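That distinction, probability and intensity rather than a single number, is commonly expressed in the industry as an exceedance probability: for each loss level, the estimated chance that losses exceed it. A minimal sketch in the same illustrative vein as the earlier example (again, the numbers are invented, not drawn from RMS):

```python
import random

# Regenerate the toy simulated losses (in $B) from the earlier sketch.
random.seed(1)
losses = [0.002 * max(0.0, random.gauss(75, 15) - 40) ** 2
          + 0.15 * max(0.0, random.gauss(8, 3)) ** 2
          for _ in range(100_000)]

def exceedance_probability(losses, threshold_bn):
    """Fraction of simulated scenarios whose loss exceeds threshold_bn ($B)."""
    return sum(1 for loss in losses if loss > threshold_bn) / len(losses)

# Read off the chance of exceeding several loss levels.
for t in (5, 10, 20, 50):
    print(f"P(loss > ${t}B) = {exceedance_probability(losses, t):.1%}")
```

Presenting results as a curve of probabilities rather than a point estimate is why modelers can quote a range, like Eqecat's $10 billion to $20 billion for Sandy, without claiming to know the exact outcome.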

In the 1980s, the U.S. government set up centers at Cornell, Princeton, the University of Illinois at Urbana-Champaign and the University of California, San Diego to create a national supercomputer network. The network was later extended to the University of Pittsburgh and the University of Nevada, Las Vegas.

Last year, the National Security Agency, which monitors global communications traffic, started work on a new supercomputer center at its headquarters at Fort Meade, Md., at a cost of nearly $900 million.