The next big thing in computing could be systems to save the world from drowning in a rapidly rising sea of data, the head of deep computing at IBM told Reuters in an interview.

Dave Turek said powerful new tools were urgently needed to deal with the volume of data now generated by machines such as sensors, RFID systems and medical instruments -- as well as by people -- a flood that threatens to become overwhelming.

In 2006, the world generated, captured and replicated about 161 billion gigabytes -- about 3 million times the information in all the books ever written -- according to market research firm IDC.

IDC predicts that number will increase more than six-fold by 2010 as output from scientific institutions, the financial and medical sectors, and media file-sharing services such as YouTube accelerates.
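As a back-of-the-envelope check on those figures (a sketch only; the article gives no multiplier beyond "more than six-fold", so 6x is taken here as a lower bound), 161 billion gigabytes is 161 exabytes, and a six-fold increase lands near 1,000 exabytes, roughly one zettabyte:

```python
# Back-of-the-envelope check on the IDC figures quoted above.
# Assumption: "more than six-fold" is treated as a simple 6x lower bound.
GB_PER_EB = 1_000_000_000              # 1 exabyte = 1 billion gigabytes

data_2006_gb = 161_000_000_000         # "161 billion gigabytes" in 2006
data_2006_eb = data_2006_gb / GB_PER_EB

projected_2010_eb = data_2006_eb * 6   # "more than six-fold by 2010"

print(f"2006: {data_2006_eb:.0f} EB")                      # 161 EB
print(f"2010 (lower bound): {projected_2010_eb:.0f} EB")   # 966 EB, near one zettabyte
```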

"A lot of the time people talk about what's the next killer application, you know, voice over IP or something like that," Turek said, referring to the sector's constant search for the holy grail that will drive sales of a new technology.

"I actually think there is a new class of applications emerging that has the potential to change everything, and those are applications that I would put in the category of real-time stream processing."

Speaking at the International Supercomputing Conference in Dresden, Turek said simply storing all the data and analyzing it later was not a solution, since many industries needed to use the data to get fast answers.

"I think the issue isn't the cost of storage, I think the issue is the volume of data and the requirement to get answers quickly. Because you see if your model is: 'I'm going to generate data and then store it,' you're sort of wasting time," he said.
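The distinction Turek draws can be illustrated with a minimal sketch: a store-then-analyze loop only produces an answer after the full batch has been collected, while a streaming loop keeps a bounded window and has an answer ready at every step. This is a generic, hypothetical example, not a description of IBM's technology; the sensor_readings source is invented for the demonstration.

```python
import random
import statistics
from collections import deque

def sensor_readings(n):
    """Hypothetical data source standing in for a sensor or market feed."""
    for _ in range(n):
        yield random.gauss(100.0, 5.0)

def batch_analysis(n=10_000):
    # Store-then-analyze: every reading is kept, and the answer
    # only arrives after the whole batch has been collected.
    stored = list(sensor_readings(n))   # memory cost grows with data volume
    return statistics.mean(stored)

def stream_analysis(n=10_000, window=100):
    # Stream processing: each reading updates a bounded window and is
    # then discarded, so an estimate is available at any moment.
    recent = deque(maxlen=window)       # fixed memory, regardless of n
    for reading in sensor_readings(n):
        recent.append(reading)
        current_estimate = statistics.mean(recent)  # usable immediately
    return current_estimate

print(f"batch answer:  {batch_analysis():.2f}")
print(f"stream answer: {stream_analysis():.2f}")
```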

Turek cited the examples of financial traders wrestling with multiple data streams while making split-second decisions or high-tech medical institutions needing to make fast decisions on the basis of clinical data.
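A toy version of the trading case might look like the sketch below, where several hypothetical price feeds are merged in time order and a decision rule fires on each event as it arrives, rather than after the data has been stored. The feed names, prices and divergence threshold are all invented for illustration.

```python
import heapq
import random

def feed(name, n):
    """Hypothetical price feed yielding (timestamp, feed name, price)."""
    t = 0.0
    for _ in range(n):
        t += random.random()                 # irregular arrival times
        yield (t, name, 100 + random.gauss(0, 2))

# heapq.merge interleaves the already time-ordered streams, so the
# decision rule runs per event instead of over a stored batch.
streams = [feed("NYSE", 50), feed("LSE", 50), feed("TSE", 50)]
last_price = {}
for ts, name, price in heapq.merge(*streams):
    last_price[name] = price
    # Act the moment prices across all three feeds diverge too far.
    if len(last_price) == 3 and max(last_price.values()) - min(last_price.values()) > 5:
        print(f"t={ts:.2f}: divergence across feeds -> act now")
        break
```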

IBM, whose BlueGene/L was confirmed this week as the world's most powerful computer system for the fourth time running, has been researching the problem of real-time analysis for the past four years and has spent hundreds of man-years on the project.

"Nobody knows the answers to any of these questions. All I'm saying is that when you ask me about where the future's going, this as a thematic area is going to be a big area," he said.

"It may take two or three years to figure this out, but somebody will figure this out, and I think this will be a major, major part of what the industry's all about."