Data virtualization: Like rocket fuel for your mainframe

It goes without saying that mainframes are powerful. These computers can perform more operations per second than any other commercial system, which is why most banks (not to mention many government agencies, insurance companies, retailers and other businesses that manage massive amounts of data) rely on Big Iron for their indispensable data analytics functions.

And to say analytics are indispensable undersells their value. Data analysis is an integral part of the new economy, and any organization seeking a competitive edge needs an edge in analytics. Mainframes are well suited to deliver the speed leading companies are looking for, but many of those companies are still held back by their software.


When it comes to analyzing data, too many companies still use extract, transform, load (ETL) processes and software to draw information from other systems. ETL essentially copies data from its original host systems to the machine doing the analysis. And while mainframes have the storage space to host these copies, doing so misses the entire point of having such a musclebound machine. It’s like that old myth about how human beings use only 10 percent of their brains: a mainframe running ETL is taking advantage of only a small percentage of its own capacity.
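To make the copying overhead concrete, here is a minimal sketch of the extract, transform, load pattern. It uses Python's `sqlite3` module purely as a stand-in for the source system and the analytics machine; the table names and cents-to-dollars transform are invented for illustration, not taken from any real mainframe workload.

```python
import sqlite3

# Source system: holds the original transaction records.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE txns (id INTEGER, amount_cents INTEGER)")
src.executemany("INSERT INTO txns VALUES (?, ?)",
                [(1, 1250), (2, 399), (3, 10000)])

# Analytics system: the duplicate store that ETL must populate.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE txns_copy (id INTEGER, amount_dollars REAL)")

# Extract: read every row out of the source.
rows = src.execute("SELECT id, amount_cents FROM txns").fetchall()

# Transform: convert cents to dollars.
transformed = [(i, cents / 100.0) for i, cents in rows]

# Load: write the transformed copy into the analytics store.
dst.executemany("INSERT INTO txns_copy VALUES (?, ?)", transformed)

total = dst.execute("SELECT SUM(amount_dollars) FROM txns_copy").fetchone()[0]
print(total)
```

The point of the sketch is the middle three steps: every record exists twice, and the copy is stale the moment the source changes.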

But there is a newer way to analyze data without copying it first. That approach is called data virtualization, and it’s the perfect fuel for mainframes—or really any system that relies on real-time analysis of data.
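By contrast, a data virtualization layer queries the data where it already lives. The sketch below uses SQLite's `ATTACH DATABASE` as a loose analogy, with a separate database file standing in for a remote source system; the names are illustrative, and real data virtualization products federate far more heterogeneous sources than this.

```python
import os
import sqlite3
import tempfile

# A "remote" source system, represented here by a separate database file.
tmpdir = tempfile.mkdtemp()
src_path = os.path.join(tmpdir, "source.db")
src = sqlite3.connect(src_path)
src.execute("CREATE TABLE txns (id INTEGER, amount_cents INTEGER)")
src.executemany("INSERT INTO txns VALUES (?, ?)",
                [(1, 1250), (2, 399), (3, 10000)])
src.commit()
src.close()

# The analytics engine attaches the source and queries it in place:
# no extract step, no duplicate copy of the table, no stale data.
engine = sqlite3.connect(":memory:")
engine.execute("ATTACH DATABASE ? AS src", (src_path,))
total = engine.execute(
    "SELECT SUM(amount_cents) / 100.0 FROM src.txns"
).fetchone()[0]
print(total)
```

The transform happens inside the query itself, and the result always reflects the source system's current state.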
