It goes without saying that mainframes are powerful. These computers can perform more operations per second than any other commercial system, which is why most banks (not to mention many government agencies, insurance companies, retailers and other businesses that manage massive amounts of data) rely on Big Iron for their indispensable data analytics functions.
And to say analytics are indispensable is underselling their value. Data analysis is an absolutely integral part of the new economy, and any organization seeking an edge needs an edge in analytics. Mainframes are a good match to provide the speed that leading companies are looking for, but many companies are still held back by their software.
When it comes to analyzing data, too many companies still use extract, transform, load (ETL) processes and software to draw information from other systems. ETL essentially copies data sets from their original hosts to the machine doing the analysis. And while mainframes have the storage space to host these copies, doing so misses the entire point of having such a musclebound machine. It’s like that old myth about how human beings use only 10 percent of their brains: A mainframe using ETL is taking advantage of only a small percentage of its own capacity.
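To make the copying concrete, here is a minimal sketch of the ETL pattern described above, using Python's built-in sqlite3 module and an entirely hypothetical `transactions` table: every row is extracted from the source and loaded into a separate analytics store before a single question is asked of the data.

```python
import sqlite3

# Hypothetical "source" system holding transaction records.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE transactions (id INTEGER, amount REAL, region TEXT)")
source.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                   [(1, 120.0, "EU"), (2, 80.0, "US"), (3, 200.0, "EU")])

# Extract: copy EVERY row out of the source, whether or not the
# analysis needs it -- this duplication is what ETL is built on.
rows = source.execute("SELECT id, amount, region FROM transactions").fetchall()

# Load: write the full copy into a separate analytics store.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE transactions (id INTEGER, amount REAL, region TEXT)")
warehouse.executemany("INSERT INTO transactions VALUES (?, ?, ?)", rows)

# Analyze the copy: total EU spend.
total = warehouse.execute(
    "SELECT SUM(amount) FROM transactions WHERE region = 'EU'").fetchone()[0]
print(total)  # 320.0
```

Note that the warehouse now holds a second full copy of the table, even though the analysis only ever touched the EU rows.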
But there is a new way to analyze data without copying it first. That system is called data virtualization, and it’s the perfect fuel for mainframes—or really any system that relies on real-time analysis of data.
What data virtualization does
Data virtualization extracts only those pieces of data that are necessary to run analytics, “virtualizing” the information rather than creating a full copy. This creates so many advantages over ETL in terms of speed, security and access that the mainframe will start to feel like it’s suddenly using 100 percent of its brain. Companies using mainframes without data virtualization are missing out on the main thing that makes mainframes so useful: their raw power.
Think about how much faster keyword search made research. Instead of poring over an entire document, wading through unnecessary facts, users could simply scan for a keyword and immediately access the pertinent information. That’s what data virtualization means for analytics: the ability to look only at the information applicable to the current operation. So, not only can a data virtualization system scan data much faster, but it can extract and store much more data at a time because each virtualized fragment is much smaller than a full copy. This lets a mainframe do a truly unprecedented amount of number-crunching in a very short time.
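By contrast, a virtualization-style access path pushes the question down to the source and pulls back only the fragment the analysis needs. A minimal sketch, using the same hypothetical `transactions` schema as a stand-in for the mainframe's own data store: no second copy of the table is ever made, and only the single result value travels.

```python
import sqlite3

# Hypothetical source system; in a real deployment this would be the
# mainframe's own data store, queried in place rather than copied.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE transactions (id INTEGER, amount REAL, region TEXT)")
source.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                   [(1, 120.0, "EU"), (2, 80.0, "US"), (3, 200.0, "EU")])

# Push the filter and the aggregation down to the source: the
# analysis touches only the rows it needs, where they already live.
total = source.execute(
    "SELECT SUM(amount) FROM transactions WHERE region = 'EU'").fetchone()[0]
print(total)  # 320.0
```

The answer is the same as with a full ETL copy; the difference is that nothing but the answer had to move.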
Speed and security
Faster analytics means more current information, which is the gold standard in big data. Until we can read the future, stakeholders want real-time information based on current data to make informed decisions for their business today and tomorrow. Data virtualization works so quickly that each virtualization is updated in real time as it changes at its source. In contrast, ETL creates a static copy of the data as it existed at the moment the operation began, creating a snapshot of a particular moment, which, when you consider the speed of analytics, is already in the distant past.
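The staleness problem is easy to demonstrate. In this sketch (hypothetical `balances` table, sqlite3 again), an ETL-style snapshot is taken, the source then changes, and the snapshot quietly keeps reporting the old value while a query against the source sees the current one.

```python
import sqlite3

# Hypothetical source system with a single account balance.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE balances (account TEXT, amount REAL)")
source.execute("INSERT INTO balances VALUES ('acct-1', 100.0)")

# ETL-style snapshot: a static copy taken at this moment.
snapshot = dict(source.execute("SELECT account, amount FROM balances"))

# ...then the source changes.
source.execute("UPDATE balances SET amount = 250.0 WHERE account = 'acct-1'")

# The snapshot still reports the old balance; querying the source
# in place returns the current one.
live = source.execute(
    "SELECT amount FROM balances WHERE account = 'acct-1'").fetchone()[0]
print(snapshot["acct-1"], live)  # 100.0 250.0
```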
Data virtualization also offers a major advantage in security. Recall that ETL works by making copies—and how good an idea is it to scatter multiple copies of sensitive information across systems? A data virtualization system extracts only the data it needs to analyze, rather than copying the entire file, and stores it in a data vault. Even if a hacker were to breach a vault powered by a mainframe, the virtualized data never really left its original system, so, technically, the source data itself was never breached.
Too good to ignore
Ongoing upgrades are a fact of life, not only in the tech world but in all aspects of business and industry. But knowing which upgrades to make is a better strategy than racing to grab every new option that comes along. The jump from ETL to data virtualization is so dramatic that I’d call it mandatory. If a mainframe is the most powerful computer available for data analysis, then you need data virtualization to see just how mighty Big Iron really is.
This article is published as part of the IDG Contributor Network.