Welcome to the rise of the digital enterprise, where vendors and customers engage through applications, and data is the new currency. Digital enterprises operate with radically different datanomics than conventional physical businesses. Here, digital information is the business.
A successful digital enterprise constantly updates its applications in response to user context, market and environment, all of which are quantified, measured and delivered as data. Everything and everyone is personified by a digital footprint. Learning that a user just bought a house requires switching the recommendation from renters insurance to home insurance. A competitor's price cut or a new promotion demands a fast response. It is increasingly clear that a company's ability to generate high-quality apps more rapidly is a critical differentiator. Fast is the new big.
In response to rapidly changing business requirements, application developers have adopted the Continuous Integration and Continuous Delivery (CI/CD) model, in which applications are constantly updated, tested and released. The key to delivering higher-quality applications faster is instant availability of the data needed to develop and test them.
And data is rapidly getting bigger, more distributed and far more continuous. The traditional operational model of managing and delivering data to developers through a manual “supply chain” fails miserably at this pace. Digital enterprises instead operate in a DevOps model, creating a more agile relationship between the “data vendor” (operations) and the “data consumer” (developers). This DevOps model needs very different technology to manage, protect and access data to meet the demands of speed, scale and datanomics. That need gave birth to data virtualization, in which operations manage data as a digital infrastructure.
IT transitions to support the new datanomics model
In the first blog, we asked the question: How did radical changes in datanomics drive the adoption of data virtualization technologies, resulting in the complete disruption of the storage industry? Let’s look at a few case studies.
Rosetta Stone, the global leader in language learning, transitioned from delivering CDs and DVDs to online offerings, resulting in an explosion in the amount of data that needed to be stored. First, there was the practical problem of not enough physical space in the data center for all the storage systems required. Then there was the harder problem of managing, protecting and making this data available to developers as fast as possible.
Rather than continuing to treat data as a physical entity where more data meant more storage, Rosetta Stone chose to virtualize its data, eliminating redundancies to leave a significantly smaller footprint of unique data that could be reused across multiple use cases. Data virtualization also decoupled its data from the underlying infrastructure, letting the company create a “data cloud” and move any part of its operations to the cloud with ease. The result? A more agile business, a development team that delivers applications faster and an operations team that manages data as a service on any infrastructure.
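The footprint reduction described above comes from the core idea behind data virtualization: store each unique block of data exactly once and let many virtual copies reference it. The following is a minimal, hypothetical sketch of that idea in Python (all class and variable names are invented for illustration; real products implement this at the storage layer, not in application code):

```python
import hashlib

class BlockStore:
    """Content-addressed store: each unique block is kept exactly once."""
    def __init__(self):
        self.blocks = {}  # sha256 digest -> block bytes

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)  # dedupe: store only if new
        return digest

class VirtualCopy:
    """A dataset 'copy' is just an ordered list of block references."""
    def __init__(self, store: BlockStore, blocks: list):
        self.store = store
        self.refs = [store.put(b) for b in blocks]

    def read(self) -> bytes:
        return b"".join(self.store.blocks[r] for r in self.refs)

store = BlockStore()
base = [b"users", b"orders", b"logs"]
prod = VirtualCopy(store, base)                   # production dataset
dev = VirtualCopy(store, base)                    # dev copy: adds no new blocks
test = VirtualCopy(store, base + [b"fixtures"])   # test copy adds one block

# Three full "copies" of the data, yet only four unique blocks are stored.
print(len(store.blocks))  # 4
```

Each additional developer or test environment gets a complete, readable copy of the dataset, but only changed blocks consume new storage, which is why the unique-data footprint stays small no matter how many consumers self-serve their own copies.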
Oh, and they eliminated a significant footprint of storage infrastructure in their data center.
Lifescript, a women’s health website, went through a similar transformation as it became a digital enterprise. It began the journey by moving to an all-flash storage data center model to accelerate data access. To further accelerate access, management and protection of its rapidly growing data, it adopted data virtualization technology to support its DevOps initiative. Its developers now have self-serve, instant access to data, while the operations team delivers agile, resilient data as a service.
Oh, and they eliminated a majority of their spinning disk storage infrastructure and reduced the overall storage footprint.
The datanomics of a digital enterprise require data virtualization technology, which lets operations teams deliver self-serve, instant access to data for their consumers, anywhere, on any infrastructure. The resulting reduction in storage footprint, the decoupling of data from infrastructure and the operational ease of managing the data life cycle have massively disrupted the storage infrastructure industry.
Fast, secure data access: the only game in town
With every aspect of our world becoming digital, analytics, machine learning and deep learning applications are rising rapidly to help businesses serve their users better and operate more efficiently. In this journey, data becomes the single most sustainable competitive advantage for any business. And delivering this data as a service using data virtualization technology has become the new normal.
This article is published as part of the IDG Contributor Network.