The cloud continues to grow in popularity as businesses look to take advantage of digital trends. However, the term “cloud” means different things to different types of organizations.
In the small business segment, cloud likely means software as a service (SaaS), as those organizations want turnkey applications offered on a pay-as-you-go model. For midsize companies, cloud typically means public infrastructure as a service (IaaS), such as Amazon Web Services or Microsoft Azure.
Private clouds are alive and kicking
For large businesses, the cloud likely means hybrid, where private data centers make up most or even all of the cloud infrastructure. The ZK Research 2018 Global Cloud Forecast projected that by 2020, there would be more workloads in private clouds than in public clouds or running on legacy on-premises infrastructure. (Note: I am an employee of ZK Research.)
I predict that the adoption of private clouds will drive a rise in demand for co-location services as companies look to build infrastructure quickly without the complexity and risk of tying together all the components of a modernized data center.
Co-location providers have gaps with visibility and control
While co-location providers address a number of the challenges associated with running a data center, there are gaps in the areas of visibility and control. Critical applications run in these facilities, and when there is a problem, IT professionals need to see what is going on and have the right information to take quick action and fix it.
This has been an ongoing problem ever since there have been data centers. In the early 2000s, things were done ad hoc by looking through log files, scrubbing them, and searching for trends manually. This worked for small deployments, but it certainly didn’t scale, so the Information Technology Infrastructure Library (ITIL) was developed: a detailed set of best practices for managing the data center. ITIL certainly helped provide visibility into data center operations and the actions needed to optimize performance, but the environment has since changed.
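To make the contrast concrete, the manual trend-spotting described above is the kind of task that is trivially scriptable. Below is a minimal sketch; the log lines and their format are hypothetical, standing in for whatever a real facility would write to disk.

```python
from collections import Counter

# Hypothetical sample of data-center log lines; in practice these
# would be read from log files on disk.
LOG_LINES = [
    "2004-03-01 02:14:05 ERROR disk controller timeout",
    "2004-03-01 02:15:11 WARN  fan speed low",
    "2004-03-01 02:17:42 ERROR disk controller timeout",
    "2004-03-01 03:01:09 INFO  backup completed",
    "2004-03-01 03:22:30 ERROR power supply fault",
]

def error_trend(lines):
    """Count ERROR entries per hour, surfacing the trends a human
    would otherwise have to spot by scrubbing logs by hand."""
    counts = Counter()
    for line in lines:
        parts = line.split()
        if len(parts) >= 3 and parts[2] == "ERROR":
            hour = parts[1].split(":")[0]  # e.g. "02"
            counts[hour] += 1
    return counts

print(error_trend(LOG_LINES))  # Counter({'02': 2, '03': 1})
```

Even a script this simple beats eyeballing files, but it still only works for one log format in one facility, which is exactly the scaling problem that pushed the industry toward formalized practices like ITIL.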
ITIL is no longer enough
Technology evolution is like a train that keeps rolling, faster than ever. ITIL practices that worked a decade ago don’t suffice today. Several new technologies, such as machine learning, artificial intelligence (AI), and blockchain, are disrupting data centers.
To many, those technologies may seem futuristic and relevant only to leading-edge companies. The fact is, though, technologies like AI are coming quickly, and businesses that don’t adopt them risk becoming extinct in the blink of an eye.
Automation to the rescue
The reliance on data centers and the growth in data, combined with the speed at which businesses operate today, mean that manual troubleshooting and remediation are too slow to be effective and put businesses at risk. What’s required now is better automation that can make day-two operations almost autonomous. Ideally, the data center provider would have API access to the infrastructure, enabling it to interoperate with public clouds so customers can migrate data or workloads from cloud to cloud if they choose.
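To illustrate what near-autonomous day-two operations might look like, here is a minimal sketch of a remediation loop driven by a provider API. The `ColoAPI` class and its methods are hypothetical stand-ins, not any vendor’s real interface; a real client would issue authenticated HTTP requests.

```python
class ColoAPI:
    """Hypothetical stand-in for a co-location provider API client.
    Real implementations would call REST endpoints over HTTPS."""

    def __init__(self):
        # Simulated workload inventory and health status.
        self._workloads = {"web-1": "healthy", "db-1": "degraded"}

    def list_workloads(self):
        """Return a snapshot of workload names and their status."""
        return dict(self._workloads)

    def restart(self, name):
        """Restart a workload; here we just mark it healthy."""
        self._workloads[name] = "healthy"

def remediate(api):
    """Restart every workload the API reports as degraded and
    return the names that were acted on -- the kind of step that
    today often waits on a human reading a dashboard."""
    fixed = []
    for name, status in api.list_workloads().items():
        if status == "degraded":
            api.restart(name)
            fixed.append(name)
    return fixed

api = ColoAPI()
print(remediate(api))        # ['db-1']
print(api.list_workloads())  # both workloads now report healthy
```

The point of the sketch is the shape of the loop, not the specifics: once infrastructure state is queryable through an API, routine remediation becomes a policy a program can enforce rather than a ticket a person works.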
Visibility is also a key requirement. Unlike visibility tools of the past that showed a simple up/down status, what’s required today is a complete view of public and private cloud assets across all infrastructure, down to the component level. This will not only benefit operations but also provide a view into any area that could cause a future problem or pose a security risk.
For most large enterprises, private clouds are as important as, if not more important than, public clouds. Co-location providers such as QTS, Digital Realty, and Equinix offer an excellent alternative to building these in house and are worth looking into. Some are starting to use these capabilities as a point of differentiation.
On its most recent earnings call, QTS CEO Chad Williams stated, “Our Service Delivery Platform, or SDP, remains a key differentiator and further establishes QTS as a lead innovator within the hybrid colocation market.”
It’s good to see the focus shift away from speeds and feeds toward software that provides visibility and the ability to automate operations. This will minimize downtime without chewing up valuable IT time.