We hate to break up the party, but for those who haven't yet noticed, the great debate between private and public cloud deployments is drawing to an end. Gartner predicts that 50 percent of large enterprises will have hybrid cloud deployments by the end of 2017. Here's why hybrid cloud deployments will win the day:
1. The “security issue” becomes manageable: Sensitive data is maintained behind the firewall, while less sensitive data is released to a public cloud.
2. Balanced workloads and related costs: It's much cheaper for companies to burst peak loads to a public cloud than to plan a complete transition to one. From a financial perspective, enterprises will maximize their existing data center investments.
3. Regulations and geography: Complete flexibility to specify what data is stored where and under which terms and conditions.
4. Heterogeneous DevOps teams: Cloud deployments will be less reliant on a single elite team to build and maintain every element of the deployment.
So going hybrid makes sense for most enterprises based on expense, flexibility, and dedicated resources, both human and non-human. But a significant gray area remains around application deployment interoperability, which companies will need to resolve. They can do so by addressing the following questions:
- Where can I see my complete deployment?
- Which resources are being used and by whom?
- Which elements of the deployment should be moved to a public facility vs. kept on-premises?
- What are the direct, fixed, and variable cost types?
- What would the deployment configuration look like, and what will it take to ensure SLA targets are met?
And the list goes on.
This is where granular visibility into multi- and hybrid cloud environments becomes essential. Governance and control monitoring solutions will encompass entire cloud deployments and provide chief information officers (CIOs) with key performance indicators (KPIs) such as levels of security, availability, utilization, cost, and performance.
CIOs will use these KPIs to make informed decisions about workload placement, capacity purchasing, budget planning, backup, and business continuity strategy. Connecting these metrics with financial metrics like revenue will, for the first time, give CIOs insight into existing cloud investments and forecasted cloud resource requirements.
Picture this scenario: a CIO purchases just the right amount of capacity from the right mix of cloud vendors based on anticipated workload spikes from an upcoming marketing campaign.
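To make that concrete with purely hypothetical numbers: if a campaign is expected to triple traffic for two weeks, a company that normally runs 40 virtual machines on-premises would need roughly 80 additional instances for the duration. At an assumed on-demand rate of $0.10 per instance-hour, that burst costs about 80 × 336 hours × $0.10, or roughly $2,700, versus the capital expense and lead time of adding permanent data center capacity that would sit mostly idle once the campaign ends.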
Clearly, choosing that right mix of vendors is no easy feat. In the public cloud business, the war is on among Amazon Web Services, Google Compute Engine, and Microsoft Azure, with many other big vendors also joining the front lines.
OpenStack, VMware, Citrix's CloudStack, Joyent's on-premises offering, and others are also battling it out for the private-cloud business. OpenStack is the open-source rising star, while the others are proprietary platforms. Some, like VMware, are quite expensive, and most address only a fraction of a typical customer's needs. Though less mature, OpenStack is the fastest-developing and most widely accepted platform. In fact, multiple public-cloud vendors (Rackspace, HP, IBM, etc.) use OpenStack's application programming interface (API) to manage their own clouds and present this as a competitive advantage.
While OpenStack itself is free, building and maintaining an OpenStack cloud is not. CIOs wishing to build their own cloud must either work with a dedicated systems integrator or hire in-house talent, and DevOps engineers capable of establishing and running private cloud data centers are as rare and expensive as gold dust. Early adopters include large multinational banks, Expedia, Yahoo, eBay, HP, IBM, Cisco, and thousands of others.
In dynamic organizations with distributed development, where new application releases are deployed to the cloud every day, agility is the name of the game. CIOs are seeking tools that will take them beyond operational monitoring dashboards and guide them on how to resolve problems faster.
This is where companies like Cloudyn accelerate the decision-making process. Last week, we announced a natural extension of our existing platform to support hybrid clouds. We are the first and only provider of multi- and hybrid cloud monitoring and optimization solutions. Our self-service technology helps CIOs view, project, and plan their cloud costs and consumption. This includes comparing workloads between clouds based on a business's unique deployment, showing KPI trends, and allocating cloud resources between environments.
Sharon Wagner is co-founder and chief executive of Cloudyn, a provider of cost monitoring and optimization for cloud deployments. Before founding Cloudyn, Sharon was a senior principal consultant for CA Technologies' Cloud Connected Enterprise business unit.