10 myths about enterprise adoption of Docker

At Docker’s DockerCon conference in San Francisco on June 23. Image Credit: Jordan Novet/VentureBeat

In the past 12 months, enterprises considering Docker have been presented with a daunting series of steps on their journey from virtualization to containerization. Docker has made the first step easy and free. For the enterprise, it’s the next steps where the pain begins.

In a market that’s been dominated by open source and developer-driven initiatives, separating the hype from reality has been difficult. Showcase early adopters such as Google, Twitter, Facebook, and Netflix have set an idealized vision that is far from the raw reality of the enterprise. That vision has been further bolstered by the promise of a developer-defined, microservice-dominated future.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1760573,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"cloud,dev,enterprise,security,","session":"D"}']

Approaching the post-hype era

Hype has been a critical ingredient to launch the container movement. It has driven significant investment and unprecedented attention to the potential of containerization.

But as we approach the post-hype era, a bigger challenge lies ahead: creating practical paths for the enterprise to evolve from virtualization to containerization.

The future will be different, for sure, but the lessons learned from VMware should not be lost on the likes of Docker, Google, CoreOS, and the rest of our peers. A hype-free, clear-headed approach to this problem must combine VMware-like control with Docker-enabled freedom, flow, and collaboration.

On the journey from virtualization to containerization: top 10 Docker myths

As your travel guide on this journey from virtualization to containerization, we’d like to destroy a few myths to make your journey easier:

Myth 1: IT operations is dead

IT ops is alive and well and has a role with Docker. Docker truly enables a separation of concerns, but that does not mean we shouldn’t be concerned with IT and operations anymore. What enterprises are looking for is VMware-like IT control that they can leverage to unlock the business value of Docker-enabled freedom, flow, and collaboration. The issues with the PaaS market have taught us that some level of infrastructure transparency is a requirement for most enterprise applications today. That lesson should not be lost on Docker.

Myth 2: Microservices are mandatory

Relax, “macroservices” are OK! The journey of a thousand services begins with just one container. The actual first step for many customers is to take their monolithic application and simply put it in a container. This provides the perfect opportunity to learn Docker, get familiar with the concept of services, and gain a foothold on a successful migration.
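
To make that first step concrete, here is a minimal sketch, assuming a hypothetical monolithic app whose build output sits in ./legacy-app, listens on port 8080, and already has a Dockerfile that simply copies it into a stock base image; every name here is illustrative, not a prescription:

    # Build one image containing the whole, unmodified monolith.
    # (Assumes a Dockerfile along the lines of:
    #   FROM ubuntu:14.04
    #   COPY ./legacy-app /opt/legacy-app
    #   CMD ["/opt/legacy-app/bin/start.sh"]
    # )
    docker build -t legacy-app:1.0 .

    # Run the entire application as a single container -- no decomposition yet.
    docker run -d --name legacy-app -p 8080:8080 legacy-app:1.0

    # Same app, now observable and manageable through Docker.
    docker logs -f legacy-app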

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1760573,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"cloud,dev,enterprise,security,","session":"D"}']

Myth 3: Enterprises are ready to load up the container ship

In Docker’s ideal world, everything is standardized and shipped in the same size containers. The reality is, however, that the enterprise today looks more like an ark — it has two of everything (at least). Enterprises run most applications behind the firewall, while some have made it to the cloud, and many still are hosted in colocation facilities or across multiple data centers. The hosts are diverse and need to be assigned and provisioned to groups, roles, and projects by business policy. We have some miles left in the journey before the enterprise puts full faith into the container ship.

Myth 4: Many small servers beat a few big hosts

There’s actually an emerging trend — almost retro in nature — towards consolidating servers running VMware and virtual machines down to a smaller number of bare-metal servers running containers with no VMware. While the initial vision of containers preached ephemeral microservices over hundreds of spot instances — very much a Netflix or Amazon Web Services model — the potential of further consolidation of servers and services on big iron and bare metal may be easier to manage and cheaper to deploy for many, many enterprises.

[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1760573,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"cloud,dev,enterprise,security,","session":"D"}']

Myth 5: VMware’s time is short

Not so fast! VMware has set the standard for IT management and has led the vision of software-defined everything. What enterprises really want is the best of both worlds: the control and confidence that VMware delivered, packaged in a less costly, more efficient, application-oriented way. It’s a long game, and we’re only in the top of the first inning. Sure, Docker has a bunch of developers on base (and a big-league budget, too), but it doesn’t look like it has scored any runs yet, as VMware’s pitching and defense are top-notch. Stay tuned.

Myth 6: Docker is not secure

Docker is getting more and more secure with every patch, and there are best practices that can make Docker security fit into enterprise policies today. The first step is to get Docker users out of the command line and to prevent them from needing privileged access on hosts to do their job. To make Docker secure, tie users to enterprise security services such as LDAP and use a Docker graphical user interface (GUI) or dashboard as a layer between Docker and the end users in development and IT. Make sure there is authenticated access across the management plane, for cluster and resource membership, and for service discovery. And keep an audit trail of activity on who does what and when.
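
Alongside those access-control practices, here is a small, hedged example of narrowing what any single container can do on the host at run time; the image name is hypothetical, but the flags are standard docker run options:

    # Least-privilege defaults for a container that does not need root:
    # run the process as an unprivileged user, drop all Linux capabilities,
    # and mount the container's root filesystem read-only.
    docker run -d --name legacy-app \
      --user 1000:1000 \
      --cap-drop ALL \
      --read-only \
      legacy-app:1.0

None of this replaces the LDAP integration, management layer, or audit trail described above; it simply limits the blast radius of any one workload.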

[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":1760573,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"cloud,dev,enterprise,security,","session":"D"}']

Myth 7: Developers want to deploy apps and define infrastructure

Not really. Most developers want to code, commit, and repeat. Devops methodologies put more responsibility on developers to deploy and own the results of their code. This is a good thing, but that doesn’t mean they’re all good at it or want to spend time on it. Yes, the concept of developer-defined infrastructure is sexy and transformative, but the enterprise reality is that there’s a large population of deployment engineers, configuration managers, and IT ops professionals who own it today and are good at what they do. They need to be brought into the process.

Myth 8: Enterprises are flocking to Google-scale data centers run with Borg-like automation

Enterprises run a heterogeneous mix of private data centers, clouds, hosted and legacy environments, and colocation facilities run with VMware-centric tools. That’s pretty far from microservice-based, cloud-native applications running on commodity hardware in a highly automated, software-defined environment orchestrated by Kubernetes or Mesos. Sure, perhaps we’ll all get there one day, but on the journey, containerizing as you go can provide immediate cost, efficiency, and business benefits.

[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":1760573,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"cloud,dev,enterprise,security,","session":"D"}']

Myth 9: Docker is a cloud-centric solution

Docker is running behind the firewall perhaps as much as it is running in public cloud providers such as Amazon or Google. In fact, the excitement around running Docker across on-premises data centers or on bare-metal servers is picking up major steam. The promise of containers is that they can run anywhere. You don’t need to move to the cloud. You can run your stuff where you are right now and decide what’s right for you over time. Cloud is cool. On premises is cool.

Myth 10: Docker will solve all the things

Docker cannot create all the pieces alone. Unlike the Hadoop (big data) and OpenStack (infrastructure management) movements, the journey from virtualization to containerization has the opportunity to transform something much bigger: how enterprises develop applications, consume infrastructure, and run IT. In fact, this is not Docker’s journey alone to lead. It belongs to the developer, devops, and IT communities, which must not only change the tools they use but, more importantly, transform the processes and the culture needed to build the next foundation for enterprise applications and IT infrastructure.

[aditude-amp id="medium5" targeting='{"env":"staging","page_type":"article","post_id":1760573,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"cloud,dev,enterprise,security,","session":"D"}']

Bob Quillin is chief executive and a cofounder of StackEngine.
