The technology industry is entering a phase of transformation as significant as the advent of the Internet itself: It is about software defining everything.
As Marc Andreessen noted in 2011, “software is eating the world.” More and more every year, our life is becoming software: our communication with others is software, our productivity comes from software, our entertainment is software, our maps are software, our health is software, our wars are software, even our cars and planes are software.
You all know of Google’s driverless car, which might even be safer than human drivers. But I did not know about planes. Last week I had a chance to talk with the pilot of my 777 flight. He explained to me that the time is long gone when there were actual cables between the control wheel and the rudder, ailerons, and elevators. What the pilot feels in his hands is software-driven. And if on a Boeing, the pilot can still force a decision, according to my pilot, on a modern Airbus the plane’s software is the ultimate decision maker.
All this digital activity has profound implications for the information technology industry.
IT will need to scale at a much different level, to support billions of humans creating, exchanging, and tracking millions of pieces of information apiece every day. With such scale, the advent of the cloud and the “software defined datacenter” is ineluctable.
The IT industry will transform as fast as possible, which probably means over a time span of 15 years. When you think of a $3.5 trillion industry turning itself on its head, 15 years is extremely fast: It means that over $200 billion of spending will change form every year, from devices to software, from data center design to infrastructure.
With such disruptive changes, some established players will die, and some innovative start-ups will win. One small example: Nicira, which was acquired for $1.26 billion by VMware earlier this year. Now, $1 billion is a tiny drop in the overall transformative flood, but it is a huge success for a start-up.
Look at the telecommunications industry, which went through similar transformation with the move from PSTN networks to full data in 15 years. None of the giants of yesterday have survived. AT&T is still called AT&T, but it has little to do with the AT&T of our parents’ generation.
There are still people who wonder if it will really happen, and people who continue investing in the old paradigm. HP’s write-off of $8.8 billion of its Autonomy acquisition comes as no surprise to me. One of our telecommunications customers tried to deploy Autonomy in the context of its cloud offering. Eventually, they gave up: It was simply too complex, and it came with the wrong business model.
In the 1990s, I met many people doubting that the Internet could become a ubiquitous replacement for all networks (including wide area network data, enterprise, voice, and video distribution). Many felt it was insecure and unreliable, not suited for mission-critical applications. Yet the trend was already visible, and proved to be inescapable.
The same is happening today with those resisting the cloud and the software-defined data center.
Let’s look at the segment closest to my heart, storage. Within IT, the storage market is about $100 billion. It has traditionally been considered part of the “hardware” category by analysts, who typically sub-categorize it into high-end, mid-range, and low-end. The market includes SAN (higher-end storage area networks) and NAS (network-attached storage, now increasingly also a consumer technology). This industry has been essentially stable for the past 20 years, with successive refinements of these paradigms.
Now the storage market is about to go through a complete disruption.
In the software-defined datacenter era, storage is going to be a software application running on generic servers. These servers will leverage different technologies and storage media to provide the right performance profile at the right cost.
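To make the idea concrete, here is a minimal sketch of what “the right performance profile at the right cost” can mean in software: a placement function that picks the cheapest storage medium still meeting a workload’s performance requirement. The media names, IOPS, and cost figures are purely illustrative assumptions, not any vendor’s real numbers.

```python
# Hypothetical sketch: a software layer on generic servers choosing a
# storage medium per workload by performance need and cost.

from dataclasses import dataclass

@dataclass
class Medium:
    name: str
    iops: int            # sustained random IOPS the medium can serve
    cost_per_gb: float   # illustrative $/GB figures

# Illustrative media; real numbers vary by generation and vendor.
MEDIA = [
    Medium("ssd", iops=50_000, cost_per_gb=1.00),
    Medium("sas_hdd", iops=200, cost_per_gb=0.20),
    Medium("sata_hdd", iops=100, cost_per_gb=0.05),
]

def place(required_iops: int) -> Medium:
    """Pick the cheapest medium that still meets the IOPS requirement."""
    candidates = [m for m in MEDIA if m.iops >= required_iops]
    return min(candidates, key=lambda m: m.cost_per_gb)

print(place(10_000).name)   # database-like workload -> "ssd"
print(place(50).name)       # archive-like workload  -> "sata_hdd"
```

In a real software-defined storage system this decision would of course account for capacity, durability, and latency as well, but the point stands: the policy lives in software, not in which array you bought.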
Two designs will co-exist in the software-defined datacenter:
- The converged architecture model, offered by start-ups like Nebula and Scale Computing, will combine the storage, networking, and compute components into a single appliance, which can be horizontally scaled out. We expect this model to prevail for most applications, which do not require a huge amount of storage.
- For the remaining 20 percent of use cases, those in which storage is the determining scaling factor, compute and storage will be separate, and storage will be provided as a multi-form, unified, and scalable storage pool.
At Scality, we were working on software-defined storage long before the term was trendy. Our vision is that a large-scale datacenter needs a storage pool that delivers scalability, control, and cost effectiveness. The old, siloed paradigm of dedicated storage for each application is dead. What is required is a storage pool that is multi-form, can connect to applications without requiring them to be rewritten, and can deliver the performance and capacity profile each application requires, while protecting the applications from one another.
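One part of that vision, protecting applications from one another, can be sketched in a few lines: a shared pool where each application gets its own namespace and capacity quota, so no tenant can starve another. This is a toy illustration of the isolation idea, assuming made-up class and method names, not an actual product API.

```python
# Toy sketch of a multi-tenant storage pool with per-application quotas.

class QuotaExceeded(Exception):
    pass

class StoragePool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.quotas = {}   # app -> quota in GB
        self.used = {}     # app -> GB consumed

    def provision(self, app: str, quota_gb: int) -> None:
        # Refuse to promise more capacity than the pool has.
        if sum(self.quotas.values()) + quota_gb > self.capacity_gb:
            raise QuotaExceeded("pool capacity exhausted")
        self.quotas[app] = quota_gb
        self.used[app] = 0

    def write(self, app: str, size_gb: int) -> None:
        # An application can never spill past its own quota.
        if self.used[app] + size_gb > self.quotas[app]:
            raise QuotaExceeded(f"{app} exceeded its quota")
        self.used[app] += size_gb

pool = StoragePool(capacity_gb=1000)
pool.provision("mail", 600)
pool.provision("backup", 400)
pool.write("mail", 500)       # fine
try:
    pool.write("mail", 200)   # would overrun mail's 600 GB quota
except QuotaExceeded as e:
    print(e)                  # backup's 400 GB remains untouched
```

Real systems enforce isolation on IOPS and bandwidth as well as capacity, but the mechanism is the same: the pool is shared, the limits are per application.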
Without going into too much detail, we have built just that: a flexible, powerful architecture, which we’ve been delivering as Tier 1 storage for two years.
When we started building towards our vision, the servers required to deliver such unified, enterprise-class storage service at a cost-effective price did not even exist. But with recent announcements from HP, SGI, Penguin Computing, Super Micro, and others, we now have all the hardware we need, for now and for years to come.
As scale grows, saving energy will become more and more critical, and on the horizon we see hard drives that can be stopped and started on demand, essentially achieving long-term storage at close to zero watts.
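The spin-down idea is simple enough to model in a few lines: a drive that sits idle past a threshold drops to standby, and the next access wakes it. The timings below are illustrative assumptions, and the sketch deliberately ignores spin-up latency and wear from start/stop cycles, both of which matter in practice.

```python
# Toy model of on-demand drive spin-down for cold data.

IDLE_SPINDOWN_S = 300   # assumed policy: spin down after 5 minutes idle

class Drive:
    def __init__(self):
        self.state = "active"
        self.last_access = 0.0

    def tick(self, now: float) -> None:
        # Periodic check: idle long enough -> standby (near zero watts).
        if self.state == "active" and now - self.last_access >= IDLE_SPINDOWN_S:
            self.state = "standby"

    def access(self, now: float) -> None:
        # Any read/write spins the drive back up (spin-up delay omitted).
        self.state = "active"
        self.last_access = now

d = Drive()
d.access(0)
d.tick(100)    # only 100 s idle: still active
d.tick(400)    # 400 s idle: goes to standby
d.access(401)  # next access spins it back up
print(d.state)
```

A real archive tier would batch writes so that whole shelves of drives stay asleep, but the principle is the one above: power follows access, not capacity.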
We are not the only ones to have this vision. Ashish Nadkarni, a research director within IDC’s Storage Systems research practice, just posted an article sharing much the same vision, and I keep seeing new start-ups come out of stealth singing the praises of enterprise-class, software-defined storage.
The world is changing. Our data centers need to get ready for it.
Disclosure: Scality is one of the sponsors of CloudBeat 2012.
Top photo credit: Leonardo Rizzi via photopin