06-06-14 | Blog Post

Speed of change: Enterprise business technology advancing daily (and faster!)


Note: This is the first of three blog entries from Online Tech Director of Infrastructure Nick Lumsden reflecting on his key takeaways from EMC World 2014: 1. Speed of Change, 2. Shift in Ownership of IT Dollars, 3. Transition to IT-as-a-Service.

In 1965, future Intel co-founder Gordon Moore wrote a paper observing that the number of components on a computer chip was doubling roughly every year — a pace later revised to about every two years. Today, we call that Moore’s law. Kryder’s law says hard-drive storage density doubles roughly every year. Nielsen’s law says a high-end user’s bandwidth doubles roughly every 21 months.

We’re going to need new laws, because the speed of change for business technology is continuing to advance.

Twenty years ago, if you had stood in the CIO’s office and claimed that enterprise applications would eventually see updates multiple times a day, you would have drawn laughter from your colleagues at the obvious joke. Technology change came at the rate of once a year — and it was painful! — with the goal of moving to twice a year, maybe eventually once a quarter.

Fast forward to the introduction of Agile, and a significant paradigm shift occurred in software development: the rate of change advanced to once per month, moving toward bi-weekly. Fast forward again to the rise of DevOps and continuous integration, and the rate of change is now advancing to daily and faster. (There are already organizations claiming dozens — even hundreds — of deployments each day.)

This speed of change puts pressure on infrastructure and IT organizations to accept change quickly. It is no longer acceptable for changes to take days to complete — even several hours is becoming too long in more advanced organizations. And these IT organizations need the tools to accomplish that speed of change.

This is why “software-defined” services have developed: software-defined networks (SDN), software-defined storage (SDS), software-defined infrastructure (SDI), software-defined data center (SDDC), etc. VMware introduced this capability years ago by abstracting the intelligence from the hardware, bringing it into a software layer, then providing first a CLI and later an API into that common abstraction layer. Server hardware no longer matters — you do not need a Dell solution, HP solution, IBM solution, etc.
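The idea behind that abstraction layer can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (none of these class or function names come from VMware or any vendor): vendor-specific logic hides behind one common interface, so callers code against the abstraction rather than the hardware brand.

```python
from abc import ABC, abstractmethod

class ServerDriver(ABC):
    """Common abstraction layer: vendor-specific logic lives behind one interface."""
    @abstractmethod
    def provision(self, cpu: int, ram_gb: int) -> str: ...

class DellDriver(ServerDriver):
    def provision(self, cpu: int, ram_gb: int) -> str:
        return f"dell-node({cpu}cpu/{ram_gb}gb)"

class HPDriver(ServerDriver):
    def provision(self, cpu: int, ram_gb: int) -> str:
        return f"hp-node({cpu}cpu/{ram_gb}gb)"

def provision_server(driver: ServerDriver, cpu: int, ram_gb: int) -> str:
    # The caller never sees the vendor; swapping hardware means
    # swapping the driver, not rewriting the automation on top of it.
    return driver.provision(cpu, ram_gb)

print(provision_server(DellDriver(), 4, 32))
print(provision_server(HPDriver(), 4, 32))
```

The same call works against either driver — which is precisely why, once the intelligence moves into the software layer, the hardware underneath becomes interchangeable.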

EMC and VMware are proposing the same is going to happen to network and storage platforms. EMC released a product (ViPR) to accomplish this and VMware has already built the network virtualization stack (NSX) into its version 5 releases.

Behind each of these transitions toward software-defined is a recurring theme: Standardize > Virtualize > Automate. (Personally, I would modify this to the more accurate Standardize > Abstract > Automate.) This means having:

  • Standard set of as-a-service offerings;
  • Enforced reference architectures;
  • Automated configuration and management (Execution/Automation Engine);
  • Policy-based Management (Policy Engine);
  • Workflow/Process Orchestration;
  • On-Demand Capacity (Self-Service);
  • Cost transparency; and
  • Tools abstracted from the infrastructure.
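Two items on that list — policy-based management and the automation engine — fit together in a simple pattern worth sketching. Below is a hypothetical toy example (the policies, names, and messages are mine, not from EMC or VMware): a policy engine evaluates a requested change against standard rules, and only a request that passes every policy is handed to the automation layer.

```python
# Hypothetical policy checks: each returns None if the request passes,
# or a string describing the violation.
POLICIES = [
    lambda req: None if req["ram_gb"] <= 256 else "exceeds max RAM per VM",
    lambda req: None if req["tier"] in ("gold", "silver", "bronze")
                else "unknown service tier",
]

def evaluate(request: dict) -> list:
    """Policy engine: collect every violation before the
    automation engine is allowed to execute the change."""
    return [v for v in (p(request) for p in POLICIES) if v]

def deploy(request: dict) -> str:
    violations = evaluate(request)
    if violations:
        return "rejected: " + "; ".join(violations)
    # In a real stack this line would call the orchestration/automation
    # engine; here it just reports what would be provisioned.
    return f"provisioned {request['tier']} VM with {request['ram_gb']} GB"

print(deploy({"tier": "gold", "ram_gb": 64}))
print(deploy({"tier": "platinum", "ram_gb": 512}))
```

Because the rules live in data rather than in ad-hoc approvals, standardized requests can be fulfilled on demand (self-service) while anything outside the reference architecture is rejected automatically — the essence of the Standardize > Abstract > Automate loop.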

This will further commoditize hardware and provide a common software platform to develop against (APIs agnostic of the underlying hardware). Hardware vendor brand will no longer be a competitive advantage. As the technology matures — adopting orchestration, policy engines and execution engines — it will allow anything to be made into a service (XaaS).

EMC claims this is an industry transformation — think mainframe to client-server. EMC calls it the third platform: abstraction into software intelligence plus hardware agnosticism (hence the term as-a-service), with a heavy emphasis on mobility and elasticity.

So, buckle up. The speed of change in IT isn’t slowing down anytime soon.


Nick Lumsden is a technology leader with 15 years of experience in the technology industry, from software engineering to infrastructure and operations to IT leadership. In 2013, Lumsden joined Online Tech as Director of Infrastructure, responsible for the full technology stack within the company’s five Midwest data centers — from generators to cooling to network and cloud infrastructure. The Michigan native returned to his home state after seven years with Inovalon, a healthcare data analytics company in the Washington D.C. area. He was one of Inovalon’s first 100 employees, serving as the principal technical architect, responsible for scaling its cloud and big data infrastructure and protecting hundreds of terabytes of sensitive patient information as the company grew to a nearly 5,000-employee organization over his seven years of service.

