Will I need Capacity Optimization for SDDC?

Software-defined data center (SDDC) is the vision of IT achieving higher levels of scalability, flexibility and resilience by extending powerful concepts such as abstraction, pooling and automation from virtualization to the entire data center, across all layers (in particular compute, storage and network). An SDDC promises to enable "IT as a service" (ITaaS), that is, to give customers the ability to manage their services and easily adapt to changing business needs without being constrained by the specifics of the underlying IT infrastructure.

There are several vendors working in this direction, but it is difficult to predict how long it will take the IT industry to make this dream come true. While we wait for the SDDC vision to become reality, there is at least one question related to Capacity Management that I think can be discussed today, independently of how SDDC will be implemented tomorrow. The question, simply put, is: "Will Capacity Management still be a required capability for SDDC?" My simple answer is "Yes, and even more so". Before leveraging a more authoritative opinion to support my own, allow me to look at this question from a historical perspective.

You will remember that a similar question was raised when virtualization technologies (what we could now call "software-defined computing") were introduced about 15 years ago. As soon as people started adopting these technologies for hosting critical services, they realized that Capacity Management solutions were still required, and more than ever, to address old challenges (e.g. resource contention and over-commitment) and new ones (e.g. VM sprawl, idle VMs, VM snapshots) and to properly manage, plan and optimize their virtual infrastructures.

More recently, the same question resurfaced when the cloud computing paradigm emerged: clouds were initially associated with the availability of seemingly infinite resources (supposedly at almost no cost). And again, it soon became clear how important Capacity Management is, both for Cloud Admins to properly plan and control cloud infrastructures and for providing capacity and cost visibility (or chargeback) capabilities to Tenants. It is true that both virtualization and cloud computing deeply transformed the discipline of Capacity Management while injecting new life into it, and we can expect the same to happen, likely to an even greater degree, when SDDC is adopted. However, there is no such thing as a free lunch, and only a good Capacity Management discipline can help you stay healthy and in shape.
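To make the idea concrete, here is a minimal, hypothetical sketch (in Python, not tied to any specific product) of the kind of checks a capacity management practice applies to a virtualized estate: flagging idle VMs that silently consume capacity, and spotting hosts whose vCPU allocation exceeds a chosen over-commitment ratio. The data structures, thresholds and sample values are all illustrative assumptions.

```python
# Illustrative sketch only: flag idle VMs and over-committed hosts from
# average utilization samples. Names and thresholds are assumptions.

from dataclasses import dataclass

@dataclass
class VM:
    name: str
    vcpus: int
    avg_cpu_pct: float   # average CPU utilization over the observation window

@dataclass
class Host:
    name: str
    physical_cores: int
    vms: list

IDLE_CPU_THRESHOLD = 5.0      # VMs below this average CPU % are reclamation candidates
OVERCOMMIT_RATIO_LIMIT = 4.0  # vCPU:pCPU ratio above which contention becomes likely

def idle_vms(host: Host) -> list:
    """Return VMs whose average CPU stays below the idle threshold."""
    return [vm for vm in host.vms if vm.avg_cpu_pct < IDLE_CPU_THRESHOLD]

def is_overcommitted(host: Host) -> bool:
    """Compare allocated vCPUs against the host's physical cores."""
    allocated = sum(vm.vcpus for vm in host.vms)
    return allocated / host.physical_cores > OVERCOMMIT_RATIO_LIMIT

if __name__ == "__main__":
    host = Host("esx-01", physical_cores=16, vms=[
        VM("web-01", vcpus=4, avg_cpu_pct=35.0),
        VM("batch-old", vcpus=8, avg_cpu_pct=1.2),   # likely a forgotten, idle VM
        VM("db-01", vcpus=8, avg_cpu_pct=60.0),
    ])
    print("Idle VM candidates:", [vm.name for vm in idle_vms(host)])
    print("Over-committed:", is_overcommitted(host))
```

The same reasoning extends naturally to memory, storage and network once the relevant utilization metrics are collected.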

See for example what Torsten Volk (leading Analyst at EMA) writes in his blog post "State of the Software Defined Data Center" when describing what he considers the Key Areas of Enterprise IT Investments in 2014: "Capacity management: Seemingly an old and boring discipline, capacity management is staging a comeback in 2014. Really, we shouldn't call it "comeback", but "metamorphosis" from the ugly duckling, conducted by a small number of data center gearheads, to a truly critical data center discipline that constitutes an essential part of every infrastructure and application management planning, deployment and management decision. Within the massively heterogeneous environment of the SDDC, capacity management has to truly "understand" the specific requirements of each individual application workload. Therefore, I'm happy to declare 2014 the year of truly dynamic and application-aware capacity management."

Over the last 20 years, BMC has continuously innovated its Capacity Optimization solutions to allow customers to automatically control and plan the capacity of their infrastructures (whether mainframe or distributed, physical or virtual, converged or cloud), based on the ability to accurately model workload profiles for each application and to estimate expected demand as represented by Business KPIs and by reservations for new projects. Please come join us at the BMC Engage user conference to learn about our plans for making sure that your next IT infrastructure better supports your digital services!
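As an aside, the idea of estimating expected demand from Business KPIs can be illustrated with a very small example. The sketch below is not BMC's model; it simply fits observed CPU demand against a hypothetical business driver (orders per hour) with an ordinary least-squares regression and projects the demand implied by a forecast volume. All data and names are made up for illustration.

```python
# Illustrative sketch: relate a business KPI (orders per hour) to observed
# CPU demand with a least-squares fit, then project demand at a forecast
# business volume. Sample data is invented for the example.

import numpy as np

# Historical samples: business KPI vs. measured CPU demand (in cores)
orders_per_hour = np.array([1000, 2000, 3000, 4000, 5000], dtype=float)
cpu_cores_used = np.array([2.1, 3.9, 6.2, 8.0, 10.1])

# Fit cpu = a * orders + b
a, b = np.polyfit(orders_per_hour, cpu_cores_used, deg=1)

def projected_cpu(orders: float) -> float:
    """Estimate CPU cores needed for a forecast business volume."""
    return a * orders + b

forecast = 8000  # expected orders per hour after a planned campaign
print(f"Projected CPU demand at {forecast} orders/hour: {projected_cpu(forecast):.1f} cores")
```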

These postings are my own and do not necessarily represent BMC's position, strategies, or opinion.

Giuseppe "G" Nardiello

My role is Senior Manager in the Performance & Availability Product Management team and Product Family Lead for Capacity Optimization. I joined BMC Software in October of 2010 as part of the Neptuny acquisition, where I held both Product Manager and Business Development Manager roles. Previously, I worked for IBM/Tivoli, HP and BMC partners. I have about 20 years of experience in System Management and Capacity Management and I hold a degree in Physics/Cybernetics and a PhD in Computer Science.