IT as Code

Why the Buzz Around Software-Defined Everything (SDx)?

You’ve likely heard the buzz around software-defined everything (SDx) getting louder over the last few years. You’re probably also finding a swarm of companies knocking on your door, selling services for implementing software-defined networking (SDN), software-defined storage (SDS) and the software-defined data center (SDDC). What you may not realize is that this software-defined movement is a genuinely disruptive technology shift, one that is dramatically changing the way organizations access storage, networks and data centers.

The overarching mission of SDx is to break down the discrete IT silos (compute, storage and networking) by using software to bridge the technological and organizational gaps between them. The premise is that by giving software a starring role in managing different kinds of hardware, productivity will increase through more robust network access from both portable and traditional devices.

What real outcomes of SDx can an IT administrator tout to management? For starters, early adopters report significant cost savings and greater speed of delivery. They cite benefits such as decreased hardware costs, greater automation, more rapid deployment of IT resources and shorter implementation timelines.

Let’s take the traditional data center, where most organizations grapple with the high cost of ongoing management and maintenance of network switches, servers, storage devices and other hardware. Keeping these resources updated can be complex and expensive for an IT department. Moving to an SDDC, for example, can lower the total cost of ownership by reducing hardware spend (servers, racks, disk and tape, routers, etc.) and shrinking the data center’s footprint, power, cooling and ongoing maintenance. With an SDDC, the IT department also benefits from a simpler process that speeds up acquiring and provisioning network resources; software-automated provisioning can enable application deployment within minutes, for example.
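
To make that concrete, here is a minimal sketch, in Python, of what software-automated provisioning can look like from the administrator’s side. The endpoint, payload fields and response shape are illustrative assumptions, not any particular vendor’s API; the point is that a workload is described as data and submitted to an SDDC controller rather than configured by hand on specific hardware.

    import requests

    # Hypothetical SDDC controller endpoint; the URL and fields below are
    # illustrative assumptions, not a real vendor API.
    SDDC_API = "https://sddc.example.com/api/v1"

    app_spec = {
        "name": "order-service",
        "vcpus": 2,
        "memory_gb": 4,
        "network": "app-tier",
        "storage_gb": 50,
    }

    # Submit the desired state; the controller handles placement, network
    # wiring and storage allocation behind the scenes.
    resp = requests.post(f"{SDDC_API}/workloads", json=app_spec, timeout=30)
    resp.raise_for_status()
    print("Provisioned workload:", resp.json()["id"])

Because the specification is just data, it can be versioned, reviewed and replayed, which is what “IT as code” means in practice.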

Rather than the IT department being seen as a hindrance, an SDDC lets developers create the applications that are best for the business instead of the ones that merely fit the available hardware.

“Every technology organization needs to be able to transform ideas into working functionality that creates value for customers. That means you need not only world-class development processes but infrastructure processes as well—and that’s where software-defined data centers become absolutely relevant,” said Gene Kim, co-author of “The Phoenix Project: A Novel About IT, DevOps, and Helping Your Business Win.”

The main elements of an SDDC include:

  • Virtualization, a software implementation of a computer.
  • An SDN, in which hardware and software networking resources are combined into a software-based virtual network.
  • An SDS that provides storage virtualization.
  • Automation software that lets an IT administrator provision, control and manage all SDDC components (see the sketch after this list).
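
As a small illustration of the automation element, the sketch below uses the libvirt Python bindings (one common way to drive a hypervisor from code; any comparable management API would do) to define and start a virtual machine entirely from software. The domain description is deliberately minimal; a real one would also include disks, network interfaces and boot configuration.

    import libvirt

    # Connect to the local hypervisor (KVM/QEMU in this example).
    conn = libvirt.open("qemu:///system")

    # A deliberately minimal VM description; real definitions also include
    # disks, network interfaces and boot configuration.
    domain_xml = """
    <domain type='kvm'>
      <name>demo-vm</name>
      <memory unit='MiB'>1024</memory>
      <vcpu>1</vcpu>
      <os><type arch='x86_64'>hvm</type></os>
    </domain>
    """

    dom = conn.defineXML(domain_xml)  # register the VM with the hypervisor
    dom.create()                      # power it on
    print("Started:", dom.name())
    conn.close()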

The second big driver of SDx adoption is the increased agility IT departments gain in responding to business demands. Much of this comes from SDS, which restructures your infrastructure by separating the storage software from the underlying hardware, replacing dedicated disk arrays and networking appliances with software running on standard servers. SDS virtualizes your physical storage and uses those standard servers to handle storage requests, giving you more control over where and how data is stored.
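
That core idea, virtualizing physical storage so requests are served by software on standard servers rather than by a dedicated array, can be illustrated with a short conceptual sketch in Python. This is not any real SDS product’s interface, just a toy model of a pool that hides which physical device ends up holding a volume.

    from dataclasses import dataclass, field

    @dataclass
    class PhysicalDevice:
        name: str
        capacity_gb: int
        used_gb: int = 0

    @dataclass
    class VirtualPool:
        """A toy storage pool that hides the physical layout from callers."""
        devices: list = field(default_factory=list)

        def free_gb(self) -> int:
            return sum(d.capacity_gb - d.used_gb for d in self.devices)

        def allocate(self, size_gb: int) -> dict:
            # Place the volume on whichever device has room; the caller
            # only ever sees a volume, never a specific disk or server.
            for dev in self.devices:
                if dev.capacity_gb - dev.used_gb >= size_gb:
                    dev.used_gb += size_gb
                    return {"volume_gb": size_gb, "backed_by": dev.name}
            raise RuntimeError("Pool exhausted: add another server or array")

    pool = VirtualPool([PhysicalDevice("server-1", 500), PhysicalDevice("server-2", 500)])
    volume = pool.allocate(200)   # the caller never chooses the hardware

Growing the pool is simply a matter of adding another device, which mirrors the scalability point in the list below.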

Some of the key benefits of SDS over traditional storage include:

  • Automated storage management that reduces manual tasks for your storage administrators and adapts to changing needs without human intervention or new hardware.
  • Flexibility to grow or shrink storage capacity as requirements change, independent of the underlying hardware.
  • Lower costs since SDS solutions run on standard hardware, and the automation reduces the number of administrators required.
  • More scalability options as your business requirements change, whether by adding extra storage arrays or by adding CPUs and memory.

As business requirements continue to grow in both pace and volume, forward-thinking IT departments are moving away from hardware commitments and rigid architectures that constrain their ability to react and adjust to changing needs. Software-defined technologies are becoming the innovative powerhouses behind building more efficient and agile IT services for today and well into the future.

Melinda Cross

Melinda Cross is co-founder of Venture\90 and president at Melinda Cross Communications.
