IBM InterConnect 2015 took over the Mandalay Bay and MGM Grand conference centers this week with an estimated audience of more than 20,000 attendees. The event is filled with DevOps evangelists, hybrid cloud experts, and IBM customers converging to share information in presentations and panel discussions.
On Monday I had an opportunity to attend one such panel discussion titled Mobile to Mainframe Experts: DevOps Best Practices for Systems of Engagement and Systems of Record. Yeah, the title doesn’t exactly “pop” but I didn’t let that dissuade me. The panel included a number of respected names in DevOps such as Carmen DeArdo from Nationwide Insurance, and Rosalind Radcliffe and Sanjeev Sharma from IBM.
In spite of the title of the panel discussion, most of the conversation revolved around the concept of two-speed IT. In any organization—no matter how streamlined—there are going to be different processes or teams that function at different speeds. Apply the Pareto Principle: even if the organization figured out a way to operate more efficiently, roughly 20 percent would still be doing 80 percent of the work and 80 percent doing the other 20. It would just be a matter of degree.
Problems arise, though, when the speeds are too different. First, it's simply a waste of resources to have skilled workers sitting idle while they wait for a different group or process to catch up. Second, those workers will often get antsy and go rogue—writing code and introducing features that weren't part of the initial requirements and aren't covered by the planned testing. Now you're just adding chaos that disrupts the workflow even further.
According to the esteemed panel, the goal is to find ways for those processes and teams to function at their own pace without impeding overall efficiency—hence two-speed IT. Rather than trying to force a square peg into a round hole, or getting frustrated that the square peg can't just be round in the first place, you have to find ways to maximize efficiency with the teams and processes you have.
Carmen DeArdo from Nationwide Insurance applies another term as well: variable-speed DevOps. In essence two-speed IT frames the challenge, and variable-speed DevOps is the set of processes and tools organizations can utilize to address the challenge. Removing some of the friction between different groups and enabling the organization to function more effectively is one of the primary benefits of DevOps.
When I spoke to Rosalind a couple of weeks ago, she stressed repeatedly that the foundation of effective DevOps is breaking down silos and finding common ground. That doesn't mean all the teams have to be restructured under one organizational unit, or even that everyone has to adopt the exact same tools and processes. It means figuring out how they're different and how they're alike, and determining ways to make the most effective use of the common ground where their similarities overlap. That provides a starting point to get the ball rolling, and things can mature and evolve from there.
In the session at IBM InterConnect, Rosalind agreed that variable-speed DevOps is a great approach, but cautioned attendees about how it's used. She told the audience that variable-speed DevOps should be driven by business needs, not by tools or processes. In other words, you have to stay focused on the business requirements and goals and then find ways to achieve them. You can't shrink or adapt the business requirements and goals to fit the processes and tools you already have.
DevOps tools and processes can—and in most cases should—speed processes and teams up. One of the hallmarks of DevOps is speed. Even as organizations embrace DevOps and introduce more automation, there will still be different teams and processes functioning at different speeds—and a need to implement variable-speed DevOps to enable those different speeds to exist in harmony.