Rewind 20 years, to a business world where big companies with unlimited resources beat out smaller, more strapped competitors. The digital era—also known as the “age of the customer”—is changing all that. Where big once beat small, fast now beats slow. The DevOps approaches of startups often make them more adept at continuously delivering the innovations and conveniences that earn customer raves. Bigger, established companies often feel disadvantaged, shackled by an outdated culture and inflexible processes that haven’t kept up with the pace of change.
Gartner coined the term “bimodal IT” as a way for older IT establishments to keep pace with agile startups. In the bimodal model, IT teams are separated into two distinct styles of work—one focused on supporting predictable, well-understood workloads with legacy systems and another that is faster-moving, experimental and adaptable in meeting users’ constantly evolving demands and rising digital expectations.
But the bimodal concept is not well aligned with the spirit of innovation underlying modern application development and delivery. This is especially true for bigger organizations using mainframes. Here’s why.
Successful DevOps Depends on Mainframe Agility
The goal of DevOps is to roll out high-performing software and updates ahead of the competition, with maximum speed, ease and cost-efficiency. In a DevOps model, software delivery can only move as fast as the slowest team. Modern web and mobile applications tend to span multiple platforms in their end-to-end transactions—from front-end web servers (systems of engagement) all the way back to mainframes (systems of record). IT places a lot of emphasis on systems of engagement, often taking the back-end systems of record for granted. The reality is that the value in systems of engagement cannot be realized without the systems of record—making it all, in effect, one system. Developers supporting modern applications must be able to maneuver across and between all of these platforms, especially mainframe code and data.
Let’s look at a tactical example. According to industry research, 80 percent of the world’s corporate data—including customer data—continues to reside on mainframes. When a mobile or web application is used to make a bank deposit or purchase a product, for example, the transactional component of that application typically executes on a back-end mainframe.
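To make the “all one system” point concrete, here is a minimal, hypothetical sketch of a system-of-engagement service handing the transactional step of a deposit to a mainframe-hosted system of record. The endpoint, payload shape and class names are invented for illustration; they stand in for whatever REST facade (or similar interface) an organization exposes over its back-end transactions.

```java
// Hypothetical sketch: a "system of engagement" service delegating the
// transactional step of a deposit to a mainframe-hosted "system of record".
// The URL, JSON shape and class names are assumptions for illustration only.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DepositService {
    private static final String CORE_BANKING_URL =
        "https://mainframe.example.com/api/deposits"; // e.g., a REST facade over a back-end transaction

    public static void main(String[] args) throws Exception {
        String payload = """
            {"accountId": "12345678", "amountCents": 25000, "channel": "mobile"}
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(CORE_BANKING_URL))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(payload))
            .build();

        // The front end only renders the result; the deposit itself is
        // recorded and persisted by the back-end mainframe transaction.
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("System of record responded: " + response.statusCode());
    }
}
```

However thin this front-end layer is, the customer-visible feature only works if the mainframe side of the transaction can change and ship at the same cadence.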
This means mainframe applications are crucial to development efforts. What happens when mainframe developers and testers are locked into waterfall processes, or don’t have the modern tools necessary to understand application logic or access mainframe data for testing? Frustration mounts and the overall innovation effort stalls unnecessarily.
Stability and Agility are not Mutually Exclusive
The longstanding belief that stability and agility cannot coexist is simply a myth. Unfortunately, it’s at the core of the negative perceptions surrounding the mainframe today: people assume the mainframe environment is too slow and cumbersome to support agile development, and development teams may conclude that if they have mainframes, waterfall is the only approach those systems can support.
The reality is, though, that no customer will accept poor availability and performance in exchange for new features delivered quickly. People with hands-on experience painfully understand that slow, waterfall delivery methods produce far more quality issues and missed expectations, resulting in competitive risk. Additionally, large enterprises taking the “slow and stable” route give startups a head start in beating them to market with offerings that customers want now.
By contrast, agile processes and DevOps tool chains enable scrum teams to test quality in terms of usability, function and stability early and often in the development cycle, when remediation is fastest and cheapest. The result is higher-quality applications and continuous improvement in quality over time.
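As a minimal sketch of what “testing early and often” can look like in practice, the unit test below exercises a deposit-validation rule long before the code path ever touches the mainframe system of record. The rule, limits and class names are invented for illustration; the point is that the same shift-left discipline can be applied whether the logic ultimately lives in Java, COBOL or anywhere else.

```java
// Hypothetical shift-left test (JUnit 5): validate a business rule in a fast
// unit test instead of waiting for a late, end-to-end waterfall test phase.
// The rule, the daily limit and the class names are assumptions.
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

class DepositRulesTest {

    // Stand-in for logic that, in production, fronts a back-end transaction.
    static boolean isValidDeposit(long amountCents) {
        return amountCents > 0 && amountCents <= 1_000_000_00L; // assumed $1,000,000 daily limit
    }

    @Test
    void rejectsNegativeAndZeroAmounts() {
        assertFalse(isValidDeposit(0));
        assertFalse(isValidDeposit(-500));
    }

    @Test
    void acceptsAmountsWithinTheDailyLimit() {
        assertTrue(isValidDeposit(25_000));         // $250.00, within the assumed limit
        assertFalse(isValidDeposit(2_000_000_00L)); // over the assumed limit
    }
}
```

Catching a broken rule here costs minutes; catching it after a quarterly waterfall release costs far more.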
Re-Platforming is not Necessarily the Answer
Some DevOps proponents believe moving off the mainframe—to a supposedly more modern, commodity server-based distributed architecture—is the best way to circumvent perceived challenges to nimbleness and agility. But this is almost never the best answer. The mainframe retains its distinction as the most powerful computing platform on the planet. It remains deeply embedded in large organizations, setting the standard for reliability, security and availability for mission-critical data and transactions. The new IBM z13 mainframe, for example, can handle the load of 100 Cyber Mondays every day, 365 days a year. Currently, mainframes process 30 billion business transactions per day. The post-modern mainframe is an engineering marvel unequaled as a system-of-record, business compute platform.
The mainframe also offers cost advantages. Studies have shown that the mainframe is more cost-effective than commodity servers in handling the massive computing volume increases driven by mobile applications. This is because decreases in commodity server pricing have not kept pace with the increase in computing volumes. In addition, leading vendors have worked together to deliver intelligent cost management on the mainframe, enabling workloads to be distributed in a way that keeps monthly license charges (MLC) down.
In our view, “bimodal” prescribes the exact problem that must be solved. We think the answer is not to replace the mainframe, although mainframe development, if left unaddressed, can present obstacles to DevOps. Rather, the answer is to evolve the mainframe to keep up with the pace of DevOps. Mounting examples from large, high-performing organizations show that it is possible to apply modern practices to the fullest extent, including to mainframe applications. These organizations do this by giving developers the tools they need to work on mainframe code and data just as they would on any other language and platform, while meeting the escalating quality expectations of the digital age. For example:
- Mainstreaming the Mainframe Environment – Mainframe users need to move away from the antiquated “green screen” environment and adopt state-of-the-art interfaces and tools. This is the first step in enabling developers to work more fluidly on the mainframe, in an environment that’s much more familiar and comfortable to them. Organizations that fail to make this shift will have a very difficult time enabling the next generation of development talent to leverage invaluable and irreplaceable mainframe code and data, which will be a serious hindrance to DevOps success.
- Leveraging Tools from the Java Development World – One example is visualization capabilities, which enable developers to quickly and easily identify critical interrelationships between applications and data. This helps them sidestep unintended, undesirable impacts as they modify applications, increasing confidence and avoiding unnecessary delays.
- Making it Easier to Identify Mainframe Code-Level Issues – Developers need tools that help them pinpoint problems in mainframe code; for example, when it is the source of a widespread user-facing performance issue, or when it is consuming too many resources and driving up costs unnecessarily (a minimal illustration follows this list).
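One lightweight way to surface code-level issues before they reach users is to put a performance budget into the delivery pipeline itself. The sketch below is purely illustrative: the budget, the simulated call and the class name are assumptions, not any particular vendor’s tooling, but it shows the shape of the guardrail.

```java
// Hypothetical guardrail: fail the pipeline step when a transaction path
// regresses past an agreed time budget, so a code-level issue surfaces in
// the build rather than in production. The budget and the called method are
// assumptions for illustration only.
import java.time.Duration;
import java.time.Instant;

public class ElapsedTimeGuard {
    private static final Duration BUDGET = Duration.ofMillis(200); // assumed budget for this path

    public static void main(String[] args) {
        Instant start = Instant.now();
        simulateTransactionPath(); // stand-in for the real call into back-end code
        Duration elapsed = Duration.between(start, Instant.now());

        if (elapsed.compareTo(BUDGET) > 0) {
            System.err.println("Regression: path took " + elapsed.toMillis()
                + " ms, budget is " + BUDGET.toMillis() + " ms");
            System.exit(1); // signal the pipeline to fail this step
        }
        System.out.println("Within budget: " + elapsed.toMillis() + " ms");
    }

    private static void simulateTransactionPath() {
        try {
            Thread.sleep(50); // placeholder for the measured work
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```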
Conclusion
In our view, “bimodal” is not a solution but a broken state of affairs that leaves too many organizations at a disadvantage. From a competitive vantage point, bimodal will only hurt DevOps teams and hold back the innovative efforts of the organizations they support.
The bimodal IT model is a road to inevitable failure for those enterprises seeking success in the digital economy. This is especially true for organizations that rely on mainframes. A smartly leveraged mainframe is an extremely valuable competitive asset that needs to be removed from its silo, evolved and smoothly integrated into modern, heterogeneous development environments.
It is possible for big, established companies to adopt DevOps, become agile and move as fast as startups. But the ability to think and act like a startup requires fundamental changes, particularly at the level of software development. The good news is that the legacy technologies often perceived as impediments can actually be a tremendous competitive advantage, if they are harnessed and evolved in the right manner.
About the Author / Christopher O’Malley
Christopher O’Malley is CEO of Compuware. He has nearly 30 years of IT experience with past positions including CEO of VelociData, CEO of Nimsoft, EVP of CA’s Cloud Products & Solutions and EVP/GM of CA’s Mainframe business unit, where he led the successful transformation of that division. Connect with him on LinkedIn and Twitter.