Mainframes and the Myth of Bimodal in DevOps

By: contributor on September 23, 2016

Rewind 20 years, to a business world where big companies with unlimited resources beat out smaller, more strapped competitors. The digital era—also known as the “age of the customer”—is changing all that. Where big once beat small, fast now beats slow. The DevOps approaches of startups often make them more adept at continuously delivering the innovations and conveniences that earn customer raves. Bigger, established companies often feel disadvantaged, shackled by an outdated culture and inflexible processes that haven’t kept up with the pace of change.

Gartner coined the term “bimodal IT” as a way for established IT organizations to keep pace with agile startups. In the bimodal model, IT teams are separated into two distinct styles of work: one focused on supporting predictable, well-understood workloads on legacy systems, and another that is faster-moving, experimental and adaptable in meeting users’ constantly evolving demands and rising digital expectations.

But the bimodal concept is not well aligned with the spirit of innovation underlying modern application development and delivery. This is especially true for bigger organizations using mainframes. Here’s why.

Successful DevOps Depends on Mainframe Agility

The goal of DevOps is to roll out high-performing software and updates, ahead of the competition and with maximum speed, ease and cost-efficiency. In a DevOps model, software delivery can only happen as quickly as the slowest team. Modern web and mobile applications tend to span multiple platforms in their end-to-end transaction—from front-end web servers (systems of engagement) all the way back to mainframes (systems of record). IT places a lot of emphasis on systems of engagement, often taking back-end systems of record for granted. The reality is, the value in systems of engagement cannot be realized without the systems of record—making it essentially all one system. Developers supporting modern applications must be able to maneuver across and between all platforms, especially within mainframe code and data.

Let’s look at a tactical example. According to industry research, 80 percent of the world’s corporate data—including customer data—continues to reside on mainframes. When a mobile or web application is used to make a bank deposit or purchase a product, for example, the transactional component of that application is typically executed on a back-end mainframe.
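
To make that end-to-end span concrete, here is a minimal sketch (in Python) of how a system-of-engagement handler might delegate the transactional step of a deposit to the back-end system of record. All names in it (SystemOfRecordClient, post_deposit, handle_mobile_deposit) are hypothetical illustrations of the architecture described above, not a real mainframe or banking API.

```python
# A minimal sketch of the end-to-end span described above. All names here
# (SystemOfRecordClient, post_deposit, handle_mobile_deposit) are hypothetical
# illustrations, not a real mainframe or banking API.
from dataclasses import dataclass


@dataclass
class DepositResult:
    confirmation_id: str
    posted: bool


class SystemOfRecordClient:
    """Stand-in for the back-end (mainframe-hosted) transaction service."""

    def post_deposit(self, account_id: str, amount_cents: int) -> DepositResult:
        # In a real deployment this call would reach the system of record;
        # here it simply simulates a successful posting.
        return DepositResult(confirmation_id=f"TXN-{account_id}-{amount_cents}",
                             posted=True)


def handle_mobile_deposit(account_id: str, amount_cents: int,
                          backend: SystemOfRecordClient) -> str:
    """System-of-engagement logic: validate the request, then delegate the
    transactional work to the system of record."""
    if amount_cents <= 0:
        raise ValueError("deposit amount must be positive")
    result = backend.post_deposit(account_id, amount_cents)
    return result.confirmation_id if result.posted else ""


if __name__ == "__main__":
    print(handle_mobile_deposit("12345", 25_000, SystemOfRecordClient()))
```

The point of the sketch is that the front-end handler cannot deliver value faster than the back-end transaction it depends on, which is why DevOps delivery moves only as quickly as the slowest team.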

This means mainframe applications are crucial to development efforts. What happens when mainframe developers and testers are locked into waterfall processes or don’t have the modern tools necessary to understand application logic or access mainframe data for testing? Frustration mounts and the overall innovative effort is unnecessarily stalled.

Stability and Agility Are Not Mutually Exclusive

The longstanding myth that stability and agility cannot coexist is simply untrue. Unfortunately, it sits at the core of the negative perceptions surrounding the mainframe today: people assume the mainframe environment is too slow and cumbersome to support agile development, and development teams may conclude that if they have mainframes, waterfall is the only approach those systems can support.

The reality is, though, that no customer will accept poor availability and performance in exchange for new features delivered quickly. People with hands-on experience understand, often painfully, that slow, waterfall delivery methods produce far greater quality issues and missed expectations, resulting in competitive risk. Additionally, large enterprises taking the “slow and stable” route give startups a head start in beating them to market with offerings that customers want now.

By contrast, agile processes and DevOps toolchains enable scrum teams to test quality in terms of usability, function and stability early and often in the development cycle, when the time to remediate is shortest and the cost is lowest. The result is higher-quality applications and continuous improvement in quality over time.
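
As a small illustration of testing early and often, the pytest-style sketch below checks a business rule at the unit level, where feedback arrives in seconds rather than at the end of a waterfall cycle. The rule and the function name (validate_deposit) are invented for this example, not code from any real system.

```python
# A minimal sketch of testing early and often. The business rule below
# (validate_deposit) is an invented example, not code from any real system;
# run with pytest to get feedback in seconds.

def validate_deposit(amount_cents: int, daily_limit_cents: int = 1_000_000) -> bool:
    """Reject non-positive deposits and deposits above the daily limit."""
    return 0 < amount_cents <= daily_limit_cents


def test_rejects_non_positive_amounts():
    assert not validate_deposit(0)
    assert not validate_deposit(-500)


def test_accepts_amount_within_limit():
    assert validate_deposit(25_000)


def test_rejects_amount_over_limit():
    assert not validate_deposit(2_000_000)
```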

Re-Platforming Is Not Necessarily the Answer

Some DevOps proponents believe moving off the mainframe—to a supposedly more modern, commodity server-based distributed architecture—is the best way to circumvent perceived challenges to nimbleness and agility. But this is almost never the best answer. The mainframe retains its distinction as the most powerful computing platform on the planet. It remains deeply embedded in large organizations, setting the standard for reliability, security and availability for mission-critical data and transactions. The new IBM z13 mainframe, for example, can handle the load of 100 Cyber Mondays every day, 365 days a year. Currently, mainframes process 30 billion business transactions per day. The post-modern mainframe is an engineering marvel, unequaled as a system-of-record business compute platform.

The mainframe also offers cost advantages. Studies have shown that the mainframe is more cost-effective than commodity servers in handling the massive increases in computing volume driven by mobile applications, because decreases in commodity server pricing have not kept pace with the growth in those volumes. In addition, leading vendors have worked together to deliver intelligent cost management on the mainframe, enabling workloads to be distributed in a way that keeps mainframe licensing costs (MLCs) down.

In our view, “bimodal” institutionalizes the very problem that must be solved. The answer is not replacing the mainframe, although mainframe development, if left unaddressed, can present obstacles to DevOps. Rather, the answer is to evolve the mainframe to keep up with the pace of DevOps. There are mounting examples from large, high-performing organizations showing that it is possible to apply modern practices to the fullest extent, including to mainframe applications. These organizations do this by giving developers the tools they need to work on mainframe code and data just as they would with any other programming language on any other platform, while meeting the escalating quality expectations of the digital age. For example:

  • Mainstreaming the Mainframe Environment – Mainframe users need to move away from the antiquated “green screen” environment and adopt state-of-the-art interfaces and tools. This is the first step toward enabling developers to work more fluidly on the mainframe, in an environment that is far more familiar and comfortable to them. Organizations that fail to make this shift will have a very difficult time enabling the next generation of development talent to leverage invaluable and irreplaceable mainframe code and data, which will be a major hindrance to DevOps success.
  • Leveraging Tools from the Java Development World – One example is visualization capabilities, which enable developers to quickly and easily identify critical interrelationships between applications and data. This helps them sidestep unintended, undesirable impacts as they modify applications, increasing confidence and avoiding unnecessary delays.
  • Making It Easier to Identify Mainframe Code-Level Issues – Developers need tools that enable them to identify problems with mainframe code; for example, when it is the source of a widespread user performance issue, or when it is consuming too many resources and driving up costs unnecessarily. (A minimal pipeline sketch follows this list.)
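
As a rough sketch of treating mainframe code like any other code in a DevOps toolchain, the script below runs an ordered set of build and test stages and fails fast on the first error. The stage commands (“zbuild”, “ztest”) are hypothetical placeholders for whatever compile and test tooling a given shop exposes on its build agents; this is an assumption-laden illustration, not a vendor-specific implementation.

```python
# A rough sketch of a fail-fast build-and-test pipeline for mainframe code.
# The commands "zbuild" and "ztest" are hypothetical placeholders for whatever
# compile/test tooling a given shop exposes on its build agents.
import subprocess
import sys

STAGES = [
    ("compile", ["zbuild", "--source", "src/cobol"]),  # hypothetical compiler wrapper
    ("unit-test", ["ztest", "--suite", "tests/"]),     # hypothetical test runner
]


def run_pipeline() -> int:
    """Run each stage in order and stop at the first failure so developers
    get feedback as early as possible."""
    for name, cmd in STAGES:
        print(f"[pipeline] running stage: {name}")
        try:
            result = subprocess.run(cmd)
        except FileNotFoundError:
            # The placeholder tooling is not installed on this machine.
            print(f"[pipeline] stage '{name}' failed: command not found")
            return 1
        if result.returncode != 0:
            print(f"[pipeline] stage '{name}' failed; stopping the build")
            return result.returncode
    print("[pipeline] all stages passed")
    return 0


if __name__ == "__main__":
    sys.exit(run_pipeline())
```

The exact tooling will differ by shop; the design point is that mainframe source lives in the same pipeline, with the same fast feedback loop, as code for any other platform.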

Conclusion

In our view, “bimodal” is not a solution but a broken, disadvantaged state of affairs for too many organizations. From a competitive vantage point, bimodal will only hurt DevOps teams and hold back the innovative efforts of the organizations they support.

The bimodal IT model is a road to inevitable failure for those enterprises seeking success in the digital economy. This is especially true for organizations that rely on mainframes. A smartly leveraged mainframe is an extremely valuable competitive asset that needs to be removed from its silo, evolved and smoothly integrated into modern, heterogeneous development environments.

It is possible for big, established companies to adopt DevOps, become agile and move as fast as startups. But the ability to think and act like a startup requires fundamental changes, particularly at the level of software development. The good news is that the legacy technologies often perceived as impediments can actually be a tremendous competitive advantage if they are harnessed and evolved in the right way.

About the Author / Christopher O’Malley

Christopher O’Malley is CEO of Compuware. He has nearly 30 years of IT experience, with past positions including CEO of VelociData, CEO of Nimsoft, EVP of CA’s Cloud Products & Solutions and EVP/GM of CA’s Mainframe business unit, where he led the successful transformation of that division. Connect with him on LinkedIn and Twitter.

Filed Under: Blogs, DevOps at IBM, Variable Speed DevOps Tagged With: Agility, bimodal IT, development, devops, mainframe, waterfall
