Red Hat is looking to expand the number of organizations embracing DevOps best practices by broadening the monitoring capabilities included in Red Hat Enterprise Linux (RHEL).
The beta release this week of RHEL 8.2 includes an updated Performance Co-Pilot (PCP), a system performance toolkit that now ships with collection agents for Microsoft SQL Server 2019 databases.
Scott McBrien, a principal product manager for Red Hat, said that by extending the scope of the performance toolkit, Red Hat is looking to make it easier for IT generalists and developers to collaborate across silos to better maintain application performance levels. Support for Microsoft SQL Server 2019 is an outgrowth of the alliance Red Hat formed with Microsoft in 2015. However, McBrien said IT organizations should expect Red Hat to continue to expand the reach of PCP.
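For readers who want to try the new collector, the setup follows PCP's standard agent (PMDA) conventions. The package name `pcp-pmda-mssql` and the `mssql` metric namespace below are assumptions based on how PCP packages its other agents; check the release notes for the exact names.

```shell
# Hedged sketch: enabling PCP's SQL Server collection agent on RHEL 8.2 Beta.
# Package and metric names are assumptions; verify against the release notes.
sudo dnf install pcp pcp-pmda-mssql         # core toolkit plus the SQL Server agent
sudo systemctl enable --now pmcd pmlogger   # start the metrics daemon and archive logger
cd /var/lib/pcp/pmdas/mssql
sudo ./Install                              # register the agent with pmcd (standard PCP step)
pminfo mssql                                # list the SQL Server metrics now being collected
```

Once the agent is registered, the same `pmrep` and `pmchart` front ends that IT generalists already use for CPU, memory and disk metrics can read the database metrics, which is the cross-silo benefit McBrien describes.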
RHEL 8.2 Beta also simplifies the process of invoking the Red Hat Insights analytics service during the installation process and provides tighter integration with the extended Berkeley Packet Filter (eBPF) to better analyze network traffic.
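The eBPF tooling surfaces in RHEL through the BCC collection of tracing utilities. As a hedged illustration of the kind of network analysis the article refers to (the package name and tool path below reflect common BCC packaging and may differ in the final release):

```shell
# Hedged sketch: tracing TCP session lifetimes with an eBPF-based BCC tool.
# Requires root; paths reflect typical bcc-tools packaging on RHEL.
sudo dnf install bcc-tools
sudo /usr/share/bcc/tools/tcplife   # one line per TCP session: PID, peers, duration, bytes
```

Because eBPF programs run inside the kernel, tools like this observe every connection on the host with low overhead, which is what makes them attractive for diagnosing network-related performance problems.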
Finally, the latest version of RHEL extends the Application Streams capability, which segregates runtimes and languages from the base operating system, to additional programming environments, and provides a tool to test in-place upgrades to RHEL 8 from RHEL 7.
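Application Streams are exposed through the modularity features of `dnf`, so multiple versions of a runtime can coexist as selectable streams. A minimal sketch, with `postgresql:12` used purely as an illustrative component and stream version:

```shell
# Hedged sketch of working with Application Streams via dnf modules.
# The component and stream version are illustrative, not a statement of
# what ships in RHEL 8.2.
dnf module list postgresql              # show the streams available for a component
sudo dnf module enable postgresql:12    # pin a specific stream
sudo dnf module install postgresql:12   # install packages from that stream
```

Keeping fast-moving runtimes in streams is what lets Red Hat update languages and databases on their own cadence without destabilizing the base OS.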
McBrien said Red Hat is trying to keep to a six-month upgrade cycle for RHEL to provide IT organizations with a more predictable timetable. Of course, most of Red Hat's customers are still running RHEL 6/7 releases. However, as the rate at which applications are built, deployed and updated continues to accelerate in the age of DevOps, IT teams will come under more pressure to maintain application performance levels across an increasingly extended enterprise. Inevitably, IT generalists will need access to a broader range of observability tools at all layers of the stack to keep pace.
Adoption of DevOps practices across the enterprise remains uneven at best. However, as more IT generalists are exposed to richer monitoring tools at the operating system level, it becomes easier to augment or supplant legacy ITIL-based approaches to managing IT. In fact, as the tools at the disposal of IT generalists become more sophisticated, the need to rely on various classes of IT specialists declines. Most IT organizations are especially keen to achieve that goal because labor stubbornly remains the single biggest cost of IT.
It remains to be seen to what degree the management of IT will be transformed in the months and years ahead. The one thing that is certain is that IT will become more complex as new architectures based on microservices and serverless computing frameworks are adopted. Each additional layer of abstraction may make it easier for developers to build applications, but from the perspective of IT operations teams, each tends to increase dependencies between systems that are already fairly fragile. The challenge now is to identify where bottlenecks exist between all those systems before a degradation in application performance becomes a full-blown catastrophic event.