NGINX has extended the reach and scope of its namesake application platform, which the company is positioning as a bridge between legacy applications and emerging microservices architectures.
The NGINX Application Platform now includes NGINX Unit 1.0, an open source application server designed to simplify microservices by running multiple application types simultaneously on a single server instance. It supports Go, Perl, Python, PHP and Ruby, with support for Java and JavaScript planned soon. Sidney Rabsatt, vice president of product management at NGINX, said NGINX Unit 1.0 differs from other application servers in that it is configured via an application programming interface (API) and is 30 percent faster than existing app servers.
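To make that API-driven model concrete, below is a minimal sketch of a Python WSGI application of the kind Unit can host. The listener address, application name, file paths and configuration shown in the comment are illustrative assumptions, not details confirmed in the announcement, and the exact configuration schema may vary between Unit releases.

```python
# wsgi.py -- a minimal WSGI application of the kind NGINX Unit can serve.
#
# Unit has no static config file; it is configured at runtime by sending JSON
# to its control API. An illustrative (assumed) example using curl:
#
#   curl -X PUT --data-binary '{
#     "listeners":    { "*:8080": { "application": "hello" } },
#     "applications": { "hello": { "type": "python",
#                                  "path": "/srv/hello",
#                                  "module": "wsgi" } }
#   }' --unix-socket /run/control.unit.sock http://localhost/config

def application(environ, start_response):
    """Standard WSGI entry point; the server invokes this for each request."""
    body = b"Hello from a Unit-hosted app\n"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

Because the whole configuration lives behind that API, the same mechanism can be scripted or driven from a CI/CD pipeline rather than edited by hand.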
In the second quarter, NGINX plans to add new monitoring dashboards, alerting and management via an NGINX Controller R1 offering that provides the capabilities of a service mesh for microservices without requiring a separate dedicated framework. Service meshes are a critical capability because they provide a means to manage interdependent microservices at scale.
Finally, NGINX Plus R15 for enterprise customers builds on the new HTTP/2, gRPC and JavaScript features added to the core open source NGINX platform and adds support for high-availability clustering and OpenID Connect. These capabilities extend the ability to deploy NGINX as a web server, load balancer, API gateway, Kubernetes Ingress controller and sidecar proxy, said Rabsatt.
Rabsatt noted most IT organizations do not want to set up separate infrastructure for managing microservices. Managing legacy and microservices-based applications side by side in DevOps environments will be a fact of life for years to come, he said, adding that, judging by the number of downloads of NGINX software used on Kubernetes platforms running Docker containers, that message is already resonating.
NGINX claims users have pulled a billion NGINX instances from the Docker Store, including more than a million pulls of the NGINX Kubernetes Ingress controller. According to the company, more than 3 million NGINX instances are deployed in production microservices environments across more than 250 customers. That’s critical for the company because some rivals have positioned service meshes as an existential threat to the NGINX platform. Overall, NGINX says more than 11 million instances of NGINX are downloaded each month, an average of four instances every second.
Rabsatt said most developers have made it clear they don’t have the patience to wait for IT operations teams to stand up dedicated microservices infrastructures, and most IT operations teams still don’t have the skills required. NGINX provides the ability to deploy microservices at scale using a platform most developers already know how to programmatically invoke on their own, he said.
At the same time, NGINX is also committed to making available best practices that DevOps teams can employ when trying to manage what amounts to a new type of hybrid computing environment spanning legacy applications and microservices, he added.
It remains to be seen exactly where the divide between microservices and legacy applications will be drawn inside most organizations. Some will draw a proverbial line in the sand, while others will look to deconstruct existing applications into a series of microservices. Regardless of the approach taken, those existing legacy applications will not be disappearing anytime soon.
— Mike Vizard