At its annual user conference in Portland, Oregon, in September, NGINX delivered releases focused on the application platform market and on supporting microservices architectures:
In addition to launching the NGINX Application Platform, the company has added the ability to use NGINX Plus as a Kubernetes Ingress Controller. Based on the open-source NGINX Ingress Controller for Kubernetes, this new feature enables applications deployed anywhere across a Kubernetes or Red Hat OpenShift cluster to be reached by outside traffic.
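As a sketch, a minimal Ingress resource of the kind such a controller consumes might look like the following. The host, service name and port are hypothetical, not taken from the article, and the exact API version and class mechanism depend on the Kubernetes release in use:

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: demo-ingress          # hypothetical resource name
spec:
  ingressClassName: nginx     # routes this Ingress to the NGINX controller
  rules:
  - host: demo.example.com    # hypothetical hostname for outside traffic
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: demo-svc    # hypothetical in-cluster Service
            port:
              number: 80
```

The controller watches for Ingress resources like this one and configures NGINX Plus to route external requests for the listed host to the backing Service.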
While these products combine to deliver a complete and manageable microservices solution, the most interesting of them is NGINX Unit. According to Owen Garrett, head of products, the complement of tools facilitates management of both north/south and east/west traffic for cloud-native applications. Unit itself is small, lightweight, fast, polyglot and programmable through an API, and it works with other Unit instances to deliver a service mesh.
Key features of Unit that should be of particular interest to businesses deploying microservices relate to its support for continuous delivery. For example, when the router accepts a new configuration from the Controller process, its worker threads begin handling new incoming connections under the new configuration, while existing connections continue to be processed under the previous one. In other words, router worker threads can work simultaneously with several generations of configuration.
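As a sketch of what such a configuration push might look like (the listener port, application name, paths and module here are hypothetical, not taken from the article), Unit accepts its configuration as a JSON document over a control socket:

```json
{
  "listeners": {
    "*:8080": { "application": "demo" }
  },
  "applications": {
    "demo": {
      "type": "python",
      "path": "/srv/demo",
      "module": "wsgi"
    }
  }
}
```

Uploading a revised version of this document to the control API (for example, `curl -X PUT --data-binary @config.json --unix-socket /path/to/control.unit.sock http://localhost/config`, with the socket path depending on the installation) replaces the running configuration. Per the behavior described above, in-flight connections finish under the old generation while new connections pick up the new one.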
Additionally, Unit uses shared interprocess memory to communicate with applications, which lets it route HTTP requests with greater agility. Rather than forcing each application to listen on the network directly, Unit lets applications delegate network handling to the service mesh, improving scalability. The router's worker threads accept clients' requests, pass them to the application processes, collect the responses and send them back to the clients. Each worker thread polls epoll or kqueue and can work asynchronously with thousands of simultaneous connections.
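The event-driven pattern described above can be illustrated with a short sketch. This is not Unit's code (Unit is written in C); it simply shows, using Python's standard `selectors` module (which wraps epoll on Linux and kqueue on BSD/macOS), how a single thread can multiplex connections and echo data back without blocking:

```python
import selectors
import socket

def echo_once(sel: selectors.DefaultSelector) -> bytes:
    """Run one iteration of the event loop, echoing any readable data back."""
    events = sel.select(timeout=1)
    for key, _mask in events:
        conn = key.fileobj
        data = conn.recv(1024)
        if data:
            conn.sendall(data)
        return data
    return b""

# Demonstration with a local socket pair standing in for a client connection.
sel = selectors.DefaultSelector()   # epoll on Linux, kqueue on BSD/macOS
client, server = socket.socketpair()
server.setblocking(False)           # router side never blocks on a socket
sel.register(server, selectors.EVENT_READ)

client.sendall(b"hello")
echoed = echo_once(sel)             # server side reads and echoes the request
reply = client.recv(1024)
print(reply)                        # b'hello'
```

A production router registers thousands of such sockets with one selector per worker thread, so each thread services many connections instead of dedicating a thread per connection.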