There’s no shortage of new development trends, but virtual reality, containers, big data, hyperconnected apps and IoT are affecting testing the most today.
If you like change and challenge, it’s a fun time to work in software. Some technology developments, such as cloud computing, have been gaining speed slowly over many years. That steady progression has meant that developers can stay on top of trends easily and be informed about market changes. Other areas, such as containers, have come to fruition much faster without as much notice. Development teams have had to scramble to keep up, hoping to capitalize on the efficiency gains while staying ahead of the curve. Here’s a look at five important software trends that are having a profound effect on testing.
Containers

Docker made containerization cool and exciting in 2013, even though containers have arguably been around for many years, if not decades. Developers love containers because the technology makes it much easier to set up and install software in different environments, such as in the public cloud or in a private data center. Containers wrap up all of the infrastructure components that otherwise would be installed separately into one easily portable package with a common standard. And since multiple containers can run on the same hardware without creating problems, you can do more with less infrastructure. Typically, containers can move between different environments with relative ease—a nice benefit for multicloud infrastructures. Containers have evolved quickly from an experimental platform to a mainstream infrastructure technology, with many major businesses taking advantage of containers to deliver their products at scale.
Testing considerations: It’s hard to find much fault with containers when it comes to facilitating software testing. Because test environments are easy to set up and code moves readily from one environment to another during the production cycle, testers can work much faster. Before containers, both developers and testers spent a lot of time waiting. When it took two weeks to set up an environment for a testing cycle that lasted two hours, teams tended toward less frequent releases or less testing. Now, setting up a testing environment is a matter of minutes. This is great for companies and development teams that value testing and want to test more frequently with greater access to a variety of environments.
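As a minimal sketch of what this looks like in practice (the base image, file names and test command here are illustrative assumptions, not a prescription), a disposable test environment can be described in a short Dockerfile that any tester can rebuild identically in minutes:

```dockerfile
# Hypothetical test-environment image; base image and test runner are assumptions.
FROM python:3.11-slim

WORKDIR /app

# Install the application's dependencies in a repeatable way
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Run the test suite by default
CMD ["pytest", "-q"]
```

Because the same file builds the same environment on a laptop, a CI server or a cloud host (`docker build -t app-tests . && docker run --rm app-tests`), configuration drift between dev and test environments largely disappears.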
Another boon with containers is stability. In the past, bugs sometimes appeared due to configuration errors from setting up dev/test environments. Given the plug-and-play nature of containers, there’s little if any concern about the proper configurations being completed before a testing environment can be used. Security has been an ongoing concern with containers, especially in highly regulated sectors such as health care, government and financial services. In those cases, companies may have to support two versions of the code—one in a containerized platform and one in a more traditional virtual or bare-metal environment. We believe this will be a short-term situation as containerization technology matures and becomes adopted across all industries.
Big data

Whether you want to use this vague and overhyped term or not, big data is simply a new way to refer to larger-than-normal data sets. Big data became a big trend as the hardware and software to store and manage it got much more powerful and cheaper, and the shift to the cloud helped to centralize data sets in one location. Some large companies and research organizations are already seeing interesting results from big data projects, whether in more targeted drug development (precision medicine) or just-in-time environmental data for farmers. Now, every Tom, Dick and Harry is trying to get into the big data market, which means there are too many solutions to count. A natural weeding-out process will occur over the next few years as the prominent big data players and NoSQL standards converge into two or three leading providers.
Testing considerations: Big data applications bring value through the way they manage and/or analyze data, not so much through the user interface. Therefore, testing big data applications requires more technical, database-oriented skills. Those skills also differ from the typical relational database skills of the past, because big data platforms are typically designed to manage both structured and unstructured data. These include open source tools and systems such as NoSQL databases and Hadoop, along with large vendor systems such as SAP HANA. Testers in this field will need to be constantly adapting and learning these new technologies.
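To make the skills shift concrete, here is a minimal sketch (the field names and records are invented for illustration) of the kind of check a tester might write against schema-flexible, document-style data, where records are not guaranteed to share the same columns the way relational rows do:

```python
# Validate heterogeneous JSON-style records, as stored in a document database.
# Unlike a relational schema, each record may carry different optional fields,
# so the test checks only a small required core and tolerates extra keys.

REQUIRED_FIELDS = {"id", "timestamp"}  # hypothetical required core

def validate_records(records):
    """Return a list of (index, missing_fields) for records failing the core check."""
    failures = []
    for i, record in enumerate(records):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            failures.append((i, sorted(missing)))
    return failures

records = [
    {"id": 1, "timestamp": "2017-01-01T00:00:00Z", "sensor": "temp", "value": 21.5},
    {"id": 2, "timestamp": "2017-01-01T00:01:00Z", "note": "free-text, unstructured"},
    {"id": 3, "value": 19.0},  # missing "timestamp": should be flagged
]

print(validate_records(records))  # → [(2, ['timestamp'])]
```

The same idea scales up: instead of asserting a fixed schema, big data tests assert invariants that must hold across records of varying shapes.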
Virtual reality (VR)

Investors have funneled $8.8 billion into the sector since 2012, according to SuperData Research. Google, Facebook, Microsoft, Samsung and HTC are all in the game. The market is segmented between VR, augmented reality (AR) and mixed reality (MR) and consists of both traditional (PC/console) and mobile device applications. VR is a fully immersive, fully virtual experience that mimics reality, while AR and MR are different variations of blending real and virtual environments together. VR applications aren’t just for gaming anymore: We now are seeing the technology being used for product design and collaboration, tourism/travel and entertainment, including film.
Testing considerations: VR is still an emerging technology, with many different types of platforms, no clear leaders and no real standards in place. That makes it more difficult for developers and testers to make strategic decisions around technology. As well, there aren’t any great tools for automated testing in these environments yet, so the need for manual and real-world testing will add time and cost to a project. Yet investing in doing this right is critical for ensuring an optimal user experience. In virtual reality environments, it’s difficult to simulate experiences, such as testing for motion sickness, without actually playing with the real devices. This is especially challenging with the newer mobile platforms, such as Google Cardboard, which incorporate a viewing device that transforms the mobile phone into a VR headset. That experience simply can’t be simulated accurately on a PC, which is a common testing strategy for other mobile apps. Companies testing mobile VR apps may need to invest in the purchase of dozens or hundreds of different mobile devices to gain an accurate picture of how the application performs for most users.
Hyperconnected apps

Web-based and consumer apps make up the bulk of hyperconnected apps, which are characterized by deep, often real-time integration with other applications. Instagram and Facebook, Gmail and Google Calendar, Salesforce and its wealth of apps: these are just a few examples of hyperconnectivity. The advent of REST APIs has made integration far easier and cheaper, so any company can build a platform of third-party software in ways it couldn’t have imagined five or 10 years ago. Similarly, with the explosion of the app economy, including open source software, it no longer makes good economic sense to write 100 percent of your application’s code yourself. Product value is increasingly the result of how developers can patch together different technologies in innovative ways.
Testing considerations: Given the importance of APIs in hyperconnected applications, a primary testing consideration will be ensuring the performance validity of those connection points. It’s imperative for both developers and testers to be aware of API update schedules and to test those updates immediately for any negative impact on the application. Beyond API testing, load and performance testing are also important considerations for hyperconnected applications, which typically involve heavy and/or unpredictable transaction processing. Overall, these types of testing are much more technical in nature than manual UI testing and will require testers to expand their skill set to optimize value.
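As a sketch of what a lightweight API contract check might look like (the endpoint shape and field names below are hypothetical, and a real test would issue an HTTP request rather than use a canned response), the idea is to pin down the response structure your application depends on, so that an upstream API update that breaks it is caught immediately:

```python
# A minimal contract check for a JSON API response.
# The "contract" records the fields and types our application depends on;
# running it against each API update surfaces breaking changes early.

CONTRACT = {          # hypothetical fields our integration relies on
    "user_id": int,
    "email": str,
    "created_at": str,
}

def check_contract(response, contract=CONTRACT):
    """Return a list of human-readable violations; an empty list means the contract holds."""
    violations = []
    for field, expected_type in contract.items():
        if field not in response:
            violations.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            violations.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(response[field]).__name__}"
            )
    return violations

# Simulated response from a partner API; extra fields are tolerated.
response = {"user_id": 42, "email": "a@example.com",
            "created_at": "2017-05-01", "plan": "pro"}
print(check_contract(response))  # → []
```

A suite of such checks, run on every upstream release, is a cheap first line of defense before heavier load and performance testing.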
Internet of Things (IoT)
The interest in (and hype) for the IoT sector rivals the largest IT trends, such as big data. Initial applications have focused on connecting devices, such as home lighting and thermostat systems, which use sensors to stream data to smartphones. Over time, we will see more device-to-device connections such as a thermostat collecting data from a smoke detector to ensure that the house isn’t on fire if the temperature rises dramatically. Products such as Amazon Echo and Google Nest might be at the heart of many new consumer-based IoT applications, but there is also ample potential in sectors such as utilities, health care, industrial equipment and government (a.k.a. smart city initiatives).
Testing considerations: As IoT applications mature, connectivity complexity will grow. IoT applications can involve many components: the hardware of the smart device or product and the software that controls it, along with the user applications that interact with the product. When troubleshooting an issue or vulnerability, a tester must investigate the applications that collect, analyze and display the IoT data, the various network connections across which that data travels, the IoT appliance’s hardware and its software. Hardware testing must incorporate typical scenarios such as what happens to the user experience when a device starts to run low on battery or goes offline. Testers will be most valuable when they have experience in both hardware and software testing; alternatively, an organization can have two teams of testers but they must be in very close contact. IoT apps should also be configured to collect and deliver more alerts and logging data than mobile apps, due to the higher number of components involved. This will make it easier to prevent, troubleshoot and fix problems when they arise.
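A minimal sketch of the kind of scenario test this implies (the thresholds and alert messages below are invented for illustration): simulate the device-side states a tester must cover, such as low battery and loss of connectivity, and assert that the user-facing application raises the expected alerts:

```python
# Simulate an IoT device's health states and the alerts the companion app
# should raise. The threshold and message strings are illustrative assumptions.

LOW_BATTERY_THRESHOLD = 0.15  # 15%

def device_alerts(battery_level, online):
    """Map a device's state to the alerts the user-facing app should show."""
    alerts = []
    if not online:
        alerts.append("device offline")
    if battery_level <= LOW_BATTERY_THRESHOLD:
        alerts.append("low battery")
    return alerts

# Typical scenarios a tester would walk through:
print(device_alerts(0.80, online=True))   # → []
print(device_alerts(0.10, online=True))   # → ['low battery']
print(device_alerts(0.50, online=False))  # → ['device offline']
```

In a real IoT test suite, the same scenarios would be driven against actual hardware or a device simulator, with the richer alert and logging data described above feeding the assertions.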
About the author
Kevin Dunne is VP of Strategy and Business Development at QASymphony.