Amazon Web Services (AWS) this week announced AWS OpsHub for Snow Family, a graphical user interface (GUI) tool that provides an alternative to the command-line interface (CLI) it makes available for managing its family of Snow appliances deployed in on-premises environments.
At the same time, AWS announced new Snowball storage appliances that are 25% faster than previous generations thanks to more memory, support for more virtual CPUs and 100Gb networking.
Finally, AWS is also making it possible to control access to Snow appliances via AWS Identity and Access Management (IAM) and to automate tasks using AWS Systems Manager.
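In practice, that means standard AWS tooling can be pointed at a device's local endpoint rather than at the public cloud. The minimal sketch below, written with boto3, creates a read-only operator identity on a device; the endpoint address, port, region name and credentials are illustrative assumptions, not details taken from the announcement.

```python
"""Minimal sketch: scoping access via a Snow device's local IAM service.

Assumptions (not from the announcement): the device has been unlocked, its
local IAM endpoint is reachable at the placeholder address below, and the
root credentials have already been retrieved from the device.
"""
import json
import boto3

iam = boto3.client(
    "iam",
    endpoint_url="https://192.0.2.10:6078",  # hypothetical device endpoint and port
    region_name="snow",                      # placeholder region name for local services
    aws_access_key_id="ROOT_KEY",            # placeholder root credentials
    aws_secret_access_key="ROOT_SECRET",
)

# A policy that limits an operator to read-only S3 actions on the device.
read_only_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": "*",
        }
    ],
}

policy = iam.create_policy(
    PolicyName="SnowReadOnly",
    PolicyDocument=json.dumps(read_only_policy),
)
iam.create_user(UserName="edge-operator")
iam.attach_user_policy(
    UserName="edge-operator",
    PolicyArn=policy["Policy"]["Arn"],
)
```

The same credentials-and-endpoint pattern applies to other local services on the appliance, so existing automation written against AWS APIs carries over with little change.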
Bill Vass, vice president of technology for AWS, said organizations can now build storage clusters as large as 1.4PB using Snow appliances.
AWS originally rolled out Snow appliances to provide an easier way to upload massive amounts of data to a public cloud. An organization loads data onto an appliance and physically ships it to AWS, where AWS personnel transfer that data onto the AWS cloud.
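The data-loading step itself is typically scripted against the appliance's S3-compatible interface. Below is a minimal sketch of that step; the endpoint address, credentials and bucket name are placeholder assumptions for illustration, and the device is assumed to be already unlocked.

```python
"""Minimal sketch: staging data on a Snow device's S3-compatible interface.

Assumptions (not from the announcement): the device is unlocked, exposes its
S3-compatible endpoint at the placeholder address below, and the bucket
"ingest-bucket" already exists on the device.
"""
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://192.0.2.10:8443",  # hypothetical device S3 endpoint
    aws_access_key_id="DEVICE_KEY",          # placeholder local credentials
    aws_secret_access_key="DEVICE_SECRET",
)

# Stage a local dataset on the appliance; AWS imports the bucket contents
# into the cloud after the device is shipped back.
s3.upload_file("backups/archive-2020-04.tar", "ingest-bucket", "archive-2020-04.tar")
```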
However, with the rise of edge computing, AWS is now also employing Snowball appliances to drive hybrid applications in internet of things (IoT) and other environments. AWS makes its AWS IoT Greengrass framework available for building hybrid cloud applications. The Snow appliance family provides a more efficient and secure alternative to the rival edge computing platform from Dell Technologies, said Vass.
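In those hybrid scenarios, application logic is typically packaged as functions that run on a Greengrass core at the edge and report back to the cloud. The sketch below shows what such a function might look like using the Greengrass (V1) Core SDK for Python; the topic name, threshold and event shape are hypothetical.

```python
"""Minimal sketch: a Greengrass (V1) Lambda function running at the edge.

Assumptions (not from the announcement): the function is deployed to a
Greengrass core on the appliance and has permission to publish to the
example topic below.
"""
import json
import greengrasssdk  # Greengrass Core SDK, available on the core device

client = greengrasssdk.client("iot-data")

def handler(event, context):
    # Filter sensor readings locally and forward only anomalies to the cloud.
    if event.get("temperature_c", 0) > 80:  # hypothetical threshold
        client.publish(
            topic="plant/alerts",  # hypothetical topic name
            payload=json.dumps(event),
        )
    return {"processed": True}
```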
The GUI from AWS is intended to make it easier for the average IT administrator to provision Snow appliances, a task previously reserved for IT professionals who had mastered the Snow CLI. AWS is even going so far as to include cartoons that demonstrate how to provision its appliances.
While AWS clearly dominates the public cloud, it is not clear to what degree it will be able to extend that dominance to edge computing. AWS is counting on the fact that large numbers of edge computing applications will need to be tightly integrated with application workloads running on its public cloud. Providers of traditional data center platforms are also vying to supply those edge computing platforms, all of which can now be centrally managed via the cloud.
Edge computing platforms will, of course, vary widely in size depending on the use case. However, there may soon come a day when the total number of workloads deployed at the edge exceeds the number of workloads in the cloud. The amount of compute and storage horsepower at the edge will be crucial as organizations look to deliver next-generation applications that process and analyze data in real time, especially as 5G wireless networks simultaneously make more network bandwidth available.
AWS may not be the first IT vendor that comes to mind for edge computing platforms, but it is signaling its intention to be a lot more than just a provider of cloud computing services.