“You have to be able to ask questions of your data in real-time and then respond to the answers,” says Apurva Dave, VP of Marketing, Jut. Jut leverages live streaming, batch analytics, and visualization to enable enterprises to ask important questions of their operational data using the Jut operations data hub, which Jut created with DevOps goals (such as Continuous Deployment with Quality) in mind.
Jut supports real-time and historical data queries across all log, event, and metrics data. “Our platform is built on dataflow, dealing with large scale streaming data,” says Dave. Jut is an option for organizations that don’t want to go out and build their own operations data hub software.
Core Data Types
Jut joins user activity data (from websites and software), event data (operational triggers such as support-ticket filings, customer transactions, and software upgrades or migrations), and unstructured data (application and system logs) for analysis, says Dave. Jut also works with structured metrics data, whether application-level (such as user counts and key-action counts) or system-level (such as CPU utilization, memory utilization, and network I/O), explains Dave.
Jut analyzes your data as a whole to answer questions such as: “What happened when I deployed a new version of my application into production?” Jut might answer by correlating system resource utilization (“Was the app too resource-intensive this time? More so than we expected?”).
Or you may ask Jut this question: “How did my code respond to user activity and user demand? How did it perform?” Jut could answer this question based on its assimilation of both unstructured and structured data. “It’s common for enterprises to A/B test two separate versions of a site or software feature. Using Jut, you can ingest those test results and analyze them in real-time to determine what version produces the desired effect,” says Dave.
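Dave’s A/B-testing scenario could, in principle, be expressed as a short Juttle flow. The sketch below is purely illustrative: the field names (`event`, `variant`) and the read options are assumptions for the sake of example, not Jut’s actual schema.

```
// Hypothetical sketch: count conversion events by A/B variant
// over the last hour, then render the comparison as a bar chart.
read -last :1 hour: event = 'conversion'
  | reduce conversions = count() by variant
  | view barchart
```

The same flow could end in `view table` instead of a chart when raw numbers are all that is needed.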
Big Data Analytics & a Huddle about Juttle
“We have users ingesting tens of billions of data points per month. That number is growing fast,” says Dave. Jut targets big data management for easier analysis, using data infrastructure techniques that simplify, integrate, and manage big data backend infrastructure. “We also use our own dataflow language to quickly iterate and visualize lots of data,” says Dave. All this is designed to ease DevOps teams’ work in accessing and assessing operational data to discern how the software, and the business it serves, are performing.
Jut designed its Juttle dataflow programming language for data analysis and data visualization. And Juttle is dedicated to those tasks alone. “While you could try to cobble together the right packages and libraries in a general language like Python or Java to do these analytics, it would be relatively messy, time-consuming, and error prone,” says Dave.
Juttle is an extensible language enabling analytics and visualization. “Jut’s data scientist built an anomaly detection algorithm entirely in Juttle and then published it on GitHub for anyone to use inside Jut,” says Dave.
Juttle uses a structured approach to querying your data. “The approach looks like a series of piped UNIX commands, so anyone familiar with scripting languages such as JavaScript or query languages such as SQL should feel at home with Juttle,” says Dave. Jut responds to queries in seconds, or tens of seconds for larger data sets, enabling iterative, interactive analysis.
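To see what that piped, UNIX-style structure looks like, here is a minimal illustrative Juttle flow. The data source, field names, and options are assumptions made for this example rather than a verified query against Jut’s platform.

```
// Illustrative Juttle flow: count log lines by severity in
// five-minute buckets over the past day, then chart them over time.
read -last :1 day: source_type = 'syslog'
  | reduce -every :5 minutes: count() by severity
  | view timechart
```

Each stage reads like a shell pipeline: a source (`read`), one or more transformations (`reduce`), and a sink (`view`).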
Investigate, Question, Consider (Try?)
Outsourcing to Jut puts the enterprise in a position to focus on existing coding duties while evaluating whether Jut is the better operational analytics move for analytics performance and cost savings (Jut offers a free trial). But as with the analytics themselves, ask a lot of questions of Jut. Do your homework before committing even to trial software (Jut has a demo and an online “playground” as well).
Tip: Jut makes glowing testimonials from its beta customers, such as NPM and Mylio, available. Rather than repeat those here for your consideration, I suggest you reach out through your connections to these and other enterprises that are experimenting with this data hub software.