Three ways to make data deliver: Route, reduce, transform
Written by Ian Tinney
July 7, 2021
There are numerous ways to control your spending on data analysis. Ian Tinney explores the options and how you can apply them in the real world.
Businesses produce and utilise zettabytes of data every year, so to ensure your data analysis platform doesn’t become a very expensive storage space, you’ll want to control data ingestion without throttling it. That means you need visibility of the data in flight, so you can observe and shape it before it reaches an expensive analysis platform.
Data is spread across a variety of systems, and agents are often used to move it from its source to an analysis platform. Agents take care of some of the difficult tasks associated with reading log data, such as file locking, seek pointers, and encrypting and compressing data in transit. But agents are not designed to perform complex routing or transformation tasks on the data.
A stream processor is able to sit between the source and destination and perform a series of useful functions that can deliver important benefits.
By examining data between its source and destination, we can determine its nature and value, and therefore where it needs to be sent. IT and business teams often want to use the same datasets for different purposes, so the data may need to be routed to more than one destination.
Businesses therefore typically control their data analysis costs by choosing to…
Route it to different technology platform destinations for a variety of business purposes, without the need for additional agents
Reduce it by dropping, sampling or suppressing events to make sure they are only consuming the data they need
Transform it in flight by reshaping it into different formats to support different business requirements
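To make the three methods concrete, here is a minimal sketch in Python of what a stream processor does conceptually. The event fields, routing rules and destination names are illustrative assumptions for this example, not Cribl’s actual API:

```python
import json

# Hypothetical event stream; in practice this would arrive from an agent,
# syslog feed or HTTP source. Field names here are illustrative only.
events = [
    {"source": "web", "level": "DEBUG", "msg": "cache hit", "user": "alice"},
    {"source": "web", "level": "ERROR", "msg": "timeout", "user": "bob"},
    {"source": "db",  "level": "INFO",  "msg": "checkpoint", "user": "carol"},
]

def reduce_events(stream):
    """Reduce: drop low-value DEBUG events before they reach a paid platform."""
    return [e for e in stream if e["level"] != "DEBUG"]

def transform(event):
    """Transform: reshape in flight, e.g. mask the user field and
    rename fields to match the destination's schema."""
    return {"src": event["source"], "severity": event["level"],
            "message": event["msg"], "user": "***"}

def route(event):
    """Route: choose a destination per event, without extra agents."""
    return "security-siem" if event["severity"] == "ERROR" else "metrics-lake"

# Run the pipeline: reduce, then transform, then route each surviving event.
destinations = {}
for e in map(transform, reduce_events(events)):
    destinations.setdefault(route(e), []).append(e)

print(json.dumps(destinations, indent=2))
```

Here the DEBUG event never leaves the pipeline, every event has its user field masked, and errors land in a different destination from routine metrics: reduce, transform and route in around thirty lines.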
Let’s take a more detailed look at each method and what they involve in practice.
Real world results
These aren’t just hypothetical examples – they’re real-world instances where customers are using Cribl LogStream to control and manage their data. LogStream can receive data from any source and streamline and reshape it before sending it on to one or multiple destinations. So you can combine all your data flows and use one tool to parse, restructure and enrich data in flight before you pay to analyse it, shrinking consumption costs.
If you’d like to see Cribl LogStream in action, why not sign up for a demonstration? Or, if you’d like to give it a spin, register for a free trial.