
Latest Posts

Telemetry Data Compliance Module

Telemetry data sent from applications often contains Personally Identifiable Information (PII) such as names, user IDs, and phone numbers. This information must be obfuscated before the data is sent to storage or observability tools in order to comply with corporate or government policies such as HIPAA in the US or the GDPR in the EU.
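As a rough sketch of the idea (not Mezmo's actual module), a redaction step can mask common PII shapes before an event leaves the pipeline. The patterns below are simplified illustrations; production compliance modules use far more robust detection:

```python
import re

# Hypothetical patterns for common PII shapes; real compliance modules
# use stronger detection (validation rules, entity recognition, etc.).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(event: str) -> str:
    """Mask PII in a log event before it is routed to storage."""
    for label, pattern in PII_PATTERNS.items():
        event = pattern.sub(f"<{label.upper()}_REDACTED>", event)
    return event

print(redact("user jane@example.com called from 555-123-4567"))
```

Running the redaction step over each event in-stream means downstream destinations never receive the raw values.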

Pipeline Module: Event to Metric

At the most abstract level, a data pipeline is a series of steps for processing data, where the type of data being processed determines the types and order of the steps. In other words, a data pipeline is an algorithm, and standard data types can be processed in a standard way, just as solving an algebra problem follows a standard order of operations.
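A minimal sketch of such a step, assuming events are dicts with a `status` field (an illustrative schema, not a prescribed one): raw request events roll up into a metric, here a count per HTTP status code.

```python
from collections import defaultdict

def events_to_metrics(events):
    """One pipeline step: aggregate raw request events into a metric
    (a count of requests per HTTP status code)."""
    counts = defaultdict(int)
    for event in events:
        counts[event["status"]] += 1
    return dict(counts)

events = [
    {"path": "/login", "status": 200},
    {"path": "/login", "status": 500},
    {"path": "/home", "status": 200},
]
print(events_to_metrics(events))  # {200: 2, 500: 1}
```

Because the step is a pure function of its input events, it composes with other steps in whatever order the data type requires.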

OpenTelemetry: The Key To Unified Telemetry Data

OpenTelemetry (OTel) is an open-source framework designed to standardize and automate telemetry data collection, enabling you to collect, process, and distribute telemetry data from your system across vendors. Telemetry data is traditionally in disparate formats, and OTel serves as a universal standard to support data management and portability.
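To illustrate the vendor-neutral idea, a minimal OpenTelemetry Collector configuration wires a standard OTLP receiver through a batch processor to any backend exporter; the endpoint below is a placeholder, not a real service:

```yaml
receivers:
  otlp:
    protocols:
      grpc:

processors:
  batch:

exporters:
  otlphttp:
    endpoint: https://backend.example.com  # placeholder backend

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp]
```

Swapping vendors then means changing the exporter block, not re-instrumenting the application.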

What's New With Mezmo: In-stream Alerting

Here at Mezmo, we see the purpose of a telemetry pipeline as helping you ingest, profile, transform, and route data to control costs and drive actionability. There are many ways to do that, as we’ve previously discussed in our blogs, but today I’m going to talk about real-time alerting on data in motion: yes, on streaming data, before it reaches its destination.
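A toy sketch of the concept (window size, threshold, and event schema are all illustrative assumptions, not Mezmo's implementation): evaluate a sliding window over the stream and fire before the data ever lands in a destination.

```python
from collections import deque

def stream_alerts(events, window=5, error_threshold=3):
    """Emit an alert while data is still in motion: whenever the last
    `window` events contain at least `error_threshold` errors."""
    recent = deque(maxlen=window)
    for event in events:
        recent.append(event)
        errors = sum(1 for e in recent if e["level"] == "error")
        if errors >= error_threshold:
            yield {"alert": "error spike", "errors_in_window": errors}

stream = [{"level": "error"}, {"level": "info"},
          {"level": "error"}, {"level": "error"}]
for alert in stream_alerts(stream):
    print(alert)  # fires once the third error enters the window
```

Because the check runs per event rather than per query, the alert latency is bounded by the stream itself, not by a storage-and-query cycle.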

Webinar Recap: Mastering Telemetry Pipelines - A DevOps Lifecycle Approach to Data Management

In our webinar, Mastering Telemetry Pipelines: A DevOps Lifecycle Approach to Data Management, hosted by Mezmo’s Bill Balnave, VP of Technical Services, and Bill Meyer, Principal Solutions Engineer, we showcased a unique data-engineering approach to telemetry data management that comprises three phases: Understand, Optimize, and Respond.

Open-source Telemetry Pipelines: An Overview

Imagine a well-designed plumbing system with pipes carrying water from a well, a reservoir, and an underground storage tank to various rooms in your house. It will have valves, pumps, and filters to ensure the water is of good quality and is supplied with adequate pressure. It will also have pressure gauges installed at some key points to monitor whether the system is functioning efficiently. From time to time, you will check the pressure and water purity and look for any issues across the system.

SRECon Recap: Product Reliability, Burnout, and More

I recently attended SRECon in San Francisco on March 18 - 20, a conference for engineers who care deeply about site reliability, systems engineering, and working with complex distributed systems at scale. While there were a lot of talks, I’ll focus on a few areas that gave me the most insight into how having the right data impacts an SRE’s and an organization’s success.

Webinar Recap: How to Manage Telemetry Data with Confidence

In our recent webinar hosted by Bill Balnave, VP of Technical Services, and Brandon Shelton, our Solution Architect, we discussed how data's continuous growth and dynamic nature cause DevOps and security teams to lose confidence in their data. Uncertainty about the content of telemetry data, concerns about its completeness, and worries about sending sensitive PII in data streams all reduce trust in the collected and distributed data.

Webinar Recap: Myths and Realities in Telemetry Data Handling

Telemetry data is growing exponentially, but the business value isn’t increasing at a similar pace. Getting the right telemetry data is hard, so I recently had a conversation with Matt Aslett, Director of Research at Ventana Research, now a part of ISG, about five myths and realities in telemetry data handling.