
Analytics

Snowflake Vs. AWS Vs. Azure: Which Should You Use?

Millions of organizations across every industry use data to improve strategies, products, and services. Yet there are dozens of cloud data platforms on the market, and picking the right one for your needs can be a challenge. This in-depth comparison guide will help you decide between the data clouds that Snowflake, AWS, and Azure offer.

Maximize cloud efficiency: how to cut costs and keep uptime strong

As you move more of your workloads to the cloud, infrastructure costs can quickly become a significant expense. To run your applications in the cloud efficiently and successfully, it’s crucial to examine your overall TCO (Total Cost of Ownership) through multiple lenses. In this fireside chat, Mike Zimberg, CTO at Digital Asset Research, shares how his team achieved 99.99% uptime while lowering their managed Apache Kafka® costs by 10% and improving their performance by 10x. Heikki Nousiainen, Aiven’s field CTO and co-founder, shares his expertise in the discussion as well.

How to monitor Snowflake with Grafana Cloud

Snowflake is a cloud-based data warehousing platform that allows organizations to store, manage, and analyze large amounts of data. It offers a scalable, secure, and highly available solution that separates storage and computing resources. We already offer the Snowflake datasource plugin, which allows you to query and visualize data from your Snowflake Data Cloud on your Grafana dashboards.

Exponential Smoothing: A Beginner's Guide to Getting Started

Exponential smoothing is a time series forecasting method that uses an exponentially weighted average of past observations to predict future values. This method assigns more weight to recent observations and less to older observations, allowing the forecast to adapt to changing trends in the data. The resulting forecast is a smoothed version of the original time series that is less affected by random fluctuations or noise in the data.
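To make the weighting concrete, here is a minimal sketch of simple exponential smoothing in plain Python. The function name and sample values are illustrative, not from any particular library:

```python
def exponential_smoothing(series, alpha):
    """Simple exponential smoothing.

    Each smoothed value is a weighted average of the current
    observation and the previous smoothed value, so the weight
    given to older observations decays exponentially.
    """
    if not 0 < alpha <= 1:
        raise ValueError("alpha must be in (0, 1]")
    smoothed = [series[0]]  # initialize with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# With alpha = 0.5, each new value splits the weight evenly between
# the latest observation and everything that came before it.
print(exponential_smoothing([10, 12, 14], 0.5))  # [10, 11.0, 12.5]
```

An alpha close to 1 makes the forecast track recent observations closely; a small alpha makes it change slowly and smooth out more of the noise.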

Data Aggregation 101: Definition, Uses & Benefits

By 2025, more than 180 zettabytes of data will have been created and captured around the globe. With the influx of data in this digitized world, the process of data aggregation has become an essential tool for businesses. It’s a way to take large amounts of separate data from different sources and combine them into one collective body.
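As a toy illustration of the idea, the sketch below combines records from two hypothetical sources into a single total per category, using only the Python standard library; all names and figures here are made up:

```python
from collections import defaultdict

# Hypothetical records pulled from two separate sources
source_a = [("books", 120), ("games", 80)]
source_b = [("books", 60), ("music", 40)]

def aggregate(*sources):
    """Combine records from multiple sources into one total per category."""
    totals = defaultdict(int)
    for source in sources:
        for category, amount in source:
            totals[category] += amount
    return dict(totals)

# The separate record sets collapse into one collective summary.
print(aggregate(source_a, source_b))
# {'books': 180, 'games': 80, 'music': 40}
```

Real aggregation pipelines apply the same pattern at scale, with richer grouping keys and summary statistics beyond a simple sum.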

An Introduction to Using OpenTelemetry & Python Together

This post was written by Mercy Kibet, a full-stack developer with a knack for learning and writing about new and intriguing tech stacks. In today’s digital world, software applications are becoming increasingly complex and distributed, making it more challenging than ever to diagnose and troubleshoot issues when they arise.

Left, Right, Center: A 3 Step Dance to Success with Building Data Pipelines

Remember the first time you were at a wedding or a party and learned about dances like The Electric Slide? You know, those dances with a clear structure and steps to follow, which were a huge help to someone slightly challenged on the dance floor, like me? All you had to do was learn a few simple steps, and you could hang with even the best dancers.