
The Future of Dashboards: Git Sync, SQL Expressions, and Dynamic Layouts | Big Tent S3E5

In this episode of Grafana’s Big Tent, Grafana founder Torkel Ödegaard joins Mat Ryer and Tom Wilkie for a wide-ranging conversation about how Grafana began, why design and usability mattered from day one, and how the project evolved into a platform used by tens of millions — from developers to power stations and even space missions.

Resolve's Zero Ticket Minute - Ep. 8

Proof-of-concept season is over. In Zero Ticket Minute Ep. 8, Ian Coppock explains why AI is now judged by results, not demos. If it’s not reducing downtime, cutting costs, or stopping 2 a.m. pages, it’s not delivering value. The shift from experimentation to execution is here.

Grafana 12, from the founder's perspective: design, scale, and the next chapter

Sometimes the most interesting engineering stories don’t start with a roadmap or a release plan—they start with personal taste. A preference for good design. A frustration with clunky tools. A desire to see everything in one place.

Building with the InfluxDB 3 MCP Server & Claude

The InfluxDB 3 Model Context Protocol (MCP) server lets you manage and query InfluxDB 3 (Core, Enterprise, Dedicated, Serverless, Clustered) using natural language through popular LLM tools like Claude Desktop, ChatGPT Desktop, and other MCP-compatible agents. The setup is straightforward. In this article, we will focus on setting up InfluxDB 3 Enterprise using Docker with Claude Desktop.
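The article walks through the full setup; as a rough sketch only, the two pieces involved are starting InfluxDB 3 Enterprise in Docker and registering the MCP server in Claude Desktop's config file. The image tag, package name, ports, and environment variable names below are illustrative assumptions, not taken from the source — check the official InfluxDB 3 documentation for the exact values.

```shell
# Start InfluxDB 3 Enterprise in a container (image tag and flags are assumptions;
# consult the InfluxDB 3 Enterprise docs for the current image and required options).
docker run -d --name influxdb3-enterprise \
  -p 8181:8181 \
  -v "$PWD/influxdb3-data:/var/lib/influxdb3" \
  influxdb:3-enterprise

# Then register the MCP server in Claude Desktop's config file
# (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json).
# The "command"/"args" values and env variable names are hypothetical placeholders:
#
# {
#   "mcpServers": {
#     "influxdb3": {
#       "command": "npx",
#       "args": ["-y", "@influxdata/influxdb3-mcp-server"],
#       "env": {
#         "INFLUX_DB3_HOST": "http://localhost:8181",
#         "INFLUX_DB3_AUTH_TOKEN": "<your-token>"
#       }
#     }
#   }
# }
```

After restarting Claude Desktop, the agent can discover the server's tools and translate natural-language questions into InfluxDB 3 queries.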

Fortune 500 Companies Lost $43.6M Each in Five Days. Still Think Operational Risk Is an IT Problem?

The Optus CEO resigned in 2023 after a routine software upgrade killed emergency services for an entire day. Two years and $12 million in fines later, the exact same failure happened again. Same root cause, same chaos, different executive taking the fall. Governance expert Helen Bird's diagnosis was surgical.

Track OpenAI Spend: Explain Where Your OpenAI Budget Goes

The inevitable happened. A while back, Gartner projected that in 2026, 30–50% of all new SaaS product features would use LLM inference. That meant OpenAI-style costs would become a standard part of SaaS COGS. Today, OpenAI has become one of the most operationally significant line items for SaaS companies. But for many teams, this creates an uncomfortable gap. Engineering sees OpenAI as a fast path to innovation.

Context engineering: The missing layer for trusted AI in financial services

Financial services AI demands more than models and prompts. Context engineering provides real-time, governed, and explainable intelligence with Elastic serving as the foundational context layer. Artificial intelligence in financial services is no longer constrained by model capability. The real bottleneck is context.