
Latest News

What is log management, and why is it important?

Logs are like digital footprints, or a letter that developers write to their future selves. They track every action or event that takes place within your software, applications, and IT infrastructure. They record important details such as when an action took place, the host name, the type of action, the application used, and more.
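To make that concrete, here is a minimal sketch (not from the article) of emitting one such record as a structured JSON line in Python; the field names and the log_event helper are illustrative assumptions:

```python
import json
import logging
import socket
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")

def log_event(action: str, app: str) -> None:
    """Emit one structured log record as a JSON line (hypothetical helper)."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),  # when the action took place
        "host": socket.gethostname(),                         # host name
        "action": action,                                     # type of action
        "app": app,                                           # application used
    }
    logging.info(json.dumps(record))

log_event("user.login", "billing-service")
```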

What is a log management tool?

Log management and analysis tools give you real-time visibility into how your users are interacting with your apps and systems. Many of these tools include a sophisticated visual dashboard for analyzing data at a glance. They also offer your DevSecOps teams deeper insights and opportunities to improve code quality, boost productivity, and reduce risk. So what should the best log management tools do to set your team up for success?

ChatGPT praise and trepidation - cyber defense in the age of AI

ChatGPT has taken the world by storm, so much so that we are all left guessing how far this will go. And it’s not a trivial question, as it relates to the future of humanity itself. On one extreme, technology is advancing rapidly enough to synthesize one of the most fundamental parts of our existence: communicating naturally with one another. That can be a scary thought.

Best Practices for SOC Tooling Acquisition

Your Security Operations Center (SOC) faces complex challenges every day in keeping corporate data safe and in the right hands. The right tooling is critical for success, but deciding when, and how, to invest in SOC tooling is complex and challenging across organizations. There’s a ton of vendor spin out there, and it’s important to understand what’s real and what isn’t.

Reduce compliance TCO by using Grafana Loki for non-SIEM logs

Compliance is a term commonly associated with heavily regulated industries such as finance, healthcare, and telecommunications. But in reality, it touches nearly every business today, as governments and other regulatory agencies seek to enact tighter controls over the use of our collective digital footprint. As a result, more and more companies need to retain a record of every digital transaction under their control.

Elastic Observability: Built for open technologies like Kubernetes, OpenTelemetry, Prometheus, Istio, and more

For operations engineers (SREs, IT operations, DevOps), managing technology and data sprawl is an ongoing challenge. Cloud Native Computing Foundation (CNCF) projects, from Kubernetes and OpenTelemetry to Prometheus and Istio, are helping minimize sprawl and standardize technology and data. Kubernetes and OpenTelemetry in particular are becoming the de facto standards for deploying and monitoring cloud native applications.
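As a rough illustration of that standardization, here is a minimal sketch of emitting a trace span with the OpenTelemetry Python SDK; the console exporter and the service and span names are assumptions chosen to keep the example self-contained (a real deployment would typically export to an OTLP-compatible backend instead):

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Wire up a tracer provider that prints finished spans to stdout.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("checkout-service")  # hypothetical service name

with tracer.start_as_current_span("process-order"):
    pass  # application work happens here; the span records its timing and metadata
```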

Data Denormalization: Pros, Cons & Techniques for Denormalizing Data

The amount of data organizations handle has created the need for faster data access and processing. Data denormalization is a widely used technique for improving database query performance. This article discusses data denormalization, why it matters, how it differs from data normalization, and common denormalization techniques. Importantly, I’ll also look at the pros and cons of this approach.
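For a feel of the trade-off, here is a hedged SQLite sketch (the table and column names are hypothetical): the normalized schema needs a JOIN to answer a common read, while the denormalized copy answers it from a single table, at the cost of duplicated data and harder updates:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Normalized: each fact lives in exactly one place.
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme Corp');
    INSERT INTO orders VALUES (100, 1, 49.99);
""")
# Reading an order's customer name requires a JOIN.
print(conn.execute("""
    SELECT o.id, c.name, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id
""").fetchall())

# Denormalized: the customer name is copied onto each order row.
conn.executescript("""
    CREATE TABLE orders_denorm (id INTEGER PRIMARY KEY, customer_name TEXT, total REAL);
    INSERT INTO orders_denorm VALUES (100, 'Acme Corp', 49.99);
""")
# The same answer with no JOIN: faster reads, redundant storage.
print(conn.execute("SELECT id, customer_name, total FROM orders_denorm").fetchall())

conn.close()
```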

Data lake vs. data mesh: Which one is right for you?

What’s the right way to manage growing volumes of enterprise data while providing the consistency, data quality, and governance required for analytics at scale? Is centralizing data management in a data lake the right approach? Or is a distributed data mesh architecture right for your organization? When it comes down to it, most organizations seeking these solutions are looking for a way to analyze data without having to move or transform it via complex extract, transform, and load (ETL) pipelines.

The future of observability: Trends and predictions business leaders should plan for in 2023 and beyond

If the past year has taught us anything, it’s that the more things change, the more they stay the same. The whiplash pivot from the post-pandemic go-go economy to a belt-tightening macroeconomic environment, induced by higher inflation and interest rates, has been seen before, but rarely this quickly. Technology leaders have always had to do more with less, but this slowdown may be more unpredictable, longer, and more pronounced than expected.

Transforming Your Data With Telemetry Pipelines

Telemetry pipelines are a modern approach to monitoring and analyzing systems: they collect, process, and analyze data from different sources (metrics, traces, and logs). They are designed to provide a comprehensive view of a system’s behavior and to surface issues quickly. Data transformation is a key aspect of telemetry pipelines, as it allows data to be modified and shaped to make it more useful for monitoring and analysis; a sketch of one such transformation step follows.
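Here is a minimal sketch of such a transformation stage, assuming Python and illustrative record fields and rules: each raw record is enriched with missing metadata, stripped of PII, and normalized to consistent units before being forwarded downstream:

```python
from typing import Any

def transform(record: dict[str, Any]) -> dict[str, Any]:
    """Shape one raw telemetry record for downstream analysis (hypothetical rules)."""
    shaped = dict(record)
    shaped.setdefault("env", "production")                      # enrich: add missing metadata
    if "user_email" in shaped:
        shaped["user_email"] = "<redacted>"                     # redact: drop PII before storage
    shaped["duration_ms"] = shaped.pop("duration_s", 0) * 1000  # normalize: seconds to milliseconds
    return shaped

raw = {"source": "api-gateway", "duration_s": 0.42, "user_email": "a@example.com"}
print(transform(raw))
# {'source': 'api-gateway', 'user_email': '<redacted>', 'env': 'production', 'duration_ms': 420.0}
```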