
Logging

The latest news and information on log management, log analytics, and related technologies.

BindPlane Agent Resiliency

A quick video about the resiliency of your BindPlane agent, showing which parameters to tweak on destinations and the best collector architecture to ensure you're not losing any data. About observIQ: observIQ brings clarity and control to our customers' existing observability chaos. How? Through an observability pipeline: a fast, powerful, and intuitive orchestration engine built for the modern observability team. Our product is designed to help teams significantly reduce cost, simplify collection, and standardize their observability data.

Automatic log level detection reduces the cognitive load of identifying anomalies at 3 a.m.

Let’s face it: when that alert goes off at 2:58 a.m., abruptly shaking you out of a deep slumber because a high-priority issue is hitting the application, you’re not 100% “on”. You need to shake the fog out of your head to focus on the urgent task of fixing the problem. This is where having the best log analytics tool can take on some of that cognitive load. Sumo Logic recently released new Log Search query features that automatically detect log levels.
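Sumo Logic's detection logic is internal to its Log Search engine, but the general idea can be sketched as a simple heuristic: scan each log line for a known severity keyword and normalize it to a canonical level. The pattern and alias mapping below are illustrative assumptions, not Sumo Logic's implementation.

```python
import re

# Hypothetical regex heuristic for spotting severity keywords in log lines.
LEVEL_PATTERN = re.compile(
    r"\b(TRACE|DEBUG|INFO|WARN(?:ING)?|ERROR|FATAL|CRITICAL)\b",
    re.IGNORECASE,
)

def detect_log_level(line: str) -> str:
    """Return the first recognized log level in a line, or 'UNKNOWN'."""
    match = LEVEL_PATTERN.search(line)
    if not match:
        return "UNKNOWN"
    level = match.group(1).upper()
    # Normalize common aliases to a canonical set.
    return {"WARNING": "WARN", "CRITICAL": "FATAL"}.get(level, level)

lines = [
    "2024-01-05 02:58:01 ERROR payment-svc: timeout calling gateway",
    "2024-01-05 02:58:02 warning cache-svc: eviction rate high",
    "2024-01-05 02:58:03 some free-form text with no level",
]
for line in lines:
    print(detect_log_level(line))  # ERROR, WARN, UNKNOWN
```

A real engine would also weigh field names (e.g. a structured `severity` key) and message position, but even a keyword scan like this is enough to pre-sort lines so the anomalous ones surface first.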

Dark Data: Discovery, Uses, and Benefits of Hidden Data

Dark data is all of the unused, unknown, and untapped data across an organization. This data is generated by users’ daily online interactions with countless devices and systems — everything from machine data to server log files to unstructured data derived from social media. Organizations may consider this data too old to provide value, incomplete or redundant, or locked in a format that can’t be accessed with available tools.

Data Lakes Explored: Benefits, Challenges, and Best Practices

A data lake is a data repository for terabytes or petabytes of raw data stored in its original format. The data can originate from a variety of data sources: IoT and sensor data, a simple file, or a binary large object (BLOB) such as a video, audio, image or multimedia file. Any manipulation of the data — to put it into a data pipeline and make it usable — is done when the data is extracted from the data lake.
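This deferred-manipulation approach is often called schema-on-read: records land in the lake verbatim, and structure is imposed only when they are pulled out. A minimal sketch, with hypothetical sensor records and field names:

```python
import json
from datetime import datetime

# Hypothetical raw records as they might sit in a data lake: stored verbatim,
# with no upfront schema, cleaning, or type enforcement.
raw_records = [
    '{"device": "sensor-7", "temp_c": "21.4", "ts": "2024-01-05T02:58:00"}',
    '{"device": "sensor-9", "temp_c": "19.8"}',  # incomplete record, kept as-is
]

def parse_reading(raw: str) -> dict:
    """Apply structure on read: types and defaults are imposed here,
    at extraction time, not when the data was written to the lake."""
    rec = json.loads(raw)
    return {
        "device": rec["device"],
        "temp_c": float(rec["temp_c"]),
        "ts": datetime.fromisoformat(rec["ts"]) if "ts" in rec else None,
    }

readings = [parse_reading(r) for r in raw_records]
print(readings[0]["temp_c"])  # 21.4
```

The trade-off is flexibility versus effort: the lake accepts anything, but every consumer must handle missing fields and loose types at read time, as the second record shows.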

Send your logs to multiple destinations with Datadog's managed Log Pipelines and Observability Pipelines

As your infrastructure and applications scale, so does the volume of your observability data. Managing a growing suite of tooling while balancing the need to mitigate costs, avoid vendor lock-in, and maintain data quality across an organization is becoming increasingly complex. With a variety of installed agents, log forwarders, and storage tools, the mechanisms you use to collect, transform, and route data should be able to evolve and adjust to your growth and meet the unique needs of your team.
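Datadog's managed Log Pipelines and Observability Pipelines are configured in the Datadog platform itself, but the underlying fan-out pattern is easy to illustrate: one log stream is routed to multiple destinations, with per-destination filtering. The sketch below uses Python's stdlib logging with two in-memory "destinations" as stand-ins.

```python
import io
import logging

logger = logging.getLogger("dual_ship_demo")
logger.setLevel(logging.INFO)
logger.propagate = False  # keep the demo self-contained

archive_stream = io.StringIO()    # stand-in for a low-cost archive destination
analytics_stream = io.StringIO()  # stand-in for an indexed analytics destination

archive = logging.StreamHandler(archive_stream)
archive.setLevel(logging.INFO)        # the archive receives everything

analytics = logging.StreamHandler(analytics_stream)
analytics.setLevel(logging.WARNING)   # analytics receives only WARNING and up

logger.addHandler(archive)
logger.addHandler(analytics)

logger.info("user signed in")        # reaches the archive only
logger.warning("disk usage at 91%")  # reaches both destinations
```

A managed pipeline does the same routing and filtering centrally, outside application code, which is what makes it practical to add or swap destinations without redeploying every service.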

Store and analyze high-volume logs efficiently with Flex Logs

The volume of logs that organizations collect from all over their systems is growing exponentially. Sources range from distributed infrastructure to data pipelines and APIs, and different types of logs demand different treatment. As a result, logs have become increasingly difficult to manage. Organizations must reconcile conflicting needs for long-term retention, rapid access, and cost-effective storage.

Leveraging Git for Cribl Stream Config: A Backup and Tracking Solution

Connecting your Cribl Stream instance to a remote Git repo is a great way to keep a backup of the Cribl config. It also allows easy tracking and viewing of all Cribl Stream config changes, for improved accountability and auditing. Our goal: get Cribl configured with a remote Git repo, and also configured with Git signed commits. Signed commits use cryptography to add a digital signature to each Git commit.
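The Cribl Stream side of the remote-repo connection is configured through Cribl's own settings; the signing itself is standard Git configuration. A minimal sketch, assuming a GPG key already exists (the key id and repo path below are placeholders):

```shell
# Run inside the repo that holds the Cribl Stream config (path is hypothetical).
cd /path/to/cribl-config-repo

# Tell Git which key signs, and sign every commit by default:
git config user.signingkey YOUR_KEY_ID
git config commit.gpgsign true

# Later, verify the signature on the most recent config change:
git log --show-signature -1
```

With `commit.gpgsign` enabled, every config change Cribl commits carries a verifiable signature, so an audit can confirm both what changed and that the commit came from a trusted key.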