The latest News and Information on Log Management, Log Analytics and related technologies.
Many organizations are adopting centralized logging tools so that they have one place for all of their data. This is generally easier than maintaining separate tools across teams for log storage and analysis. But centralized logging introduces new challenges, such as how to segment logs so they reach the teams or developers to whom they are most relevant, and how to manage log volume.
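One common approach to the segmentation challenge is to tag each log record with an owning team and route records accordingly. The sketch below illustrates the idea; the function name and the "team" field are hypothetical, not taken from any particular logging tool.

```python
from collections import defaultdict

def route_by_team(logs):
    """Group log records under the team that owns the emitting service.

    Records without an owner tag fall into an "unassigned" bucket,
    which also makes ownership gaps visible.
    """
    buckets = defaultdict(list)
    for record in logs:
        buckets[record.get("team", "unassigned")].append(record)
    return buckets

logs = [
    {"service": "checkout", "team": "payments", "msg": "timeout"},
    {"service": "search", "team": "discovery", "msg": "slow query"},
    {"service": "cron", "msg": "job done"},  # no owner tagged
]
by_team = route_by_team(logs)
```

In practice the same idea is usually expressed as index or pipeline routing rules in the logging backend rather than application code, but the principle is identical: an ownership tag on every record drives the segmentation.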
I would like to share a recent case study of our battle with a serious performance issue. The solution turned out to be a small change in code, but one with a huge impact on every HTTP endpoint in our platform.
Trends in the infrastructure and software space have changed the way we build and run software. As a result, we have started treating our infrastructure as code, which has helped us lower costs and get our products to market more quickly. These new architectures also let us test our software faster in production-like environments, and generally deliver more stable and reproducible deployments.
Most of the applications we see for the ELK stack are from businesses that want to improve their customers' experience: to return relevant search results and to create Kibana dashboards that allow them to analyse data and give the customers what they want. But there are some cases where the customer is always wrong, and where the last thing you want to do is give a site visitor what they want. Welcome to the world of forensics, compliance and fraud detection.
Event Hubs is a big data streaming PaaS capability provided by Azure. Event Hubs can process data and telemetry produced from your Azure environment, and it provides a scalable method to get your valuable Azure data into Splunk! Splunk add-ons like the Splunk Add-on for Microsoft Cloud Services and the Microsoft Azure Add-on for Splunk provide the ability to connect to and ingest all kinds of data sources from your Azure environment.
We’re excited to share that we’ve revamped our Shipping Tokens feature! If you’re a Logz.io user, you’re familiar with the key role tokens play in shipping and protecting your data. As a form of virtual identification, tokens help us properly attribute data to the right account. They are required in a variety of use cases such as log shipping, API access, and read access, and they are also mandatory for compliance.
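The attribution role of a shipping token can be pictured as follows: every batch a shipper sends carries the account's token so the backend knows which account the data belongs to. This is only a minimal sketch; the function and field names are hypothetical and do not reflect the actual Logz.io API.

```python
import json

def build_batch(token, records):
    """Wrap log records in a payload that carries the shipping token.

    The token is what lets the receiving side attribute the records
    to the right account before indexing them.
    """
    return json.dumps({"token": token, "events": records})

# A hypothetical token and a single log event.
batch = build_batch("abcd-1234", [{"level": "info", "msg": "user login"}])
```

Real shippers typically pass the token in a URL parameter or header rather than the payload body, but the principle is the same: no token, no attribution.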
Logs are one of the most valuable assets when it comes to IT system management and monitoring. As they record every action that took place on your network, logs provide the insight you need to spot issues that might impact performance, compliance, and security. That’s why log management should be part of any monitoring infrastructure.