
Analytics

The Go client for Elasticsearch: Working with data

In our previous two blogs, we provided an overview of the architecture and design of the Elasticsearch Go client and explored how to configure and customize the client. In doing so, we pointed to a number of examples available in the GitHub repository. The goal of these examples is to provide executable "scripts" for common operations, so it's a good idea to look there whenever you're trying to solve a specific problem with the client.
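To give a flavor of what those example "scripts" look like, here is a minimal sketch of indexing and then searching a document with the go-elasticsearch client. It is not taken from the post itself; the index name, document body, and query below are placeholders.

```go
package main

import (
	"context"
	"log"
	"strings"

	"github.com/elastic/go-elasticsearch/v8"
)

func main() {
	// Create a client with default settings (honors ELASTICSEARCH_URL if set).
	es, err := elasticsearch.NewDefaultClient()
	if err != nil {
		log.Fatalf("error creating the client: %s", err)
	}

	// Index a single document into a placeholder index.
	res, err := es.Index(
		"my-index",
		strings.NewReader(`{"title": "Working with data"}`),
		es.Index.WithDocumentID("1"),
		es.Index.WithRefresh("true"),
		es.Index.WithContext(context.Background()),
	)
	if err != nil {
		log.Fatalf("error indexing document: %s", err)
	}
	defer res.Body.Close()

	// Run a simple match query against the same index.
	res, err = es.Search(
		es.Search.WithIndex("my-index"),
		es.Search.WithBody(strings.NewReader(`{"query": {"match": {"title": "data"}}}`)),
	)
	if err != nil {
		log.Fatalf("error searching: %s", err)
	}
	defer res.Body.Close()
	log.Println(res.Status())
}
```

The examples in the repository cover the same ground in more depth, including bulk indexing and response parsing.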

9 Key Areas to Cover in Your Anomaly Detection RFP

Evaluating a new, unknown technology is a complicated task. Although you can articulate the goals you’re trying to achieve, you’re probably faced with multiple solutions that approach the problem in different ways and highlight varying features. To cut through the clutter, you need to figure out what questions to ask in order to evaluate which technology has the optimal capabilities to get the job done in your unique setting.

How Correlation Analysis Boosts the Efficacy of eCommerce Promotions

In the first part of this blog series, we discussed how correlation analysis can be leveraged to reduce time to detection (TTD) and time to remediation (TTR) by guiding mitigation efforts early. Correlation analysis also helps reduce alert fatigue by filtering out irrelevant anomalies and grouping multiple anomalies stemming from a single incident into one alert. In this part, we shed light on how correlation analysis applies to eCommerce, specifically to promotions.

Enriching data with GeoIPs from internal, private IP addresses

For public IPs, it is possible to build lookup tables that map specific IP ranges to the cities they belong to. A large portion of traffic, however, comes from somewhere else: company-private networks using addresses in the 10.0.0.0/8, 172.16.0.0/12, or 192.168.0.0/16 ranges, which are reused in every country in the world. On their own, these IP addresses carry no real geographic information.
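Since public GeoIP databases cannot resolve these ranges, a common workaround is to maintain your own mapping of internal subnets to physical sites and consult it before (or instead of) a GeoIP lookup. A minimal sketch in Go; the subnet-to-site table below is purely hypothetical:

```go
package main

import (
	"fmt"
	"net"
)

// Hypothetical mapping of internal subnets to physical locations.
var internalSites = map[string]string{
	"10.1.0.0/16":      "Berlin office",
	"172.16.5.0/24":    "Austin data center",
	"192.168.10.0/24":  "Tel Aviv lab",
}

// lookupSite returns the site associated with an internal address, if any.
func lookupSite(addr string) (string, bool) {
	ip := net.ParseIP(addr)
	if ip == nil {
		return "", false
	}
	for cidr, site := range internalSites {
		_, network, err := net.ParseCIDR(cidr)
		if err != nil {
			continue
		}
		if network.Contains(ip) {
			return site, true
		}
	}
	return "", false
}

func main() {
	if site, ok := lookupSite("10.1.42.7"); ok {
		fmt.Println("internal location:", site)
	}
}
```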

Correlation Analysis: A Natural Next Step for Anomaly Detection

Over the last decade, data collection has become a commodity, and every area of industry has seen a tremendous deluge of data as a result. Recent research captures this trend, pointing to the growing volume of raw data and to the expansion of the market segments fueled by it.

TL;DR InfluxDB Tech Tips - How to Extract Values, Visualize Scalars, and Perform Custom Aggregations with Flux and InfluxDB

In this post, we learn how to use the reduce(), findColumn(), and findRecord() Flux functions to perform custom aggregations with InfluxDB. This TL;DR assumes that you have either registered for an InfluxDB Cloud account – registering for a free account is the easiest way to get started with InfluxDB – or installed InfluxDB 2.0 OSS. In order to easily demonstrate how these functions work, let’s use the array.from() function to build an ad hoc table to use in the query.
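For readers who prefer to drive such queries from code, here is a minimal sketch that sends an ad hoc array.from() table through a custom reduce() aggregation using the influxdb-client-go v2 library. The URL, token, and org values are placeholders, and the Flux below follows the pattern described in the post rather than reproducing its exact queries.

```go
package main

import (
	"context"
	"fmt"
	"log"

	influxdb2 "github.com/influxdata/influxdb-client-go/v2"
)

func main() {
	// Placeholder connection details; replace with your Cloud or OSS values.
	client := influxdb2.NewClient("http://localhost:8086", "my-token")
	defer client.Close()
	queryAPI := client.QueryAPI("my-org")

	// Build an ad hoc table with array.from() and sum it with a custom reduce().
	flux := `
import "array"

array.from(rows: [{_value: 1.0}, {_value: 2.0}, {_value: 3.0}])
  |> reduce(
      fn: (r, accumulator) => ({sum: accumulator.sum + r._value}),
      identity: {sum: 0.0}
  )`

	result, err := queryAPI.Query(context.Background(), flux)
	if err != nil {
		log.Fatal(err)
	}
	for result.Next() {
		fmt.Println("sum:", result.Record().ValueByKey("sum"))
	}
	if result.Err() != nil {
		log.Fatal(result.Err())
	}
}
```

The same Flux can, of course, be pasted directly into the InfluxDB UI's Script Editor; the client is only needed when you want the result inside an application.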

Dashboards Beta v0.7: Export Dashboard to PNG/PDF and Self-Service Install for Splunk Cloud

If you’re new to the Dashboards Beta app on Splunkbase and you’re trying to get started with building beautiful dashboards, this "Dashboards Beta" blog series is a great place to start. The Splunk Dashboards app (beta) brings a new dashboard framework, intended to combine the best of Simple XML and Glass Tables, and provide a friendlier experience for creating and editing dashboards.

Webinar Highlights: How Texas Instruments Uses InfluxDB

It’s back to school season, and oftentimes, that means people are purchasing TI-84 calculators for their kids. But did you know that Texas Instruments makes so much more than calculators? 😁 Michael Hinkle, a Probe Engineering and Manufacturing Supervisor at Texas Instruments, recently presented on “How Texas Instruments Uses InfluxDB to Upload Product Standards and to Improve Efficiencies”.

Elastic Workplace Search and Gmail: Unified search across all your content

As work from home has ballooned in 2020, virtual methods for communicating with colleagues have become more critical than ever. Same goes for all the useful productivity and collaboration tools at our disposal. The emerging downside is the difficulty of finding needed information among so many tools. Compounding the problem is the tendency for info to get siloed off by department.