
Latest Videos

Myth #5 of Apache Spark Optimization | Spark Dynamic Allocation

Spark Dynamic Allocation is a useful feature born of the Spark community’s focus on continuous innovation and improvement. While Apache Spark users may believe Spark Dynamic Allocation eliminates resource waste, it does not address waste inside the applications themselves. Watch this video to understand the benefits of Spark Dynamic Allocation, where it falls short, and the solution gaps that remain with this component of Apache Spark.
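
For context, here is a rough PySpark sketch of what enabling Spark Dynamic Allocation typically looks like. The configuration keys are standard Spark properties; the application name, executor bounds, and timeout are illustrative assumptions, not recommendations.

from pyspark.sql import SparkSession

# Minimal sketch: enable Spark Dynamic Allocation so Spark adds and removes
# executors based on load. The bounds and timeout below are illustrative only.
spark = (
    SparkSession.builder
    .appName("dynamic-allocation-demo")  # hypothetical application name
    .config("spark.dynamicAllocation.enabled", "true")                  # let Spark scale executor count
    .config("spark.dynamicAllocation.minExecutors", "2")                # floor on executor count
    .config("spark.dynamicAllocation.maxExecutors", "20")               # ceiling on executor count
    .config("spark.dynamicAllocation.executorIdleTimeout", "60s")       # release executors idle this long
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")  # track shuffle data without an external shuffle service
    .getOrCreate()
)

Even with these settings in place, dynamic allocation only adjusts how many executors run; it does not reclaim memory or CPU over-provisioned inside each executor, which is the in-application waste the video refers to.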

Myth #4 of Apache Spark Optimization | Manual Tuning

Manual tuning can remediate some waste, but it doesn’t scale, and it doesn’t address in-application waste. Watch this conversation to learn why manually tuning your Apache Spark applications is not the best way to optimize for both price and performance. Visit Pepperdata's page for information on real-time, autonomous optimization for Apache Spark applications on Amazon EMR and EKS.
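
As a hedged sketch of the manual-tuning approach the video argues against, the PySpark snippet below fixes executor counts, sizes, and shuffle parallelism by hand. All values are illustrative guesses and the application name is hypothetical.

from pyspark.sql import SparkSession

# Minimal sketch of manual tuning: resources are fixed up front by a human,
# so any mismatch with the job's real needs becomes waste or slowdown.
spark = (
    SparkSession.builder
    .appName("manually-tuned-job")                  # hypothetical application name
    .config("spark.executor.instances", "10")       # fixed executor count
    .config("spark.executor.cores", "4")            # cores per executor
    .config("spark.executor.memory", "8g")          # heap per executor
    .config("spark.sql.shuffle.partitions", "400")  # shuffle parallelism
    .getOrCreate()
)

Every one of these numbers has to be revisited as data volumes and code change, across every application, which is why this approach does not scale.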

Cluster Autoscaling | The Second Myth of Apache Spark Optimization

Cluster Autoscaling helps improve cloud resource optimization, but it doesn’t eliminate application waste. Watch the video to learn why Cluster Autoscaling alone can't fix application inefficiencies, and how Pepperdata Capacity Optimizer complements it to ensure resources are actually used efficiently.

Observability and Monitoring | The First Myth of Apache Spark Optimization

It's valuable to know where waste is occurring in your applications and infrastructure, and to have recommendations for reducing it, but finding waste isn't the same as fixing it. Check out this conversation between Shashi Raina, AWS Partner Solution Architect, and Kirk Lewis, Pepperdata Senior Solution Architect, as they dispel the first myth of Apache Spark optimization: observability and monitoring.

Did You Know These 5 Myths of Apache Spark Optimization?

Developers tasked with optimizing their Apache Spark workloads can draw on a number of techniques and tricks, but most of them fix only part of the price and performance problem. Watch this conversation between AWS Senior Partner Solution Architect Shashi Raina and Pepperdata Senior Solution Architect Kirk Lewis to understand the myths underlying Apache Spark optimization, and how to ultimately fix the problem of wasted cloud resources and inflated costs.

Reduce Cloud Costs and Recover Application Waste | Pepperdata Capacity Optimizer

Pepperdata has saved companies over $200M over the last decade by reclaiming application waste and increasing hardware utilization to reduce cloud costs. Pepperdata Capacity Optimizer eliminates the need for manual tuning, applying recommendations, or changing application code: it delivers autonomous, real-time cost optimization.

How Extole Discovered and Saved 30% By Reducing Application Waste

Not every application has wasted capacity in it, or does it? Watch Ben Smith, VP of Technical Operations at Extole, discuss how he discovered roughly 30% of wasted capacity inside every running app, and how Extole went about reclaiming it.