
June 2022

Optimize Resources Through Apache Spark Tuning (Part Two)

In part one of this two-part blog post, we began our deep dive into Apache Spark tuning to optimize resources. We looked at executor and partition sizing, particularly how to choose the number of partitions and the size of each executor. After establishing some principles of optimization, we ended with an important question: is it really practical to optimize every application? As our recent State of the Market report helped reveal, the answer is two-sided. The good news?
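For readers who want a concrete anchor before diving in, here is a minimal sketch (in Scala) of where the executor-size and partition-count settings discussed in part one actually live in a Spark application. The specific values and the application name are illustrative assumptions, not recommendations from the post.

```scala
import org.apache.spark.sql.SparkSession

object TuningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("spark-tuning-sketch")
      // Executor sizing: memory, cores, and instance count per the cluster's capacity.
      // These numbers are placeholders for illustration only.
      .config("spark.executor.memory", "8g")
      .config("spark.executor.cores", "4")
      .config("spark.executor.instances", "10")
      // Default partition count used for shuffles (also a placeholder value).
      .config("spark.sql.shuffle.partitions", "200")
      .getOrCreate()

    // Repartitioning a Dataset is another way to control the partition count directly.
    val df = spark.range(0L, 1000000L).repartition(200)
    println(s"Partitions: ${df.rdd.getNumPartitions}")

    spark.stop()
  }
}
```

The same settings can be passed on the command line via spark-submit's --conf flags; which route is preferable depends on whether the values vary per deployment.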

Spark Tuning Helps You Optimize Your Resources (Part One)

As our recent survey showed, Apache Spark is poised to remain the dominant large-scale big data processing platform. It is therefore imperative that Spark users learn and master Spark tuning if they want to get the most out of their Spark environments. But what is Spark tuning? How is it done? Read on to learn more.