Spark Configuration

You can change the Spark configuration by using the %spark.conf interpreter. Each line of a %spark.conf paragraph sets one Spark property in the form key=value, and the paragraph must run before the Spark interpreter starts for the settings to take effect.
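
For example, a minimal sketch of such a paragraph that sets the application name and driver memory; both values are illustrative placeholders, not defaults of this setup:

%spark.conf
spark.app.name=my_app_name
spark.driver.memory=2g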

Resource Allocation

The administrator of the Spark cluster may limit the resources available to each user. A request that exceeds this limit crashes the interpreter.

Dynamic Allocation profile, which lets Spark add and remove executors based on the workload:

%spark.conf
spark.dynamicAllocation.enabled=true
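
On many clusters, dynamic allocation is combined with executor bounds and the external shuffle service, so that shuffle data survives when an executor is removed. A sketch under those assumptions; the min/max values are illustrative:

%spark.conf
spark.dynamicAllocation.enabled=true
spark.dynamicAllocation.minExecutors=1
spark.dynamicAllocation.maxExecutors=8
spark.shuffle.service.enabled=true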

Static Allocation profile, which requests a fixed number of executors up front:

%spark.conf
spark.executor.instances=8
spark.executor.cores=1
spark.executor.memory=1g
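
With the static profile above, the application asks for a fixed total of 8 executors × 1 core × 1 GB = 8 cores and 8 GB of executor memory, so check that this total stays within the per-user limit set by the administrator.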