RE: How to configure number of executors, executor cores and executor memory settings for a Spark application created by Diyotta?
These executor configurations can be set from two different places in Diyotta Studio:
- Spark Data Point
- Spark Data Flow
On the Spark data point, the configuration is available under “Runtime Properties”.
The Spark application created for this data point will run with the configuration set in this window, and all data flows that use the data point share that same configuration. This behavior can be overridden by selecting “Run as separate Spark application” in the data flow properties and then setting the configuration in the data flow’s own “Runtime Properties”. With this option, a separate Spark application is created per data flow, each with its own configuration.
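For context, these settings correspond to standard Spark executor properties. A rough sketch of the equivalent spark-submit invocation is below (the application name and jar are placeholders, and the exact property names Diyotta passes through are an assumption based on standard Spark configuration):

```shell
# Illustrative only: standard Spark flags for executor sizing.
# Diyotta's "Runtime Properties" are assumed to map to these settings.
spark-submit \
  --num-executors 4 \       # number of executors (spark.executor.instances)
  --executor-cores 2 \      # cores per executor (spark.executor.cores)
  --executor-memory 4g \    # heap per executor (spark.executor.memory)
  --class com.example.Main \  # placeholder class name
  app.jar                     # placeholder application jar
```

The same values can also be supplied as `--conf spark.executor.instances=4`, etc., which is closer to how key/value runtime properties are typically forwarded.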