RE: How to configure Diyotta’s spark execution to use the different scheduler mode Spark offers?
“spark.scheduler.mode” is available in Spark Data Point → Runtime Properties → Run Property Type → Scheduling.
The same property is also available at the Data Flow level. If it is set at the Data Point, it applies to all Data Flows that use that Data Point; if it is needed only for a specific Data Flow, set it on that Data Flow instead.
- Data Point
In the Spark Data Point, click “Runtime Properties” and select “Run Property Type” as “Scheduling”. Click the “+” button, find the property “spark.scheduler.mode”, select its check box, and click Ok. Then set the value to FAIR or FIFO.
- Data Flow
In the Spark Data Flow, click “Runtime Properties” and select “Run Property Type” as “Scheduling”. Click the “+” button, find the property “spark.scheduler.mode”, select its check box, and click Ok. Then set the value to FAIR or FIFO.
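
For reference, this runtime property maps to Spark’s own “spark.scheduler.mode” setting. A minimal sketch (plain PySpark, not Diyotta-specific; the app name is just an example) of setting the same value directly on a Spark session:

```python
from pyspark.sql import SparkSession

# Set the scheduler mode on the session, equivalent to the Diyotta
# runtime property described above. Valid values are FAIR and FIFO
# (FIFO is Spark's default).
spark = (
    SparkSession.builder
    .appName("scheduler-mode-example")        # hypothetical app name
    .config("spark.scheduler.mode", "FAIR")
    .getOrCreate()
)

# Confirm the value that was applied.
print(spark.conf.get("spark.scheduler.mode"))
```

With FIFO, jobs run in submission order; with FAIR, jobs submitted concurrently share executor resources instead of waiting for earlier jobs to finish.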