Does Diyotta start a different Spark application for each Spark data flow, or does it create a long-lived Spark application?

Beginner Asked on July 17, 2019 in Architecture.
1 Answer

Diyotta leaves this choice to the user; either approach is possible. Working with multiple different Spark versions is also supported. The Spark data point provides two options: “Dedicated Spark Application” and “Separate Spark Application for each Data Flow”.

  • Choosing only the first option spins off a separate Spark agent service per data point, regardless of whether another data point uses the same Spark version. All data flows using this data point are sent to this service.
  • Leaving both options unchecked creates only one Spark agent service per Spark version, even if multiple different data points use that version.
  • Choosing both options creates one Spark agent service per data flow execution that uses this data point.
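The mapping above can be sketched as a small function. This is an illustrative model only, not Diyotta's actual API; the function and parameter names are invented, and the behavior when only the second option is checked is not described in the answer, so the sketch treats it like the unchecked case (an assumption).

```python
# Illustrative sketch (hypothetical names, not Diyotta's API): which Spark
# agent service a data flow execution is routed to, given the two options.
def agent_service_key(data_point, spark_version, flow_id,
                      dedicated, separate_per_flow):
    if dedicated and separate_per_flow:
        # Both options checked: one service per data flow execution.
        return ("flow", data_point, flow_id)
    if dedicated:
        # Only the first option: one service per data point,
        # even if another data point uses the same Spark version.
        return ("datapoint", data_point)
    # Both unchecked: one shared service per Spark version.
    # (Assumption: the second-option-only case falls here too.)
    return ("version", spark_version)


# With both options unchecked, two data points on the same Spark version
# share one agent service:
key_a = agent_service_key("dp1", "2.4", "flow1", False, False)
key_b = agent_service_key("dp2", "2.4", "flow2", False, False)
print(key_a == key_b)
```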

 


Expert Answered on July 17, 2019.
