How to configure Kafka to process events from the last committed offset for a Spark Streaming application created in Diyotta?

Advanced User Asked on July 15, 2019 in Spark Process Platform.
1 Answer(s)
On the Kafka source in the data stream, we set one of the two options below for the starting offsets:
  • Latest
  • Earliest

Latest: Process only new data that arrives after the query has started. The consumer starts reading from the newest records, that is, records written after the consumer started running.

Earliest: Start reading at the beginning of the stream. The consumer reads all the data in each partition, starting from the earliest available offset.
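These two values correspond to the startingOffsets option on the Kafka source in Spark Structured Streaming. Below is a minimal Scala sketch of a streaming read that sets this option; it is a generic Spark example rather than the exact code Diyotta generates, and the broker address, topic name, and checkpoint path are hypothetical placeholders.

```scala
import org.apache.spark.sql.SparkSession

object KafkaStartingOffsetsDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaStartingOffsetsDemo")
      .getOrCreate()

    // "startingOffsets" controls where a brand-new query begins reading:
    //   "latest"   -> only records written after the query starts
    //   "earliest" -> every record still retained in the topic
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092") // hypothetical broker
      .option("subscribe", "events")                     // hypothetical topic
      .option("startingOffsets", "latest")               // or "earliest"
      .load()

    val query = events
      .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
      .writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/events") // hypothetical path
      .start()

    query.awaitTermination()
  }
}
```

Note that startingOffsets only takes effect the first time a query runs; once a checkpointLocation exists, Spark resumes from the offsets recorded in the checkpoint, which is what gives processing from the last committed offset on restart.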

Expert Answered on July 15, 2019.