RE: How to configure Kafka to process events from the last committed offset for a Spark Streaming application created in Diyotta?
- On the Kafka source in the data stream, we set one of the two options below for the starting offsets.
Latest: Process only new data that arrives after the query has started. The consumer starts reading from the newest records (records written after the query began running).
Earliest: Start reading at the beginning of the stream. The consumer reads all the data still retained in each partition, starting from the earliest available offset.
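In Spark Structured Streaming, these two choices correspond to the `startingOffsets` option on the Kafka source. A minimal sketch of how the option map might be built is below; the broker address and topic name are hypothetical placeholders, and the helper function is illustrative, not a Diyotta or Spark API. Note that `startingOffsets` only applies when a query starts fresh: once the query has a checkpoint, Spark resumes from the offsets recorded there.

```python
def kafka_source_options(starting_offsets="latest"):
    """Build the option map for spark.readStream.format("kafka").

    "startingOffsets" is honored only for a brand-new query with no
    checkpoint; on restart, Spark resumes from its checkpointed offsets.
    """
    if starting_offsets not in ("latest", "earliest"):
        raise ValueError("startingOffsets must be 'latest' or 'earliest'")
    return {
        "kafka.bootstrap.servers": "localhost:9092",  # assumed broker
        "subscribe": "events",                        # assumed topic name
        "startingOffsets": starting_offsets,
    }

# Usage with a SparkSession (requires pyspark; shown for illustration only):
# df = (spark.readStream.format("kafka")
#            .options(**kafka_source_options("earliest"))
#            .load())
```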