Uses of Class
org.apache.flink.connector.kafka.dynamic.source.enumerator.DynamicKafkaSourceEnumState
Packages that use DynamicKafkaSourceEnumState

- org.apache.flink.connector.kafka.dynamic.source
- org.apache.flink.connector.kafka.dynamic.source.enumerator
Uses of DynamicKafkaSourceEnumState in org.apache.flink.connector.kafka.dynamic.source
Methods in org.apache.flink.connector.kafka.dynamic.source that return types with arguments of type DynamicKafkaSourceEnumState

- org.apache.flink.api.connector.source.SplitEnumerator<DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState> DynamicKafkaSource.createEnumerator(org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext)
  Creates the DynamicKafkaSourceEnumerator.
- org.apache.flink.core.io.SimpleVersionedSerializer<DynamicKafkaSourceEnumState> DynamicKafkaSource.getEnumeratorCheckpointSerializer()
- org.apache.flink.api.connector.source.SplitEnumerator<DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState> DynamicKafkaSource.restoreEnumerator(org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext, DynamicKafkaSourceEnumState checkpoint)
  Restores the DynamicKafkaSourceEnumerator.

Methods in org.apache.flink.connector.kafka.dynamic.source with parameters of type DynamicKafkaSourceEnumState

- org.apache.flink.api.connector.source.SplitEnumerator<DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState> DynamicKafkaSource.restoreEnumerator(org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext, DynamicKafkaSourceEnumState checkpoint)
  Restores the DynamicKafkaSourceEnumerator.
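The createEnumerator/restoreEnumerator pair above follows the standard Flink Source lifecycle: on a fresh start the framework creates a new enumerator, while on recovery it first deserializes the last checkpointed DynamicKafkaSourceEnumState and passes it to restoreEnumerator. A minimal sketch of that branching, using simplified stand-in types rather than the real Flink API (all names here are illustrative):

```java
import java.util.Optional;

// Illustrative stand-ins for the lifecycle around DynamicKafkaSource.createEnumerator
// and restoreEnumerator. EnumState and startEnumerator are hypothetical simplifications,
// not part of the real connector.
final class EnumeratorLifecycleSketch {
    // Toy enumerator state: just a count of checkpointed Kafka streams.
    record EnumState(int checkpointedStreamCount) {}

    static String startEnumerator(Optional<EnumState> checkpoint) {
        // No checkpoint -> the createEnumerator-style path (fresh start).
        // Checkpoint present -> the restoreEnumerator-style path (recovery).
        return checkpoint
                .map(s -> "restored with " + s.checkpointedStreamCount() + " streams")
                .orElse("created fresh");
    }

    public static void main(String[] args) {
        System.out.println(startEnumerator(Optional.empty()));
        System.out.println(startEnumerator(Optional.of(new EnumState(2))));
    }
}
```

In the real connector the framework drives this choice; user code only supplies the source, and the checkpoint bytes are decoded by the serializer returned from getEnumeratorCheckpointSerializer().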
Uses of DynamicKafkaSourceEnumState in org.apache.flink.connector.kafka.dynamic.source.enumerator
Methods in org.apache.flink.connector.kafka.dynamic.source.enumerator that return DynamicKafkaSourceEnumState

- DynamicKafkaSourceEnumState DynamicKafkaSourceEnumStateSerializer.deserialize(int version, byte[] serialized)
- DynamicKafkaSourceEnumState DynamicKafkaSourceEnumerator.snapshotState(long checkpointId)
  Besides checkpointing, this method is used in the restart sequence to retain the relevant assigned splits so that no reader receives a duplicate split assignment.

Methods in org.apache.flink.connector.kafka.dynamic.source.enumerator with parameters of type DynamicKafkaSourceEnumState

- byte[] DynamicKafkaSourceEnumStateSerializer.serialize(DynamicKafkaSourceEnumState state)

Constructors in org.apache.flink.connector.kafka.dynamic.source.enumerator with parameters of type DynamicKafkaSourceEnumState

- DynamicKafkaSourceEnumerator(KafkaStreamSubscriber kafkaStreamSubscriber, KafkaMetadataService kafkaMetadataService, org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext, OffsetsInitializer startingOffsetsInitializer, OffsetsInitializer stoppingOffsetInitializer, Properties properties, org.apache.flink.api.connector.source.Boundedness boundedness, DynamicKafkaSourceEnumState dynamicKafkaSourceEnumState)
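The serialize/deserialize pair on DynamicKafkaSourceEnumStateSerializer follows Flink's SimpleVersionedSerializer contract: state is written as raw bytes tagged with a version number, so checkpoints taken by an older serializer version can still be read after an upgrade. A self-contained sketch of that round-trip pattern, using a toy state type instead of the real DynamicKafkaSourceEnumState (the State record and its fields are hypothetical):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.LinkedHashSet;
import java.util.Set;

// Simplified stand-in for the real serializer/state pair. The real state tracks
// Kafka stream and split metadata; here it is just a set of stream ids.
final class EnumStateSketch {
    record State(Set<String> streamIds) {}

    // Version tag, mirroring SimpleVersionedSerializer.getVersion().
    static final int VERSION = 1;

    static byte[] serialize(State state) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(bytes)) {
            out.writeInt(state.streamIds().size());
            for (String id : state.streamIds()) {
                out.writeUTF(id);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bytes.toByteArray();
    }

    static State deserialize(int version, byte[] serialized) {
        if (version != VERSION) {
            // A real serializer would branch here to read older formats.
            throw new IllegalArgumentException("Unknown version: " + version);
        }
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(serialized))) {
            int n = in.readInt();
            Set<String> ids = new LinkedHashSet<>();
            for (int i = 0; i < n; i++) {
                ids.add(in.readUTF());
            }
            return new State(ids);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        State original = new State(Set.of("orders", "payments"));
        State restored = deserialize(VERSION, serialize(original));
        System.out.println(restored.streamIds().equals(original.streamIds()));
    }
}
```

The version check on the read path is the point of the contract: rejecting (or specially handling) unknown versions is what lets the enumerator restore state from checkpoints written by earlier connector releases.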