Class DynamicKafkaSource<T>
java.lang.Object
org.apache.flink.connector.kafka.dynamic.source.DynamicKafkaSource<T>
- Type Parameters:
T - Record type
- All Implemented Interfaces:
Serializable, org.apache.flink.api.connector.source.Source<T, DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState>, org.apache.flink.api.connector.source.SourceReaderFactory<T, DynamicKafkaSourceSplit>, org.apache.flink.api.java.typeutils.ResultTypeQueryable<T>
@Experimental
public class DynamicKafkaSource<T>
extends Object
implements org.apache.flink.api.connector.source.Source<T,DynamicKafkaSourceSplit,DynamicKafkaSourceEnumState>, org.apache.flink.api.java.typeutils.ResultTypeQueryable<T>
Factory class for the DynamicKafkaSource components. FLIP-246: DynamicKafkaSource
The key difference between this source and KafkaSource is that it lets users read from streams (topics that belong to one or more clusters) dynamically, without a job restart. With KafkaSource, changing the set of topics or clusters requires deleting the job and resubmitting it with the new configuration.
This example shows how to configure a DynamicKafkaSource that emits Integer records:
DynamicKafkaSource<Integer> dynamicKafkaSource =
DynamicKafkaSource.<Integer>builder()
.setStreamIds(Collections.singleton("MY_STREAM_ID"))
// custom metadata service that resolves `MY_STREAM_ID` to the associated clusters and topics
.setKafkaMetadataService(kafkaMetadataService)
.setDeserializer(
KafkaRecordDeserializationSchema.valueOnly(
IntegerDeserializer.class))
.setStartingOffsets(OffsetsInitializer.earliest())
// common properties for all Kafka clusters
.setProperties(properties)
.build();
See more configuration options in DynamicKafkaSourceBuilder and DynamicKafkaSourceOptions.
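For context, the configured source plugs into a DataStream job like any other Flink source. The sketch below assumes the `dynamicKafkaSource` built above; `StreamExecutionEnvironment`, `WatermarkStrategy`, and `fromSource` are standard Flink APIs, and the source name "dynamic-kafka-source" is illustrative:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Wire the dynamic source into the job; splits are discovered and
// re-resolved through the KafkaMetadataService at runtime, so topic or
// cluster changes do not require resubmitting the job.
DataStream<Integer> records =
        env.fromSource(
                dynamicKafkaSource,
                WatermarkStrategy.noWatermarks(),
                "dynamic-kafka-source");
```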
Method Summary
- static <T> DynamicKafkaSourceBuilder<T> builder()
Get a builder for this source.
- org.apache.flink.api.connector.source.SplitEnumerator<DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState> createEnumerator(org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext)
Create the DynamicKafkaSourceEnumerator.
- org.apache.flink.api.connector.source.SourceReader<T, DynamicKafkaSourceSplit> createReader(org.apache.flink.api.connector.source.SourceReaderContext readerContext)
Create the DynamicKafkaSourceReader.
- org.apache.flink.api.connector.source.Boundedness getBoundedness()
Get the Boundedness.
- org.apache.flink.core.io.SimpleVersionedSerializer<DynamicKafkaSourceEnumState> getEnumeratorCheckpointSerializer()
Get the DynamicKafkaSourceEnumStateSerializer.
- org.apache.flink.api.common.typeinfo.TypeInformation<T> getProducedType()
Get the TypeInformation of the source.
- org.apache.flink.core.io.SimpleVersionedSerializer<DynamicKafkaSourceSplit> getSplitSerializer()
Get the DynamicKafkaSourceSplitSerializer.
- org.apache.flink.api.connector.source.SplitEnumerator<DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState> restoreEnumerator(org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext, DynamicKafkaSourceEnumState checkpoint)
Restore the DynamicKafkaSourceEnumerator.
-
Method Details
-
builder
public static <T> DynamicKafkaSourceBuilder<T> builder()
Get a builder for this source.
- Returns:
- a DynamicKafkaSourceBuilder.
-
getBoundedness
public org.apache.flink.api.connector.source.Boundedness getBoundedness()
Get the Boundedness.
- Specified by:
getBoundedness in interface org.apache.flink.api.connector.source.Source<T, DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState>
- Returns:
- the Boundedness.
-
createReader
@Internal
public org.apache.flink.api.connector.source.SourceReader<T, DynamicKafkaSourceSplit> createReader(org.apache.flink.api.connector.source.SourceReaderContext readerContext)
Create the DynamicKafkaSourceReader.
- Specified by:
createReader in interface org.apache.flink.api.connector.source.SourceReaderFactory<T, DynamicKafkaSourceSplit>
- Parameters:
readerContext - The context for the source reader.
- Returns:
- the DynamicKafkaSourceReader.
-
createEnumerator
@Internal
public org.apache.flink.api.connector.source.SplitEnumerator<DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState> createEnumerator(org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext)
Create the DynamicKafkaSourceEnumerator.
- Specified by:
createEnumerator in interface org.apache.flink.api.connector.source.Source<T, DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState>
- Parameters:
enumContext - The context for the split enumerator.
- Returns:
- the DynamicKafkaSourceEnumerator.
-
restoreEnumerator
@Internal
public org.apache.flink.api.connector.source.SplitEnumerator<DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState> restoreEnumerator(org.apache.flink.api.connector.source.SplitEnumeratorContext<DynamicKafkaSourceSplit> enumContext, DynamicKafkaSourceEnumState checkpoint)
Restore the DynamicKafkaSourceEnumerator.
- Specified by:
restoreEnumerator in interface org.apache.flink.api.connector.source.Source<T, DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState>
- Parameters:
enumContext - The context for the restored split enumerator.
checkpoint - The checkpoint to restore the SplitEnumerator from.
- Returns:
- the DynamicKafkaSourceEnumerator.
-
getSplitSerializer
@Internal
public org.apache.flink.core.io.SimpleVersionedSerializer<DynamicKafkaSourceSplit> getSplitSerializer()
Get the DynamicKafkaSourceSplitSerializer.
- Specified by:
getSplitSerializer in interface org.apache.flink.api.connector.source.Source<T, DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState>
- Returns:
- the DynamicKafkaSourceSplitSerializer.
-
getEnumeratorCheckpointSerializer
@Internal
public org.apache.flink.core.io.SimpleVersionedSerializer<DynamicKafkaSourceEnumState> getEnumeratorCheckpointSerializer()
Get the DynamicKafkaSourceEnumStateSerializer.
- Specified by:
getEnumeratorCheckpointSerializer in interface org.apache.flink.api.connector.source.Source<T, DynamicKafkaSourceSplit, DynamicKafkaSourceEnumState>
- Returns:
- the DynamicKafkaSourceEnumStateSerializer.
-
getProducedType
public org.apache.flink.api.common.typeinfo.TypeInformation<T> getProducedType()
Get the TypeInformation of the source.
- Specified by:
getProducedType in interface org.apache.flink.api.java.typeutils.ResultTypeQueryable<T>
- Returns:
- the TypeInformation.
-
getKafkaStreamSubscriber
Get the KafkaStreamSubscriber.
-