All Classes

| Class | Description |
| --- | --- |
| AbstractFetcher<T,KPH> | Base class for all fetchers, which implement the connections to Kafka brokers and pull records from Kafka partitions. |
| AbstractPartitionDiscoverer | Base class for all partition discoverers. |
| AbstractPartitionDiscoverer.ClosedException | Thrown if this discoverer was used to discover partitions after it was closed. |
| AbstractPartitionDiscoverer.WakeupException | Signaling exception to indicate that an actual Kafka call was interrupted. |
| BoundedMode | End modes for the Kafka Consumer. |
| ClosableBlockingQueue<E> | A special form of blocking queue with two additions: the queue can be closed atomically when empty. |
| DefaultKafkaSinkContext | Context providing information to assist in constructing a ProducerRecord. |
| ExceptionProxy | A proxy that communicates exceptions between threads. |
| FlinkFixedPartitioner<T> | A partitioner ensuring that each internal Flink partition ends up in one Kafka partition. |
| FlinkKafkaConsumer<T> | Deprecated. |
| FlinkKafkaConsumerBase<T> | Base class of all Flink Kafka Consumer data sources. |
| FlinkKafkaErrorCode | |
| FlinkKafkaException | |
| FlinkKafkaInternalProducer<K,V> | Internal Flink Kafka producer. |
| FlinkKafkaPartitioner<T> | A FlinkKafkaPartitioner wraps the logic of how to partition records across the partitions of multiple Kafka topics. |
| FlinkKafkaProducer<IN> | Deprecated. |
| FlinkKafkaProducer.ContextStateSerializer | |
| FlinkKafkaProducer.ContextStateSerializer.ContextStateSerializerSnapshot | Serializer configuration snapshot for compatibility and format evolution. |
| FlinkKafkaProducer.KafkaTransactionContext | |
| FlinkKafkaProducer.KafkaTransactionState | State for handling transactions. |
| FlinkKafkaProducer.NextTransactionalIdHint | Keeps the information required to deduce the next safe-to-use transactional id. |
| FlinkKafkaProducer.NextTransactionalIdHintSerializer | |
| FlinkKafkaProducer.NextTransactionalIdHintSerializer.NextTransactionalIdHintSerializerSnapshot | Serializer configuration snapshot for compatibility and format evolution. |
| FlinkKafkaProducer.Semantic | Semantics that can be chosen. |
| FlinkKafkaProducer.TransactionStateSerializer | |
| FlinkKafkaProducer.TransactionStateSerializer.TransactionStateSerializerSnapshot | Serializer configuration snapshot for compatibility and format evolution. |
| FlinkKafkaProducer011 | Compatibility class to make migration possible from the 0.11 connector to the universal one. |
| FlinkKafkaProducer011.ContextStateSerializer | |
| FlinkKafkaProducer011.ContextStateSerializer.ContextStateSerializerSnapshot | |
| FlinkKafkaProducer011.NextTransactionalIdHint | |
| FlinkKafkaProducer011.NextTransactionalIdHintSerializer | |
| FlinkKafkaProducer011.NextTransactionalIdHintSerializer.NextTransactionalIdHintSerializerSnapshot | |
| FlinkKafkaProducer011.TransactionStateSerializer | |
| FlinkKafkaProducer011.TransactionStateSerializer.TransactionStateSerializerSnapshot | |
| FlinkKafkaProducerBase<IN> | Flink Sink to produce data into a Kafka topic. |
| FlinkKafkaShuffle | FlinkKafkaShuffle uses Kafka as a message bus to shuffle and persist data at the same time. |
| FlinkKafkaShuffleConsumer<T> | Flink Kafka Shuffle Consumer Function. |
| FlinkKafkaShuffleProducer<IN,KEY> | Flink Kafka Shuffle Producer Function. |
| FlinkKafkaShuffleProducer.KafkaSerializer<IN> | Flink Kafka Shuffle Serializer. |
| Handover | The Handover is a utility to hand over data (a buffer of records) and exceptions from a producer thread to a consumer thread. |
| Handover.ClosedException | |
| Handover.WakeupException | |
| JacksonMapperFactory | Factory for Jackson mappers. |
| JSONKeyValueDeserializationSchema | DeserializationSchema that deserializes a JSON String into an ObjectNode. |
| KafkaCommitCallback | A callback interface that the source operator can implement to trigger custom actions when a commit request completes, which should normally be triggered from a checkpoint-complete event. |
| KafkaConnectorOptions | Options for the Kafka connector. |
| KafkaConnectorOptions.ScanBoundedMode | |
| KafkaConnectorOptions.ScanStartupMode | |
| KafkaConnectorOptions.ValueFieldsStrategy | Strategies to derive the data type of a value format by considering a key format. |
| KafkaConsumerMetricConstants | A collection of constant strings related to Kafka consumer metrics. |
| KafkaConsumerThread<T> | The thread that runs the KafkaConsumer, connecting to the brokers and polling records. |
| KafkaContextAware<T> | An interface for KafkaSerializationSchemas that need information about the context where the Kafka Producer is running, along with information about the available partitions. |
| KafkaDeserializationSchema<T> | The deserialization schema describes how to turn Kafka ConsumerRecords into data types (Java/Scala objects) that are processed by Flink. |
| KafkaDeserializationSchemaWrapper<T> | A simple wrapper for using the DeserializationSchema with the KafkaDeserializationSchema interface. |
| KafkaDynamicSink | A version-agnostic Kafka DynamicTableSink. |
| KafkaDynamicSource | A version-agnostic Kafka ScanTableSource. |
| KafkaDynamicTableFactory | |
| KafkaFetcher<T> | A fetcher that fetches data from Kafka brokers via the Kafka consumer API. |
| KafkaMetricMutableWrapper | Gauge for getting the current value of a Kafka metric. |
| KafkaMetricWrapper | Gauge for getting the current value of a Kafka metric. |
| KafkaPartitionDiscoverer | A partition discoverer that can be used to discover topic and partition metadata from Kafka brokers via the Kafka high-level consumer API. |
| KafkaPartitionSplit | A SourceSplit for a Kafka partition. |
| KafkaPartitionSplitReader | A SplitReader implementation that reads records from Kafka partitions. |
| KafkaPartitionSplitSerializer | |
| KafkaPartitionSplitState | This class extends KafkaPartitionSplit to track a mutable current offset. |
| KafkaRecordDeserializationSchema<T> | An interface for the deserialization of Kafka records. |
| KafkaRecordEmitter<T> | |
| KafkaRecordSerializationSchema<T> | A serialization schema which defines how to convert a value of type T to a ProducerRecord. |
| KafkaRecordSerializationSchema.KafkaSinkContext | Context providing information about the Kafka record's target location. |
| KafkaRecordSerializationSchemaBuilder<IN> | |
| KafkaSerializationSchema<T> | |
| KafkaSerializationSchemaWrapper<T> | |
| KafkaShuffleFetcher<T> | Fetches data from Kafka for Kafka Shuffle. |
| KafkaShuffleFetcher.KafkaShuffleElement | An element in a KafkaShuffle. |
| KafkaShuffleFetcher.KafkaShuffleElementDeserializer<T> | Deserializer for KafkaShuffleElement. |
| KafkaShuffleFetcher.KafkaShuffleRecord<T> | One value with type T in a KafkaShuffle. |
| KafkaShuffleFetcher.KafkaShuffleWatermark | A watermark element in a KafkaShuffle. |
| KafkaSink<IN> | Flink Sink to produce data into a Kafka topic. |
| KafkaSinkBuilder<IN> | |
| KafkaSource<OUT> | The Source implementation of Kafka. |
| KafkaSourceBuilder<OUT> | |
| KafkaSourceEnumerator | The enumerator class for the Kafka source. |
| KafkaSourceEnumerator.PartitionOffsetsRetrieverImpl | The implementation of the offsets retriever, backed by a consumer and an admin client. |
| KafkaSourceEnumState | The state of the Kafka source enumerator. |
| KafkaSourceEnumStateSerializer | The Serializer for the enumerator state of the Kafka source. |
| KafkaSourceFetcherManager | The SplitFetcherManager for the Kafka source. |
| KafkaSourceOptions | Configurations for KafkaSource. |
| KafkaSourceReader<T> | The source reader for Kafka partitions. |
| KafkaSourceReaderMetrics | |
| KafkaSubscriber | The Kafka consumer allows a few different ways to consume from topics, including: subscribing to a collection of topics. |
| KafkaTopicPartition | Flink's description of a partition in a Kafka topic. |
| KafkaTopicPartition.Comparator | |
| KafkaTopicPartitionAssigner | Utility for assigning Kafka partitions to consumer subtasks. |
| KafkaTopicPartitionLeader | Serializable topic partition info with leader Node information. |
| KafkaTopicPartitionState<T,KPH> | The state that the Flink Kafka Consumer holds for each Kafka partition. |
| KafkaTopicPartitionStateSentinel | Magic values used to represent special offset states before partitions are actually read. |
| KafkaTopicPartitionStateWithWatermarkGenerator<T,KPH> | A special version of the per-Kafka-partition state that additionally holds a TimestampAssigner, a WatermarkGenerator, an immediate WatermarkOutput, and a deferred WatermarkOutput for this partition. |
| KafkaTopicsDescriptor | A Kafka Topics Descriptor describes how the consumer subscribes to Kafka topics: either a fixed list of topics or a topic pattern. |
| KeyedDeserializationSchema<T> | Deprecated. |
| KeyedSerializationSchema<T> | Deprecated. |
| KeyedSerializationSchemaWrapper<T> | A simple wrapper for using the SerializationSchema with the KeyedSerializationSchema interface. |
| MetricUtil | Collection of methods to interact with Kafka's client metric system. |
| NoStoppingOffsetsInitializer | |
| OffsetCommitMode | The offset commit mode represents the behaviour of how offsets are externally committed back to Kafka brokers / ZooKeeper. |
| OffsetCommitModes | |
| OffsetsInitializer | |
| OffsetsInitializer.PartitionOffsetsRetriever | An interface that provides the OffsetsInitializer with the information necessary to get the initial offsets of the Kafka partitions. |
| OffsetsInitializerValidator | |
| SinkBufferFlushMode | Sink buffer flush configuration. |
| SourceContextWatermarkOutputAdapter<T> | A WatermarkOutput that forwards calls to a SourceFunction.SourceContext. |
| StartupMode | Startup modes for the Kafka Consumer. |
| TopicSelector<IN> | Selects a topic for the incoming record. |
| TransactionalIdsGenerator | Class responsible for generating transactional ids to use when communicating with Kafka. |
| TypeInformationKeyValueSerializationSchema<K,V> | A serialization and deserialization schema for key-value pairs that uses Flink's serialization stack to transform typed records from and to byte arrays. |
| UpsertKafkaDynamicTableFactory | Upsert-Kafka factory. |
| UpsertKafkaDynamicTableFactory.DecodingFormatWrapper | Wraps the decoding format and exposes the desired changelog mode. |
| UpsertKafkaDynamicTableFactory.EncodingFormatWrapper | Wraps the encoding format and exposes the desired changelog mode. |
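The contract of FlinkFixedPartitioner<T> listed above is that each parallel Flink sink subtask writes all of its records to exactly one Kafka partition, chosen by subtask index modulo the partition count. The core mapping can be sketched independently of Flink's APIs; this is an illustrative sketch of the idea, not the connector's actual code (the class and method names here are hypothetical):

```java
// Sketch of the fixed partitioning scheme behind FlinkFixedPartitioner:
// subtask i always writes to partitions[i % partitions.length], so records
// from one subtask never spread across multiple Kafka partitions.
public class FixedPartitionSketch {

    /** Maps a Flink subtask index onto one Kafka partition id. */
    static int partitionFor(int subtaskIndex, int[] kafkaPartitions) {
        if (kafkaPartitions == null || kafkaPartitions.length == 0) {
            throw new IllegalArgumentException("kafkaPartitions must be non-empty");
        }
        return kafkaPartitions[subtaskIndex % kafkaPartitions.length];
    }

    public static void main(String[] args) {
        int[] partitions = {0, 1, 2};
        // Four subtasks writing into three partitions: subtask 3 wraps around
        // to partition 0, so partitions are shared when parallelism exceeds
        // the partition count.
        for (int subtask = 0; subtask < 4; subtask++) {
            System.out.println(
                    "subtask " + subtask + " -> partition "
                            + partitionFor(subtask, partitions));
        }
    }
}
```

Note that with this scheme, ordering per subtask is preserved inside a single partition, which is why the index calls it a partitioner "ensuring that each internal Flink partition ends up in one Kafka partition".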
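The reverse direction, KafkaTopicPartitionAssigner's job of distributing Kafka partitions round-robin across consumer subtasks, can be sketched similarly. Deriving the start offset from a hash of the topic name is an assumption for illustration (the real utility has its own scheme), but it shows the intent: different topics start their round-robin at different subtasks, while consecutive partitions of one topic land on consecutive subtasks.

```java
// Sketch of round-robin partition-to-subtask assignment in the spirit of
// KafkaTopicPartitionAssigner. Class name and hashing choice are illustrative
// assumptions, not the connector's actual implementation.
public class PartitionAssignSketch {

    /** Returns the index of the subtask that should read the given partition. */
    static int assign(String topic, int partition, int numParallelSubtasks) {
        // Start index per topic, so multiple topics don't all pile their
        // partition 0 onto subtask 0. The mask keeps the hash non-negative.
        int startIndex = (topic.hashCode() & 0x7FFFFFFF) % numParallelSubtasks;
        return (startIndex + partition) % numParallelSubtasks;
    }

    public static void main(String[] args) {
        // Eight partitions of one topic spread over four subtasks: each
        // subtask ends up with exactly two partitions.
        for (int p = 0; p < 8; p++) {
            System.out.println("partition " + p + " -> subtask " + assign("events", p, 4));
        }
    }
}
```

The useful property of this shape is determinism: any subtask can compute its own set of partitions locally, without coordination, which matters for partition discovery in the consumer.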