- AbstractProcessor<K,V> - Class in org.apache.kafka.streams.processor
-
- AbstractProcessor() - Constructor for class org.apache.kafka.streams.processor.AbstractProcessor
-
- addInternalTopic(String) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Adds an internal topic
- addLatencySensor(String, String, String, String...) - Method in interface org.apache.kafka.streams.StreamsMetrics
-
- addProcessor(String, ProcessorSupplier, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Add a new processor node that receives and processes records output by one or more parent source or processor nodes.
- addSink(String, String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic.
- addSink(String, String, StreamPartitioner, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic, using
the supplied partitioner.
- addSink(String, String, Serializer, Serializer, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic.
- addSink(String, String, Serializer<K>, Serializer<V>, StreamPartitioner<K, V>, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Add a new sink that forwards records from upstream parent processor and/or source nodes to the named Kafka topic.
- addSource(String, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addSource(String, Deserializer, Deserializer, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Add a new source that consumes the named topics and forwards the records to child processor and/or sink nodes.
- addStateStore(StateStoreSupplier, boolean, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Adds a state store
- addStateStore(StateStoreSupplier, String...) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Adds a state store
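Note: the addSource / addProcessor / addStateStore / addSink methods above are typically chained to wire a low-level topology. A minimal sketch; the topic and node names, MyProcessorSupplier, and countStore (a StateStoreSupplier, e.g. one built via Stores) are illustrative, and the chaining assumes each method returns the builder:

    // Hypothetical wiring; topic names, node names and suppliers are placeholders.
    TopologyBuilder builder = new TopologyBuilder();
    builder.addSource("Source", "source-topic")                          // consume "source-topic"
           .addProcessor("Process", new MyProcessorSupplier(), "Source") // child of the "Source" node
           .addStateStore(countStore, "Process")                         // attach a state store to "Process"
           .addSink("Sink", "sink-topic", "Process");                    // forward results to "sink-topic"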
- advance - Variable in class org.apache.kafka.streams.kstream.TimeWindows
-
The size of the window's advance interval, i.e. by how much a window moves forward relative to the previous one.
- advanceBy(long) - Method in class org.apache.kafka.streams.kstream.TimeWindows
-
Returns a window definition with the original size, but advances ("hops") the window by the given interval, which specifies by how much a window moves forward relative to the previous one.
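Note: a hopping window can be described by combining a window size with a smaller advance interval; a sketch assuming the TimeWindows.of(name, sizeMs) factory and millisecond values:

    // A 5-minute window that advances ("hops") by 1 minute; name and values are illustrative.
    TimeWindows hopping = TimeWindows.of("hopping-window", 5 * 60 * 1000L)
                                     .advanceBy(60 * 1000L);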
- after - Variable in class org.apache.kafka.streams.kstream.JoinWindows
-
Maximum time difference for tuples that are after the join tuple.
- after(long) - Method in class org.apache.kafka.streams.kstream.JoinWindows
-
Specifies that records of the same key are joinable if their timestamps are within
the join window interval, and if the timestamp of a record from the secondary stream
is later than or equal to the timestamp of a record from the first stream.
- aggregate(Initializer<T>, Aggregator<K, V, T>, Aggregator<K, V, T>, Serde<T>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Aggregate updating values of this stream by the selected key into a new instance of KTable.
- aggregate(Initializer<T>, Aggregator<K, V, T>, Aggregator<K, V, T>, String) - Method in interface org.apache.kafka.streams.kstream.KGroupedTable
-
Aggregate updating values of this stream by the selected key into a new instance of KTable using default serializers and deserializers.
- aggregateByKey(Initializer<T>, Aggregator<K, V, T>, Windows<W>, Serde<K>, Serde<T>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Aggregate values of this stream by key on a window basis into a new instance of windowed KTable.
- aggregateByKey(Initializer<T>, Aggregator<K, V, T>, Windows<W>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Aggregate values of this stream by key on a window basis into a new instance of windowed KTable with default serializers and deserializers.
- aggregateByKey(Initializer<T>, Aggregator<K, V, T>, Serde<K>, Serde<T>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Aggregate values of this stream by key into a new instance of ever-updating KTable.
- aggregateByKey(Initializer<T>, Aggregator<K, V, T>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Aggregate values of this stream by key into a new instance of ever-updating KTable with default serializers and deserializers.
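Note: a sketch of the windowed aggregateByKey variant above, summing Long values per key; the source stream, types, window name and serdes are illustrative:

    // Sum Long values per String key over 1-minute tumbling windows.
    KTable<Windowed<String>, Long> sums = stream.aggregateByKey(
        new Initializer<Long>() {
            @Override
            public Long apply() { return 0L; }                  // initial aggregate per key/window
        },
        new Aggregator<String, Long, Long>() {
            @Override
            public Long apply(String key, Long value, Long aggregate) {
                return aggregate + value;                       // fold each record into the running sum
            }
        },
        TimeWindows.of("sums", 60 * 1000L),
        Serdes.String(), Serdes.Long());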
- Aggregator<K,V,T> - Interface in org.apache.kafka.streams.kstream
-
The Aggregator interface for aggregating values of the given key.
- all() - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Return an iterator over all keys in the database.
- APPLICATION_ID_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
application.id
- APPLICATION_ID_DOC - Static variable in class org.apache.kafka.streams.StreamsConfig
-
- applicationId() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the application id
- apply(K, V, T) - Method in interface org.apache.kafka.streams.kstream.Aggregator
-
Compute a new aggregate from the key and value of a record and the current aggregate of the same key.
- apply(K, V) - Method in interface org.apache.kafka.streams.kstream.ForeachAction
-
Perform an action for each record of a stream.
- apply() - Method in interface org.apache.kafka.streams.kstream.Initializer
-
Return the initial value for an aggregation.
- apply(K, V) - Method in interface org.apache.kafka.streams.kstream.KeyValueMapper
-
Map a record with the given key and value to a new value.
- apply(V, V) - Method in interface org.apache.kafka.streams.kstream.Reducer
-
Aggregate the two given values into a single one.
- apply(V1, V2) - Method in interface org.apache.kafka.streams.kstream.ValueJoiner
-
Return a joined value consisting of value1 and value2.
- apply(V1) - Method in interface org.apache.kafka.streams.kstream.ValueMapper
-
Map the given value to a new value.
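Note: each of the interfaces above declares a single apply method, so instances can be supplied as anonymous classes (or lambdas on Java 8); a few illustrative instances with arbitrarily chosen types:

    Initializer<Long> init = new Initializer<Long>() {
        @Override
        public Long apply() { return 0L; }                          // starting aggregate
    };
    Reducer<Long> sum = new Reducer<Long>() {
        @Override
        public Long apply(Long v1, Long v2) { return v1 + v2; }     // combine two values
    };
    ValueMapper<String, Integer> length = new ValueMapper<String, Integer>() {
        @Override
        public Integer apply(String value) { return value.length(); }
    };
    ValueJoiner<String, Long, String> joiner = new ValueJoiner<String, Long, String>() {
        @Override
        public String apply(String left, Long right) { return left + "=" + right; }
    };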
- pair(K, V) - Static method in class org.apache.kafka.streams.KeyValue
-
Create a new key-value pair.
- parse(String) - Static method in class org.apache.kafka.streams.processor.TaskId
-
- partition() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the partition id of the current input record; could be -1 if it is not
available (for example, if this method is invoked from the punctuate call)
- partition(K, V, int) - Method in interface org.apache.kafka.streams.processor.StreamPartitioner
-
Determine the partition number for a record with the given key and value and the current number of partitions.
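Note: a sketch of a StreamPartitioner that routes records by a hash of the key; the String/Long types are arbitrary:

    StreamPartitioner<String, Long> byKeyHash = new StreamPartitioner<String, Long>() {
        @Override
        public Integer partition(String key, Long value, int numPartitions) {
            // Returning null lets the producer's default partitioner decide.
            return key == null ? null : (key.hashCode() & 0x7fffffff) % numPartitions;
        }
    };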
- partition - Variable in class org.apache.kafka.streams.processor.TaskId
-
The ID of the partition.
- PARTITION_GROUPER_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
partition.grouper
- PartitionGrouper - Interface in org.apache.kafka.streams.processor
-
A partition grouper that generates partition groups given the list of topic-partitions.
- partitionGroups(Map<Integer, Set<String>>, Cluster) - Method in class org.apache.kafka.streams.processor.DefaultPartitionGrouper
-
Generate tasks with the assigned topic partitions.
- partitionGroups(Map<Integer, Set<String>>, Cluster) - Method in interface org.apache.kafka.streams.processor.PartitionGrouper
-
Returns a map of task ids to groups of partitions.
- persistent() - Method in interface org.apache.kafka.streams.processor.StateStore
-
Return if the storage is persistent or not.
- persistent() - Method in interface org.apache.kafka.streams.state.Stores.KeyValueFactory
-
Keep all key-value entries off-heap in a local database, although for durability all entries are recorded in a Kafka
topic that can be read to restore the entries if they are lost.
- POLL_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
poll.ms
- Predicate<K,V> - Interface in org.apache.kafka.streams.kstream
-
The Predicate interface represents a predicate (boolean-valued function) of a key-value pair.
- print() - Method in interface org.apache.kafka.streams.kstream.KStream
-
Print the elements of this stream to System.out. Implementors will need to override toString() for keys and values that are not of type String, Integer, etc. to get meaningful information.
- print(Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Print the elements of this stream to System.out
- print() - Method in interface org.apache.kafka.streams.kstream.KTable
-
Print the elements of this stream to System.out. Implementors will need to override toString() for keys and values that are not of type String, Integer, etc. to get meaningful information.
- print(Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Print the elements of this stream to System.out
- process(ProcessorSupplier<K, V>, String...) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Process all elements in this stream, one element at a time, by applying a Processor.
- process(K, V) - Method in interface org.apache.kafka.streams.processor.Processor
-
Process the record with the given key and value.
- Processor<K,V> - Interface in org.apache.kafka.streams.processor
-
A processor of key-value pair records.
- ProcessorContext - Interface in org.apache.kafka.streams.processor
-
Processor context interface.
- ProcessorStateException - Exception in org.apache.kafka.streams.errors
-
Indicates a processor state operation (e.g. put, get) has failed.
- ProcessorStateException(String) - Constructor for exception org.apache.kafka.streams.errors.ProcessorStateException
-
- ProcessorStateException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.ProcessorStateException
-
- ProcessorStateException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.ProcessorStateException
-
- ProcessorSupplier<K,V> - Interface in org.apache.kafka.streams.processor
-
A processor supplier that can create one or more Processor instances.
- punctuate(long) - Method in interface org.apache.kafka.streams.kstream.Transformer
-
- punctuate(long) - Method in interface org.apache.kafka.streams.kstream.ValueTransformer
-
- punctuate(long) - Method in class org.apache.kafka.streams.processor.AbstractProcessor
-
- punctuate(long) - Method in interface org.apache.kafka.streams.processor.Processor
-
- put(K, V) - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Update the value associated with this key
- put(K, V) - Method in interface org.apache.kafka.streams.state.WindowStore
-
Put a key-value pair with the current wall-clock time as the timestamp
into the corresponding window
- put(K, V, long) - Method in interface org.apache.kafka.streams.state.WindowStore
-
Put a key-value pair with the given timestamp into the corresponding window
- putAll(List<KeyValue<K, V>>) - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Update all the given key/value pairs
- putIfAbsent(K, V) - Method in interface org.apache.kafka.streams.state.KeyValueStore
-
Update the value associated with this key, unless a value
is already associated with the key
- schedule(long) - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Schedules a periodic operation for processors.
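Note: a sketch of a Processor that counts values per key in a KeyValueStore and forwards the counts from punctuate(); the store name "Counts", the types, and the one-second schedule are illustrative, and the store is assumed to have been attached to this processor via addStateStore:

    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.processor.Processor;
    import org.apache.kafka.streams.processor.ProcessorContext;
    import org.apache.kafka.streams.state.KeyValueIterator;
    import org.apache.kafka.streams.state.KeyValueStore;

    public class CountingProcessor implements Processor<String, String> {
        private ProcessorContext context;
        private KeyValueStore<String, Long> store;

        @Override
        @SuppressWarnings("unchecked")
        public void init(ProcessorContext context) {
            this.context = context;
            this.store = (KeyValueStore<String, Long>) context.getStateStore("Counts"); // attached via addStateStore
            context.schedule(1000L);                                                    // request punctuate() roughly every second
        }

        @Override
        public void process(String key, String value) {
            Long count = store.get(key);
            store.put(key, count == null ? 1L : count + 1L);    // update the running count for this key
        }

        @Override
        public void punctuate(long timestamp) {
            KeyValueIterator<String, Long> iter = store.all();
            while (iter.hasNext()) {
                KeyValue<String, Long> entry = iter.next();
                context.forward(entry.key, entry.value);        // emit current counts downstream
            }
            iter.close();
            context.commit();
        }

        @Override
        public void close() {}
    }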
- segments - Variable in class org.apache.kafka.streams.kstream.Windows
-
- segments(int) - Method in class org.apache.kafka.streams.kstream.Windows
-
Specify the number of segments to be used for rolling the window store. This function is not exposed to users but can be called by developers that extend this Windows specification.
- selectKey(KeyValueMapper<K, V, K1>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Create a new key from the current key and value.
- setUncaughtExceptionHandler(Thread.UncaughtExceptionHandler) - Method in class org.apache.kafka.streams.KafkaStreams
-
Sets the handler invoked when a stream thread abruptly terminates due to an uncaught exception.
- sinkTopics - Variable in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
- size - Variable in class org.apache.kafka.streams.kstream.TimeWindows
-
The size of the window, i.e. how long a window lasts.
- sourceTopics(String) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Get the names of topics that are to be consumed by the source nodes created by this builder.
- sourceTopics - Variable in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
- start() - Method in class org.apache.kafka.streams.KafkaStreams
-
Start the stream instance by starting all its threads.
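Note: a minimal startup sketch tying StreamsConfig, a builder, setUncaughtExceptionHandler and start() together; the application id and bootstrap servers are placeholders:

    import java.util.Properties;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStreamBuilder;

    public class StartupExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");    // placeholder application.id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder brokers

            KStreamBuilder builder = new KStreamBuilder();
            // ... define the topology on the builder here ...

            KafkaStreams streams = new KafkaStreams(builder, props);
            streams.setUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
                @Override
                public void uncaughtException(Thread t, Throwable e) {
                    System.err.println("Stream thread " + t.getName() + " died: " + e);
                }
            });
            streams.start();
        }
    }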
- start - Variable in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
The start timestamp of the window.
- start() - Method in class org.apache.kafka.streams.kstream.Window
-
Return the start timestamp of this window, inclusive
- startOn(long) - Method in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
Return a new unlimited window for the specified start timestamp.
- STATE_CLEANUP_DELAY_MS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
state.cleanup.delay.ms
- STATE_DIR_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
state.dir
- stateChangelogTopics - Variable in class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
- stateDir() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the state directory for the partition.
- stateName() - Method in class org.apache.kafka.streams.state.StateSerdes
-
Return the name of the state.
- StateRestoreCallback - Interface in org.apache.kafka.streams.processor
-
Restoration logic for log-backed state stores upon restart; it takes one record at a time from the logs to apply to the restoring state.
- StateSerdes<K,V> - Class in org.apache.kafka.streams.state
-
Factory for creating serializers / deserializers for state stores in Kafka Streams.
- StateSerdes(String, Serde<K>, Serde<V>) - Constructor for class org.apache.kafka.streams.state.StateSerdes
-
Create a context for serialization using the specified serializers and deserializers which
must match the key and value types used as parameters for this object; the state changelog topic
is provided to bind this serde factory to, so that future calls for serialize / deserialize do not
need to provide the topic name any more.
- StateStore - Interface in org.apache.kafka.streams.processor
-
A storage engine for managing state maintained by a stream processor.
- StateStoreSupplier - Interface in org.apache.kafka.streams.processor
-
A state store supplier which can create one or more StateStore instances.
- Stores - Class in org.apache.kafka.streams.state
-
Factory for creating state stores in Kafka Streams.
- Stores() - Constructor for class org.apache.kafka.streams.state.Stores
-
- Stores.InMemoryKeyValueFactory<K,V> - Interface in org.apache.kafka.streams.state
-
The interface used to create in-memory key-value stores.
- Stores.KeyValueFactory<K,V> - Interface in org.apache.kafka.streams.state
-
The interface used to specify the different kinds of key-value stores.
- Stores.PersistentKeyValueFactory<K,V> - Interface in org.apache.kafka.streams.state
-
The interface used to create off-heap key-value stores that use a local database.
- Stores.StoreFactory - Class in org.apache.kafka.streams.state
-
- Stores.StoreFactory() - Constructor for class org.apache.kafka.streams.state.Stores.StoreFactory
-
- Stores.ValueFactory<K> - Class in org.apache.kafka.streams.state
-
The factory for creating off-heap key-value stores.
- Stores.ValueFactory() - Constructor for class org.apache.kafka.streams.state.Stores.ValueFactory
-
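Note: a sketch of building a persistent key-value store supplier through the fluent Stores factory; the store name is illustrative:

    // Persistent, off-heap key-value store whose entries are also logged to a Kafka changelog topic.
    StateStoreSupplier countStore = Stores.create("Counts")
            .withStringKeys()
            .withLongValues()
            .persistent()
            .build();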
- stream(String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Create a KStream instance from the specified topics.
- stream(Serde<K>, Serde<V>, String...) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Create a KStream instance from the specified topics.
- STREAM_THREAD_INSTANCE - Static variable in class org.apache.kafka.streams.StreamsConfig.InternalConfig
-
- StreamPartitioner<K,V> - Interface in org.apache.kafka.streams.processor
-
Determine how records are distributed among the partitions in a Kafka topic.
- StreamsConfig - Class in org.apache.kafka.streams
-
Configuration for Kafka Streams.
- StreamsConfig(Map<?, ?>) - Constructor for class org.apache.kafka.streams.StreamsConfig
-
- StreamsConfig.InternalConfig - Class in org.apache.kafka.streams
-
- StreamsConfig.InternalConfig() - Constructor for class org.apache.kafka.streams.StreamsConfig.InternalConfig
-
- StreamsException - Exception in org.apache.kafka.streams.errors
-
StreamsException is the top-level exception type generated by Kafka Streams.
- StreamsException(String) - Constructor for exception org.apache.kafka.streams.errors.StreamsException
-
- StreamsException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.StreamsException
-
- StreamsException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.StreamsException
-
- StreamsMetrics - Interface in org.apache.kafka.streams
-
The Kafka Streams metrics interface for adding metric sensors and collecting metric values.
- table(String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Create a KTable instance for the specified topic.
- table(Serde<K>, Serde<V>, String) - Method in class org.apache.kafka.streams.kstream.KStreamBuilder
-
Create a KTable instance for the specified topic.
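Note: a sketch of creating a KStream and a KTable from topics with explicit serdes; the topic names are placeholders and Serdes refers to org.apache.kafka.common.serialization.Serdes:

    KStreamBuilder builder = new KStreamBuilder();

    // Record stream, read with explicit key/value serdes.
    KStream<String, Long> clicks = builder.stream(Serdes.String(), Serdes.Long(), "clicks-topic");

    // Changelog stream interpreted as a table of the latest value per key.
    KTable<String, String> users = builder.table(Serdes.String(), Serdes.String(), "users-topic");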
- TaskAssignmentException - Exception in org.apache.kafka.streams.errors
-
Indicates a run time error incurred while trying to assign stream tasks to threads
- TaskAssignmentException(String) - Constructor for exception org.apache.kafka.streams.errors.TaskAssignmentException
-
- TaskAssignmentException(String, Throwable) - Constructor for exception org.apache.kafka.streams.errors.TaskAssignmentException
-
- TaskAssignmentException(Throwable) - Constructor for exception org.apache.kafka.streams.errors.TaskAssignmentException
-
- taskId() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the task id
- TaskId - Class in org.apache.kafka.streams.processor
-
The task ID representation composed as topic group ID plus the assigned partition ID.
- TaskId(int, int) - Constructor for class org.apache.kafka.streams.processor.TaskId
-
- TaskIdFormatException - Exception in org.apache.kafka.streams.errors
-
Indicates a run time error incurred while trying to parse the task id from a string
- TaskIdFormatException(String) - Constructor for exception org.apache.kafka.streams.errors.TaskIdFormatException
-
- test(K, V) - Method in interface org.apache.kafka.streams.kstream.Predicate
-
Test if the record with the given key and value satisfies the predicate.
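Note: a Predicate is typically handed to KStream#filter; a sketch with arbitrary types and threshold, where "counts" stands for an existing KStream<String, Long>:

    Predicate<String, Long> isLarge = new Predicate<String, Long>() {
        @Override
        public boolean test(String key, Long value) {
            return value != null && value > 100L;   // keep only records above the threshold
        }
    };
    KStream<String, Long> large = counts.filter(isLarge);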
- through(String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic, also creating a new KStream instance from that topic, using default serializers and deserializers and the producer's DefaultPartitioner.
- through(StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic, also creating a new KStream instance from that topic, using default serializers and deserializers and a customizable StreamPartitioner to determine the distribution of records to partitions.
- through(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic, also creating a new KStream instance from that topic.
- through(Serde<K>, Serde<V>, StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic, also creating a new KStream instance from that topic, using a customizable StreamPartitioner to determine the distribution of records to partitions.
- through(String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic, also creating a new KTable instance from that topic, using default serializers and deserializers and the producer's DefaultPartitioner.
- through(StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic, also creating a new KTable instance from that topic, using default serializers and deserializers and a customizable StreamPartitioner to determine the distribution of records to partitions.
- through(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic, also creating a new KTable instance from that topic.
- through(Serde<K>, Serde<V>, StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic, also creating a new KTable instance from that topic, using a customizable StreamPartitioner to determine the distribution of records to partitions.
- timestamp() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the current timestamp.
- TIMESTAMP_EXTRACTOR_CLASS_CONFIG - Static variable in class org.apache.kafka.streams.StreamsConfig
-
timestamp.extractor
- TimestampExtractor - Interface in org.apache.kafka.streams.processor
-
An interface that allows the Kafka Streams framework to extract a timestamp from an instance of ConsumerRecord.
- TimeWindows - Class in org.apache.kafka.streams.kstream
-
The time-based window specifications used for aggregations.
- to(String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic using default serializers specified in the config and the producer's DefaultPartitioner.
- to(StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic using default serializers specified in the config and a customizable StreamPartitioner to determine the distribution of records to partitions.
- to(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic.
- to(Serde<K>, Serde<V>, StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Materialize this stream to a topic using a customizable StreamPartitioner to determine the distribution of records to partitions.
- to(String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic using default serializers specified in the config and the producer's DefaultPartitioner.
- to(StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic using default serializers specified in the config and a customizable StreamPartitioner to determine the distribution of records to partitions.
- to(Serde<K>, Serde<V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic.
- to(Serde<K>, Serde<V>, StreamPartitioner<K, V>, String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Materialize this stream to a topic using a customizable StreamPartitioner to determine the distribution of records to partitions.
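Note: a sketch of the to() variants above; topic names are placeholders and byKeyHash stands for any StreamPartitioner, such as the one sketched earlier in this index, over a KStream<String, Long>:

    // Default serializers from the config and the producer's DefaultPartitioner.
    stream.to("output-topic");

    // Explicit serdes plus a custom StreamPartitioner.
    stream.to(Serdes.String(), Serdes.Long(), byKeyHash, "partitioned-output-topic");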
- topic() - Method in interface org.apache.kafka.streams.processor.ProcessorContext
-
Returns the topic name of the current input record; could be null if it is not
available (for example, if this method is invoked from the punctuate call)
- topicGroupId - Variable in class org.apache.kafka.streams.processor.TaskId
-
The ID of the topic group.
- topicGroups(String) - Method in class org.apache.kafka.streams.processor.TopologyBuilder
-
Returns the map of topic groups keyed by the group id.
- TopologyBuilder - Class in org.apache.kafka.streams.processor
-
A component that is used to build a ProcessorTopology.
- TopologyBuilder() - Constructor for class org.apache.kafka.streams.processor.TopologyBuilder
-
Create a new builder.
- TopologyBuilder.TopicsInfo - Class in org.apache.kafka.streams.processor
-
- TopologyBuilder.TopicsInfo(Set<String>, Set<String>, Set<String>, Set<String>) - Constructor for class org.apache.kafka.streams.processor.TopologyBuilder.TopicsInfo
-
- TopologyBuilderException - Exception in org.apache.kafka.streams.errors
-
Indicates a pre-run time error incurred while parsing the builder to construct the processor topology
- TopologyBuilderException(String) - Constructor for exception org.apache.kafka.streams.errors.TopologyBuilderException
-
- toStream() - Method in interface org.apache.kafka.streams.kstream.KTable
-
Convert this stream to a new instance of KStream.
- toStream(KeyValueMapper<K, V, K1>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
- toString() - Method in class org.apache.kafka.streams.KeyValue
-
- toString() - Method in class org.apache.kafka.streams.kstream.Windowed
-
- toString() - Method in class org.apache.kafka.streams.processor.TaskId
-
- transform(TransformerSupplier<K, V, KeyValue<K1, V1>>, String...) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Create a new KStream instance by applying a Transformer to all elements in this stream, one element at a time.
- transform(K, V) - Method in interface org.apache.kafka.streams.kstream.Transformer
-
Transform the record with the given key and value.
- transform(V) - Method in interface org.apache.kafka.streams.kstream.ValueTransformer
-
Transform the record with the given value.
- Transformer<K,V,R> - Interface in org.apache.kafka.streams.kstream
-
A stateful Transformer interface for transforming a key-value pair into a new value.
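Note: a sketch of a stateful Transformer that tags each value with the record timestamp; the types and behaviour are illustrative, and punctuate() returns null because nothing is emitted on punctuation:

    public class TimestampTagger implements Transformer<String, String, KeyValue<String, String>> {
        private ProcessorContext context;

        @Override
        public void init(ProcessorContext context) {
            this.context = context;                 // keep the context to read timestamps
        }

        @Override
        public KeyValue<String, String> transform(String key, String value) {
            return KeyValue.pair(key, value + "@" + context.timestamp());  // tag with the record timestamp
        }

        @Override
        public KeyValue<String, String> punctuate(long timestamp) {
            return null;                            // nothing to emit on punctuation
        }

        @Override
        public void close() {}
    }

Such a transformer is wired in through KStream#transform by way of a TransformerSupplier that returns a fresh instance per task.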
- TransformerSupplier<K,V,R> - Interface in org.apache.kafka.streams.kstream
-
- transformValues(ValueTransformerSupplier<V, R>, String...) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- WallclockTimestampExtractor - Class in org.apache.kafka.streams.processor
-
Retrieves current wall clock timestamps as System.currentTimeMillis().
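Note: a sketch of a custom TimestampExtractor that prefers an event timestamp embedded in the value and falls back to the record's own timestamp; the Long-valued payload is an assumption:

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.streams.processor.TimestampExtractor;

    public class ValueTimestampExtractor implements TimestampExtractor {
        @Override
        public long extract(ConsumerRecord<Object, Object> record) {
            if (record.value() instanceof Long) {
                return (Long) record.value();       // event time carried in the value
            }
            return record.timestamp();              // otherwise use the record's own timestamp
        }
    }

The extractor in use is selected via the TIMESTAMP_EXTRACTOR_CLASS_CONFIG ("timestamp.extractor") setting.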
- WallclockTimestampExtractor() - Constructor for class org.apache.kafka.streams.processor.WallclockTimestampExtractor
-
- Window - Class in org.apache.kafka.streams.kstream
-
A single window instance, defined by its start and end timestamp.
- Window(long, long) - Constructor for class org.apache.kafka.streams.kstream.Window
-
Create a new window for the given start time (inclusive) and end time (exclusive).
- window() - Method in class org.apache.kafka.streams.kstream.Windowed
-
Return the window containing the values associated with this key.
- Windowed<K> - Class in org.apache.kafka.streams.kstream
-
The windowed key interface used in KTable, used for representing a windowed table result from windowed stream aggregations, i.e. a windowed KTable.
- Windowed(K, Window) - Constructor for class org.apache.kafka.streams.kstream.Windowed
-
- windowed(long, int, boolean) - Method in interface org.apache.kafka.streams.state.Stores.PersistentKeyValueFactory
-
Set the persistent store as a windowed key-value store
- Windows<W extends Window> - Class in org.apache.kafka.streams.kstream
-
The window specification interface that can be extended for windowing operation in joins and aggregations.
- Windows(String) - Constructor for class org.apache.kafka.streams.kstream.Windows
-
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.JoinWindows
-
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.TimeWindows
-
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.UnlimitedWindows
-
- windowsFor(long) - Method in class org.apache.kafka.streams.kstream.Windows
-
Creates all windows that contain the provided timestamp.
- WindowStore<K,V> - Interface in org.apache.kafka.streams.state
-
- WindowStoreIterator<E> - Interface in org.apache.kafka.streams.state
-
- withBuiltinTypes(String, Class<K>, Class<V>) - Static method in class org.apache.kafka.streams.state.StateSerdes
-
Create a new instance of StateSerdes for the given state name and key-/value-type classes.
- withByteArrayKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be byte arrays.
- withByteArrayValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use byte arrays for values.
- withByteBufferKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be ByteBuffer.
- withByteBufferValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use ByteBuffer for values.
- withDoubleKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be Doubles.
- withDoubleValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use Double values.
- within(long) - Method in class org.apache.kafka.streams.kstream.JoinWindows
-
Specifies that records of the same key are joinable if their timestamps are within timeDifference.
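Note: a windowed stream-stream join combines a JoinWindows specification with a ValueJoiner; a sketch with illustrative stream names, types, and a 5-second window, using the default-serde join overload:

    // Join records of the two streams whose timestamps differ by at most 5 seconds.
    KStream<String, String> joined = leftStream.join(
        rightStream,                                // a KStream<String, Long>
        new ValueJoiner<String, Long, String>() {
            @Override
            public String apply(String left, Long right) {
                return left + "/" + right;          // combine the two values
            }
        },
        JoinWindows.of("left-right-join").within(5 * 1000L));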
- withIntegerKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be Integers.
- withIntegerValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use Integer values.
- withKeys(Class<K>) - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
- withKeys(Serde<K>) - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the serializer and deserializer for the keys.
- withLongKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be Longs.
- withLongValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use Long values.
- withStringKeys() - Method in class org.apache.kafka.streams.state.Stores.StoreFactory
-
Begin to create a KeyValueStore by specifying the keys will be Strings.
- withStringValues() - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use String values.
- withValues(Class<V>) - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use values of the specified type.
- withValues(Serde<V>) - Method in class org.apache.kafka.streams.state.Stores.ValueFactory
-
Use the specified serializer and deserializer for the values.
- writeAsText(String) - Method in interface org.apache.kafka.streams.kstream.KStream
-
Write the elements of this stream to a file at the given path.
- writeAsText(String, Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KStream
-
- writeAsText(String) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Write the elements of this stream to a file at the given path using default serializers and deserializers.
- writeAsText(String, Serde<K>, Serde<V>) - Method in interface org.apache.kafka.streams.kstream.KTable
-
Write the elements of this stream to a file at the given path.
- writeTo(DataOutputStream) - Method in class org.apache.kafka.streams.processor.TaskId
-
- writeTo(ByteBuffer) - Method in class org.apache.kafka.streams.processor.TaskId
-