Package io.deephaven.kafka
Class KafkaTools.Consume

java.lang.Object
    io.deephaven.kafka.KafkaTools.Consume

Enclosing class:
KafkaTools
Nested Class Summary

static class KafkaTools.Consume.KeyOrValueSpec
Class to specify conversion of Kafka KEY or VALUE fields to table columns.
Field Summary

IGNORE

Constructor Summary

Consume()
Method Summary

static KafkaTools.Consume.KeyOrValueSpec avroSpec(org.apache.avro.Schema schema)
Avro spec from an Avro schema.

static KafkaTools.Consume.KeyOrValueSpec avroSpec(org.apache.avro.Schema schema, Function<String, String> fieldNameToColumnName)
Avro spec from an Avro schema.

static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server.

static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, Function<String, String> fieldNameToColumnName)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server.

static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server.

static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion, Function<String, String> fieldNameToColumnName)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server.

static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion, Function<String, String> fieldNameToColumnName, boolean useUTF8Strings)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server.

static KafkaTools.Consume.KeyOrValueSpec ignoreSpec()
Spec to explicitly ask one of the "consume" methods to ignore either key or value.

static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions)
A JSON spec from a set of column definitions.

static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable com.fasterxml.jackson.databind.ObjectMapper objectMapper)
A JSON spec from a set of column definitions using a custom ObjectMapper.

static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable Map<String, String> fieldToColumnName)
A JSON spec from a set of column definitions, with a specific mapping of JSON nodes to columns.

static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable Map<String, String> fieldToColumnName, @Nullable com.fasterxml.jackson.databind.ObjectMapper objectMapper)
A JSON spec from a set of column definitions, with a specific mapping of JSON nodes to columns and a custom ObjectMapper.

static KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(NamedObjectProcessor.Provider provider)
Creates a kafka key or value spec implementation from a named object processor provider.

static KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(NamedObjectProcessor<? super byte[]> processor)
Creates a kafka key or value spec implementation from the named object processor.

static KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(ObjectProcessor<? super byte[]> processor, List<String> columnNames)
Creates a kafka key or value spec implementation from a byte-array ObjectProcessor.

static <T> KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(org.apache.kafka.common.serialization.Deserializer<? extends T> deserializer, NamedObjectProcessor<? super T> processor)
Creates a kafka key or value spec implementation from a NamedObjectProcessor.

static <T> KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(org.apache.kafka.common.serialization.Deserializer<? extends T> deserializer, ObjectProcessor<? super T> processor, List<String> columnNames)
Creates a kafka key or value spec implementation from an ObjectProcessor.

static KafkaTools.Consume.KeyOrValueSpec protobufSpec(ProtobufConsumeOptions options)
The kafka protobuf specs.

static KafkaTools.Consume.KeyOrValueSpec rawSpec(ColumnHeader<?> header, Class<? extends org.apache.kafka.common.serialization.Deserializer<?>> deserializer)

static KafkaTools.Consume.KeyOrValueSpec simpleSpec(String columnName)
If columnName is set, that column name will be used.

static KafkaTools.Consume.KeyOrValueSpec simpleSpec(String columnName, Class<?> dataType)
If columnName is set, that column name will be used.
Field Details

IGNORE
Constructor Details

Consume
public Consume()
Method Details

ignoreSpec
Spec to explicitly ask one of the "consume" methods to ignore either key or value.
-
jsonSpec
public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable Map<String, String> fieldToColumnName, @Nullable com.fasterxml.jackson.databind.ObjectMapper objectMapper)
A JSON spec from a set of column definitions, with a specific mapping of JSON nodes to columns and a custom ObjectMapper. JSON nodes can be specified as a string field name, or as a JSON Pointer string (see RFC 6901, ISSN: 2070-1721).
Parameters:
columnDefinitions - An array of column definitions for specifying the table to be created
fieldToColumnName - A mapping from JSON field names or JSON Pointer strings to column names provided in the definition. For each field key, if it starts with '/' it is assumed to be a JSON Pointer (e.g., "/parent/nested" represents a pointer to the nested field "nested" inside the top-level field "parent"). Fields not included will be ignored
objectMapper - A custom ObjectMapper to use for deserializing JSON. May be null.
Returns:
A JSON spec for the given inputs
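As a sketch of the fieldToColumnName argument described above, the mapping below mixes a plain field name with a JSON Pointer key; the field and column names are hypothetical, not part of this API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldToColumnExample {
    // Builds a fieldToColumnName mapping of the shape jsonSpec accepts.
    public static Map<String, String> fieldToColumnName() {
        Map<String, String> mapping = new LinkedHashMap<>();
        // Plain key: top-level JSON field "symbol" maps to column "Symbol".
        mapping.put("symbol", "Symbol");
        // Key starting with '/': treated as a JSON Pointer (RFC 6901);
        // "/quote/price" selects the field "price" nested inside "quote".
        mapping.put("/quote/price", "Price");
        // Any JSON fields not present in this map are ignored.
        return mapping;
    }
}
```

A map like this would be passed as fieldToColumnName alongside matching ColumnDefinition entries for the "Symbol" and "Price" columns.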
-
jsonSpec
public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable Map<String, String> fieldToColumnName)
A JSON spec from a set of column definitions, with a specific mapping of JSON nodes to columns. JSON nodes can be specified as a string field name, or as a JSON Pointer string (see RFC 6901, ISSN: 2070-1721).
Parameters:
columnDefinitions - An array of column definitions for specifying the table to be created
fieldToColumnName - A mapping from JSON field names or JSON Pointer strings to column names provided in the definition. For each field key, if it starts with '/' it is assumed to be a JSON Pointer (e.g., "/parent/nested" represents a pointer to the nested field "nested" inside the top-level field "parent"). Fields not included will be ignored
Returns:
A JSON spec for the given inputs
-
jsonSpec
public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable com.fasterxml.jackson.databind.ObjectMapper objectMapper)
A JSON spec from a set of column definitions using a custom ObjectMapper.
Parameters:
columnDefinitions - An array of column definitions for specifying the table to be created. The column names should map one-to-one to the expected JSON fields; it is not necessary to include all fields from the expected JSON, and any fields not included will be ignored.
objectMapper - A custom ObjectMapper to use for deserializing JSON. May be null.
Returns:
A JSON spec for the given inputs.
-
jsonSpec
public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions)
A JSON spec from a set of column definitions.
Parameters:
columnDefinitions - An array of column definitions for specifying the table to be created. The column names should map one-to-one to the expected JSON fields; it is not necessary to include all fields from the expected JSON, and any fields not included will be ignored.
Returns:
A JSON spec for the given inputs.
-
avroSpec
public static KafkaTools.Consume.KeyOrValueSpec avroSpec(org.apache.avro.Schema schema, Function<String, String> fieldNameToColumnName)
Avro spec from an Avro schema.
Parameters:
schema - An Avro schema.
fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
Returns:
A spec corresponding to the schema provided.
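To illustrate the fieldNameToColumnName contract (rename a field, keep it unchanged, or exclude it by returning null), here is a minimal sketch; the Avro field names used are hypothetical:

```java
import java.util.Map;
import java.util.function.Function;

public class AvroFieldMapper {
    // Mapping of the shape avroSpec accepts: returning null excludes an
    // Avro field, any other return value is the column name to use.
    public static final Function<String, String> FIELD_TO_COLUMN = fieldName -> {
        if ("internal_checksum".equals(fieldName)) {
            return null; // excluded from the resulting table
        }
        // Explicit renames; fields not listed keep their original name.
        return Map.of(
                "symbol", "Symbol",
                "last_price", "LastPrice")
                .getOrDefault(fieldName, fieldName);
    };
}
```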
-
avroSpec
Avro spec from an Avro schema. All fields in the schema are mapped to columns of the same name.
Parameters:
schema - An Avro schema.
Returns:
A spec corresponding to the schema provided.
-
avroSpec
public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion, Function<String, String> fieldNameToColumnName)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property.
Parameters:
schemaName - The registered name for the schema on Schema Server
schemaVersion - The version to fetch
fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
Returns:
A spec corresponding to the schema provided.
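Since the schema-fetching avroSpec overloads read the registry location from the Kafka Properties, a minimal sketch of such a Properties object follows; the host and port values are placeholders, not defaults of this API:

```java
import java.util.Properties;

public class SchemaRegistryProps {
    public static Properties kafkaProperties() {
        Properties props = new Properties();
        // Placeholder broker address.
        props.setProperty("bootstrap.servers", "localhost:9092");
        // avroSpec fetches the named schema from the server configured here.
        props.setProperty("schema.registry.url", "http://localhost:8081");
        return props;
    }
}
```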
-
avroSpec
public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion, Function<String, String> fieldNameToColumnName, boolean useUTF8Strings)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property.
Parameters:
schemaName - The registered name for the schema on Schema Server
schemaVersion - The version to fetch
fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
useUTF8Strings - If true, String fields will not be converted to Java Strings.
Returns:
A spec corresponding to the schema provided.
-
avroSpec
public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, Function<String, String> fieldNameToColumnName)
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property. The version fetched is the latest.
Parameters:
schemaName - The registered name for the schema on Schema Server
fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
Returns:
A spec corresponding to the schema provided.
-
avroSpec
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property. All fields in the schema are mapped to columns of the same name.
Parameters:
schemaName - The registered name for the schema on Schema Server
schemaVersion - The version to fetch
Returns:
A spec corresponding to the schema provided
-
avroSpec
Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property. The version fetched is the latest. All fields in the schema are mapped to columns of the same name.
Parameters:
schemaName - The registered name for the schema on Schema Server.
Returns:
A spec corresponding to the schema provided.
-
protobufSpec
The kafka protobuf specs. This will fetch theprotobuf descriptor
based on theProtobufConsumeOptions.descriptorProvider()
and create themessage
parsing functions according toProtobufDescriptorParser.parse(Descriptor, ProtobufDescriptorParserOptions)
. These functions will be adapted to handle schema changes.- Parameters:
options
- the options- Returns:
- the key or value spec
- See Also:
-
simpleSpec
public static KafkaTools.Consume.KeyOrValueSpec simpleSpec(String columnName, Class<?> dataType)
If columnName is set, that column name will be used. Otherwise, the names for the key or value columns can be provided in the properties as "deephaven.key.column.name" or "deephaven.value.column.name", and otherwise default to "KafkaKey" or "KafkaValue". If dataType is set, that type will be used for the column type. Otherwise, the types for key or value are either specified in the properties as "deephaven.key.column.type" or "deephaven.value.column.type" or deduced from the deserializer classes for "key.deserializer" or "value.deserializer" in the provided Properties object.
-
simpleSpec
public static KafkaTools.Consume.KeyOrValueSpec simpleSpec(String columnName)
If columnName is set, that column name will be used. Otherwise, the names for the key or value columns can be provided in the properties as "deephaven.key.column.name" or "deephaven.value.column.name", and otherwise default to "KafkaKey" or "KafkaValue". The types for key or value are either specified in the properties as "deephaven.key.column.type" or "deephaven.value.column.type" or deduced from the deserializer classes for "key.deserializer" or "value.deserializer" in the provided Properties object.
-
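The fallback properties that simpleSpec consults can be sketched as follows; the column names and deserializer choices here are illustrative assumptions, not required values:

```java
import java.util.Properties;

public class SimpleSpecProps {
    public static Properties kafkaProperties() {
        Properties props = new Properties();
        // Consulted when simpleSpec is called without a columnName;
        // absent these, the columns default to "KafkaKey"/"KafkaValue".
        props.setProperty("deephaven.key.column.name", "Ticker");
        props.setProperty("deephaven.value.column.name", "Payload");
        // Column types may be deduced from these deserializer classes.
        props.setProperty("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }
}
```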
rawSpec
public static KafkaTools.Consume.KeyOrValueSpec rawSpec(ColumnHeader<?> header, Class<? extends org.apache.kafka.common.serialization.Deserializer<?>> deserializer)
-
objectProcessorSpec
public static <T> KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(org.apache.kafka.common.serialization.Deserializer<? extends T> deserializer, ObjectProcessor<? super T> processor, List<String> columnNames)
Creates a kafka key or value spec implementation from an ObjectProcessor.
Equivalent to objectProcessorSpec(deserializer, NamedObjectProcessor.of(processor, columnNames)).
Type Parameters:
T - the object type
Parameters:
deserializer - the deserializer
processor - the object processor
columnNames - the column names
Returns:
the Kafka key or value spec
-
objectProcessorSpec
public static KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(ObjectProcessor<? super byte[]> processor, List<String> columnNames)
Creates a kafka key or value spec implementation from a byte-array ObjectProcessor.
Equivalent to objectProcessorSpec(NamedObjectProcessor.of(processor, columnNames)).
Parameters:
processor - the byte-array object processor
columnNames - the column names
Returns:
the Kafka key or value spec
-
objectProcessorSpec
public static <T> KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(org.apache.kafka.common.serialization.Deserializer<? extends T> deserializer, NamedObjectProcessor<? super T> processor)
Creates a kafka key or value spec implementation from a NamedObjectProcessor.
Type Parameters:
T - the object type
Parameters:
deserializer - the deserializer
processor - the named object processor
Returns:
the Kafka key or value spec
-
objectProcessorSpec
public static KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(NamedObjectProcessor<? super byte[]> processor)
Creates a kafka key or value spec implementation from the named object processor.
Equivalent to objectProcessorSpec(new ByteArrayDeserializer(), processor).
Parameters:
processor - the named object processor
Returns:
the Kafka key or value spec
See Also:
objectProcessorSpec(Deserializer, NamedObjectProcessor)
ByteArrayDeserializer
-
objectProcessorSpec
public static KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(NamedObjectProcessor.Provider provider)
Creates a kafka key or value spec implementation from a named object processor provider. It must be capable of supporting byte[].
Equivalent to objectProcessorSpec(provider.named(Type.byteType().arrayType())).
Parameters:
provider - the named object processor provider
Returns:
the Kafka key or value spec
See Also: