Class KafkaTools.Consume

java.lang.Object
io.deephaven.kafka.KafkaTools.Consume
Enclosing class:
KafkaTools

public static class KafkaTools.Consume extends Object

  • Constructor Details

    • Consume

      public Consume()
  • Method Details

    • ignoreSpec

      public static KafkaTools.Consume.KeyOrValueSpec ignoreSpec()
      Spec that explicitly asks one of the "consume" methods to ignore either the key or the value.
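      For orientation, a minimal sketch of passing ignoreSpec() as the key spec of a consume call. The consumeToTable overload, the ALL_PARTITIONS and ALL_PARTITIONS_DONT_SEEK constants, and TableType.blink() are assumed from the enclosing KafkaTools class and may differ between releases; the topic, broker address, and column names are illustrative only.

      import java.util.Properties;

      import io.deephaven.engine.table.ColumnDefinition;
      import io.deephaven.engine.table.Table;
      import io.deephaven.kafka.KafkaTools;

      public class IgnoreKeyExample {
          public static Table consume() {
              final Properties props = new Properties();
              props.put("bootstrap.servers", "localhost:9092"); // example broker address

              // Ignore the record key entirely; only the JSON value produces columns.
              return KafkaTools.consumeToTable(
                      props,
                      "orders",                             // example topic
                      KafkaTools.ALL_PARTITIONS,            // assumed constant: subscribe to every partition
                      KafkaTools.ALL_PARTITIONS_DONT_SEEK,  // assumed constant: start from current offsets
                      KafkaTools.Consume.ignoreSpec(),      // drop the key
                      KafkaTools.Consume.jsonSpec(new ColumnDefinition<?>[] {
                              ColumnDefinition.ofString("Symbol"),
                              ColumnDefinition.ofDouble("Price")
                      }),
                      KafkaTools.TableType.blink());        // assumed table type factory
          }
      }

      The same KeyOrValueSpec types are used symmetrically for keys and values, so ignoreSpec() can equally be passed as the value spec.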
    • jsonSpec

      public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable Map<String,String> fieldToColumnName, @Nullable com.fasterxml.jackson.databind.ObjectMapper objectMapper)
      A JSON spec from a set of column definitions, with a specific mapping of JSON nodes to columns and a custom ObjectMapper. JSON nodes can be specified as a string field name, or as a JSON Pointer string (see RFC 6901, ISSN: 2070-1721).
      Parameters:
      columnDefinitions - An array of column definitions for specifying the table to be created
      fieldToColumnName - A mapping from JSON field names or JSON Pointer strings to column names provided in the definition. For each field key, if it starts with '/' it is assumed to be a JSON Pointer (e.g., "/parent/nested" represents a pointer to the field "nested" inside the top-level field "parent"). Fields not included will be ignored.
      objectMapper - A custom ObjectMapper to use for deserializing JSON. May be null.
      Returns:
      A JSON spec for the given inputs
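      A brief sketch of building this spec with a JSON Pointer mapping and a customized Jackson ObjectMapper; the field names, column names, and mapper settings are illustrative assumptions.

      import java.util.Map;

      import com.fasterxml.jackson.databind.DeserializationFeature;
      import com.fasterxml.jackson.databind.ObjectMapper;

      import io.deephaven.engine.table.ColumnDefinition;
      import io.deephaven.kafka.KafkaTools;

      public class JsonPointerSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              final ColumnDefinition<?>[] cols = {
                      ColumnDefinition.ofString("Symbol"),
                      ColumnDefinition.ofDouble("Bid")
              };

              // "symbol" is a top-level field; "/quote/bid" is a JSON Pointer to a nested field.
              final Map<String, String> fieldToColumn = Map.of(
                      "symbol", "Symbol",
                      "/quote/bid", "Bid");

              // Lenient mapper: unknown JSON properties are simply ignored.
              final ObjectMapper mapper = new ObjectMapper()
                      .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

              return KafkaTools.Consume.jsonSpec(cols, fieldToColumn, mapper);
          }
      }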
    • jsonSpec

      public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable Map<String,String> fieldToColumnName)
      A JSON spec from a set of column definitions, with a specific mapping of JSON nodes to columns. JSON nodes can be specified as a string field name, or as a JSON Pointer string (see RFC 6901, ISSN: 2070-1721).
      Parameters:
      columnDefinitions - An array of column definitions for specifying the table to be created
      fieldToColumnName - A mapping from JSON field names or JSON Pointer strings to column names provided in the definition. For each field key, if it starts with '/' it is assumed to be a JSON Pointer (e.g., "/parent/nested" represents a pointer to the field "nested" inside the top-level field "parent"). Fields not included will be ignored.
      Returns:
      A JSON spec for the given inputs
    • jsonSpec

      public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions, @Nullable com.fasterxml.jackson.databind.ObjectMapper objectMapper)
      A JSON spec from a set of column definitions using a custom ObjectMapper.
      Parameters:
      columnDefinitions - An array of column definitions for specifying the table to be created. The column names should map one-to-one to the expected JSON field names; it is not necessary to include every field from the expected JSON, and any fields not included are ignored.
      objectMapper - A custom ObjectMapper to use for deserializing JSON. May be null.
      Returns:
      A JSON spec for the given inputs.
    • jsonSpec

      public static KafkaTools.Consume.KeyOrValueSpec jsonSpec(@NotNull ColumnDefinition<?>[] columnDefinitions)
      A JSON spec from a set of column definitions.
      Parameters:
      columnDefinitions - An array of column definitions for specifying the table to be created. The column names should map one-to-one to the expected JSON field names; it is not necessary to include every field from the expected JSON, and any fields not included are ignored.
      Returns:
      A JSON spec for the given inputs.
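      A minimal sketch of this simplest form, where the column names double as the expected JSON field names (illustrative names only):

      import io.deephaven.engine.table.ColumnDefinition;
      import io.deephaven.kafka.KafkaTools;

      public class SimpleJsonSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              // Each column name is looked up as a JSON field; extra JSON fields are ignored.
              return KafkaTools.Consume.jsonSpec(new ColumnDefinition<?>[] {
                      ColumnDefinition.ofString("Symbol"),
                      ColumnDefinition.ofLong("Size"),
                      ColumnDefinition.ofDouble("Price")
              });
          }
      }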
    • avroSpec

      public static KafkaTools.Consume.KeyOrValueSpec avroSpec(org.apache.avro.Schema schema, Function<String,String> fieldNameToColumnName)
      Avro spec from an Avro schema.
      Parameters:
      schema - An Avro schema.
      fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
      Returns:
      A spec corresponding to the schema provided.
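      A sketch of renaming and filtering fields from an inline-parsed schema; the schema and names are illustrative, and in practice the schema is usually loaded from an .avsc resource.

      import java.util.Map;
      import java.util.function.Function;

      import org.apache.avro.Schema;

      import io.deephaven.kafka.KafkaTools;

      public class AvroSchemaSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              final Schema schema = new Schema.Parser().parse(
                      "{\"type\":\"record\",\"name\":\"Quote\",\"fields\":["
                              + "{\"name\":\"symbol\",\"type\":\"string\"},"
                              + "{\"name\":\"bid\",\"type\":\"double\"},"
                              + "{\"name\":\"internal_id\",\"type\":\"long\"}]}");

              // Keep and rename "symbol" and "bid"; "internal_id" maps to null and is excluded.
              final Map<String, String> rename = Map.of("symbol", "Symbol", "bid", "Bid");
              final Function<String, String> fieldToColumn = rename::get;

              return KafkaTools.Consume.avroSpec(schema, fieldToColumn);
          }
      }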
    • avroSpec

      public static KafkaTools.Consume.KeyOrValueSpec avroSpec(org.apache.avro.Schema schema)
      Avro spec from an Avro schema. All fields in the schema are mapped to columns of the same name.
      Parameters:
      schema - An Avro schema.
      Returns:
      A spec corresponding to the schema provided.
    • avroSpec

      public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion, Function<String,String> fieldNameToColumnName)
      Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property.
      Parameters:
      schemaName - The registered name for the schema on Schema Server
      schemaVersion - The version to fetch
      fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
      Returns:
      A spec corresponding to the schema provided.
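      A sketch of this Schema Registry variant; the subject name, version, and renaming rule are illustrative, and the "schema.registry.url" entry must be present in the Properties passed to the consume call.

      import java.util.function.Function;

      import io.deephaven.kafka.KafkaTools;

      public class AvroRegistrySpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              // Capitalize field names to form column names; drop "internal_id" by mapping it to null.
              final Function<String, String> fieldToColumn = field ->
                      "internal_id".equals(field)
                              ? null
                              : Character.toUpperCase(field.charAt(0)) + field.substring(1);
              return KafkaTools.Consume.avroSpec("quote-value", "1", fieldToColumn);
          }
      }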
    • avroSpec

      public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion, Function<String,String> fieldNameToColumnName, boolean useUTF8Strings)
      Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property.
      Parameters:
      schemaName - The registered name for the schema on Schema Server
      schemaVersion - The version to fetch
      fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
      useUTF8Strings - If true, String fields will not be converted to Java Strings.
      Returns:
      A spec corresponding to the schema provided.
    • avroSpec

      public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, Function<String,String> fieldNameToColumnName)
      Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property. The latest version is fetched.
      Parameters:
      schemaName - The registered name for the schema on Schema Server
      fieldNameToColumnName - A mapping specifying which Avro fields to include and what column name to use for them; fields mapped to null are excluded.
      Returns:
      A spec corresponding to the schema provided.
    • avroSpec

      public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName, String schemaVersion)
      Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property. All fields in the schema are mapped to columns of the same name.
      Parameters:
      schemaName - The registered name for the schema on Schema Server
      schemaVersion - The version to fetch
      Returns:
      A spec corresponding to the schema provided
    • avroSpec

      public static KafkaTools.Consume.KeyOrValueSpec avroSpec(String schemaName)
      Avro spec from fetching an Avro schema from a Confluent compatible Schema Server. The Properties used to initialize Kafka should contain the URL for the Schema Server to use under the "schema.registry.url" property. The latest version is fetched. All fields in the schema are mapped to columns of the same name.
      Parameters:
      schemaName - The registered name for the schema on Schema Server.
      Returns:
      A spec corresponding to the schema provided.
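      A one-line sketch of the simplest registry form (illustrative subject name); the latest version is fetched and every field becomes a column of the same name:

      import io.deephaven.kafka.KafkaTools;

      public class AvroLatestSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              return KafkaTools.Consume.avroSpec("quote-value");
          }
      }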
    • protobufSpec

      public static KafkaTools.Consume.KeyOrValueSpec protobufSpec(ProtobufConsumeOptions options)
      The Kafka Protobuf spec. This fetches the protobuf descriptor based on ProtobufConsumeOptions.descriptorProvider() and creates the message-parsing functions according to ProtobufDescriptorParser.parse(Descriptor, ProtobufDescriptorParserOptions). These functions are adapted to handle schema changes.
      Parameters:
      options - the options
      Returns:
      the key or value spec
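      A hedged sketch of wiring up this spec. The builder and the DescriptorMessageClass descriptor provider are assumptions about the io.deephaven.kafka.protobuf package (the exact builder methods and provider types may differ); com.google.protobuf.Timestamp merely stands in for a real generated message class.

      import com.google.protobuf.Timestamp;

      import io.deephaven.kafka.KafkaTools;
      import io.deephaven.kafka.protobuf.DescriptorMessageClass;
      import io.deephaven.kafka.protobuf.ProtobufConsumeOptions;

      public class ProtobufSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              // Assumed builder API: derive the descriptor from a compiled message class.
              final ProtobufConsumeOptions options = ProtobufConsumeOptions.builder()
                      .descriptorProvider(DescriptorMessageClass.of(Timestamp.class))
                      .build();
              return KafkaTools.Consume.protobufSpec(options);
          }
      }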
    • simpleSpec

      public static KafkaTools.Consume.KeyOrValueSpec simpleSpec(String columnName, Class<?> dataType)
      If columnName is set, that column name will be used. Otherwise, the names for the key or value columns can be provided in the properties as "deephaven.key.column.name" or "deephaven.value.column.name", and otherwise default to "KafkaKey" or "KafkaValue". If dataType is set, that type will be used for the column type. Otherwise, the types for key or value are either specified in the properties as "deephaven.key.column.type" or "deephaven.value.column.type" or deduced from the deserializer classes for "key.deserializer" or "value.deserializer" in the provided Properties object.
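      A minimal sketch, with an illustrative column name and type:

      import io.deephaven.kafka.KafkaTools;

      public class SimpleSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              // The whole deserialized key or value lands in a single String column.
              return KafkaTools.Consume.simpleSpec("Symbol", String.class);
          }
      }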
    • simpleSpec

      public static KafkaTools.Consume.KeyOrValueSpec simpleSpec(String columnName)
      If columnName is set, that column name will be used. Otherwise, the names for the key or value columns can be provided in the properties as "deephaven.key.column.name" or "deephaven.value.column.name", and otherwise default to "KafkaKey" or "KafkaValue". The types for key or value are either specified in the properties as "deephaven.key.column.type" or "deephaven.value.column.type" or deduced from the deserializer classes for "key.deserializer" or "value.deserializer" in the provided Properties object.
    • rawSpec

      public static KafkaTools.Consume.KeyOrValueSpec rawSpec(ColumnHeader<?> header, Class<? extends org.apache.kafka.common.serialization.Deserializer<?>> deserializer)
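      No description is provided above; as a hedged sketch, this overload pairs a single column header with an explicit Kafka Deserializer class. ColumnHeader.of(String, Class) is assumed from io.deephaven.qst.column.header.ColumnHeader.

      import org.apache.kafka.common.serialization.ByteArrayDeserializer;

      import io.deephaven.kafka.KafkaTools;
      import io.deephaven.qst.column.header.ColumnHeader;

      public class RawSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              // Keep the raw record bytes in one byte[] column.
              return KafkaTools.Consume.rawSpec(
                      ColumnHeader.of("RawValue", byte[].class),  // assumed factory method
                      ByteArrayDeserializer.class);
          }
      }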
    • objectProcessorSpec

      public static <T> KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(org.apache.kafka.common.serialization.Deserializer<? extends T> deserializer, ObjectProcessor<? super T> processor, List<String> columnNames)
      Creates a Kafka key or value spec implementation from an ObjectProcessor.

      The respective column definitions are derived from the combination of columnNames and ObjectProcessor.outputTypes().

      Type Parameters:
      T - the object type
      Parameters:
      deserializer - the deserializer
      processor - the object processor
      columnNames - the column names
      Returns:
      the Kafka key or value spec
    • objectProcessorSpec

      public static KafkaTools.Consume.KeyOrValueSpec objectProcessorSpec(ObjectProcessor<? super byte[]> processor, List<String> columnNames)
      Creates a Kafka key or value spec implementation from a byte-array ObjectProcessor.

      Equivalent to objectProcessorSpec(new ByteArrayDeserializer(), processor, columnNames).

      Parameters:
      processor - the byte-array object processor
      columnNames - the column names
      Returns:
      the Kafka key or value spec
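      A sketch of a small ObjectProcessor over raw byte[] values that emits a String column and an int column. The chunk classes and their add(...) append methods, and the io.deephaven.processor and io.deephaven.qst.type packages, are assumptions based on the signatures above.

      import java.nio.charset.StandardCharsets;
      import java.util.List;

      import io.deephaven.chunk.ObjectChunk;
      import io.deephaven.chunk.WritableChunk;
      import io.deephaven.chunk.WritableIntChunk;
      import io.deephaven.chunk.WritableObjectChunk;
      import io.deephaven.kafka.KafkaTools;
      import io.deephaven.processor.ObjectProcessor;
      import io.deephaven.qst.type.Type;

      public class ObjectProcessorSpecExample {
          public static KafkaTools.Consume.KeyOrValueSpec spec() {
              final ObjectProcessor<byte[]> processor = new ObjectProcessor<>() {
                  @Override
                  public List<Type<?>> outputTypes() {
                      // One String column and one int column, in this order.
                      return List.of(Type.stringType(), Type.intType());
                  }

                  @Override
                  public void processAll(ObjectChunk<? extends byte[], ?> in, List<WritableChunk<?>> out) {
                      final WritableObjectChunk<String, ?> text = out.get(0).asWritableObjectChunk();
                      final WritableIntChunk<?> length = out.get(1).asWritableIntChunk();
                      for (int i = 0; i < in.size(); i++) {
                          final byte[] bytes = in.get(i);
                          text.add(bytes == null ? null : new String(bytes, StandardCharsets.UTF_8));
                          length.add(bytes == null ? 0 : bytes.length);
                      }
                  }
              };
              // Column names line up with outputTypes(): "Text" -> String, "Length" -> int.
              return KafkaTools.Consume.objectProcessorSpec(processor, List.of("Text", "Length"));
          }
      }

      The deserializer-taking overload above works the same way, except the processor receives whatever type the supplied Kafka Deserializer produces instead of raw byte[].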