Kafka

Kafka source connector

Key Features

Description

Source connector for Apache Kafka.

Source Options

| Name | Type | Required | Default | Description |
|------|------|----------|---------|-------------|
| topic | String | Yes | - | Topic name(s) to read data from when the table is used as a source. A topic list is also supported by separating topics with commas, e.g. 'topic-1,topic-2'. |
| table_list | Map | No | - | Topic list config. Note that `table_list` and `topic` cannot be configured at the same time. |
| bootstrap.servers | String | Yes | - | Comma-separated list of Kafka brokers. |
| pattern | Boolean | No | false | If set to true, `topic` is treated as a regular expression, and the consumer subscribes to all topics whose names match it. |
| consumer.group | String | No | Nexus-Consumer-Group | Kafka consumer group id, used to distinguish different consumer groups. |
| commit_on_checkpoint | Boolean | No | true | If true, the consumer's offsets will be periodically committed in the background. |
| kafka.config | Map | No | - | In addition to the mandatory parameters above, users can pass any optional Kafka consumer client parameters, covering all consumer parameters in the official Kafka documentation. |
| schema | Config | No | - | The structure of the data, including field names and field types. |
| format | String | No | json | Data format. The default format is json; optional formats are text, canal_json, debezium_json, ogg_json and avro. For json or text, the default field delimiter is ","; to customize it, set the `field_delimiter` option. For canal_json, refer to Canal Format for details; for debezium_json, refer to Debezium Format for details. |
| format_error_handle_way | String | No | fail | How data format errors are handled. The default value is fail; the optional values are fail and skip. With fail, a format error blocks processing and throws an exception; with skip, the offending line is skipped. |
| field_delimiter | String | No | , | Customize the field delimiter for the data format. |
| start_mode | StartMode[earliest],[group_offsets],[latest],[specific_offsets],[timestamp] | No | group_offsets | The initial consumption mode of the consumer. |
| start_mode.offsets | Config | No | - | The offsets required when the consumption mode is specific_offsets. |
| start_mode.timestamp | Long | No | - | The timestamp required when the consumption mode is timestamp. |
| partition-discovery.interval-millis | Long | No | -1 | The interval for dynamically discovering topics and partitions. |
| common-options | | No | - | Source plugin common parameters; please refer to Source Common Options for details. |
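As a sketch of the `start_mode` options above, the snippet below shows a consumer starting from specific offsets (the topic name and offset values are illustrative, and the `<topic>-<partition>` key format is an assumption based on common connector conventions):

```hocon
source {
  Kafka {
    topic = "topic_1"
    bootstrap.servers = "localhost:9092"
    start_mode = "specific_offsets"
    # Keys take the form <topic>-<partition>; values are the starting offsets.
    start_mode.offsets = {
      topic_1-0 = 70
      topic_1-1 = 10
    }
  }
}
```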

Task Example

Simple

This example reads data from the Kafka topics topic_1, topic_2 and topic_3 and prints it to the console.
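A minimal sketch of such a job, assuming a batch-mode `env` block and a Console sink (the broker address, schema fields and `kafka.config` values are illustrative):

```hocon
env {
  parallelism = 2
  job.mode = "BATCH"
}

source {
  Kafka {
    topic = "topic_1,topic_2,topic_3"
    bootstrap.servers = "localhost:9092"
    format = text
    field_delimiter = "#"
    schema = {
      fields {
        name = "string"
        age = "int"
      }
    }
    # Optional consumer client parameters, passed through to the Kafka consumer
    kafka.config = {
      auto.offset.reset = "earliest"
      enable.auto.commit = "false"
    }
  }
}

sink {
  Console {}
}
```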

Regex Topic
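A sketch using the `pattern` option, under which `topic` is interpreted as a regular expression (the regex and broker address are placeholders):

```hocon
source {
  Kafka {
    topic = "topic-.*"
    pattern = "true"
    bootstrap.servers = "localhost:9092"
    consumer.group = "regex_consumer_group"
  }
}
```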

AWS MSK SASL/SCRAM

Replace `${username}` and `${password}` below with the configuration values from AWS MSK.
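A sketch of the SASL/SCRAM setup via `kafka.config`, using the standard Kafka client security properties (the broker endpoint is a placeholder):

```hocon
source {
  Kafka {
    topic = "topic_1"
    bootstrap.servers = "b-1.xxx.kafka.us-east-1.amazonaws.com:9096"
    kafka.config = {
      security.protocol = SASL_SSL
      sasl.mechanism = SCRAM-SHA-512
      sasl.jaas.config = "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"${username}\" password=\"${password}\";"
    }
  }
}
```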

AWS MSK IAM

Please ensure the IAM policy includes "kafka-cluster:Connect". For example:
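A minimal illustrative policy; the actions beyond "kafka-cluster:Connect" are assumptions about what a consumer typically needs, and in practice the Resource should be scoped to your cluster, topic and group ARNs rather than "*":

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:Connect",
        "kafka-cluster:DescribeCluster",
        "kafka-cluster:DescribeTopic",
        "kafka-cluster:ReadData",
        "kafka-cluster:DescribeGroup",
        "kafka-cluster:AlterGroup"
      ],
      "Resource": "*"
    }
  ]
}
```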

Source Config
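A sketch of the IAM-authenticated source, assuming the aws-msk-iam-auth library is on the classpath to provide the login module and callback handler (the broker endpoint is a placeholder):

```hocon
source {
  Kafka {
    topic = "topic_1"
    bootstrap.servers = "b-1.xxx.kafka.us-east-1.amazonaws.com:9098"
    kafka.config = {
      security.protocol = SASL_SSL
      sasl.mechanism = AWS_MSK_IAM
      sasl.jaas.config = "software.amazon.msk.auth.iam.IAMLoginModule required;"
      sasl.client.callback.handler.class = "software.amazon.msk.auth.iam.IAMClientCallbackHandler"
    }
  }
}
```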

Kerberos Authentication Example

Source Config
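A sketch of a Kerberos-authenticated source using GSSAPI (the principal, keytab path, service name and broker address are placeholders; the JVM also needs a valid krb5.conf):

```hocon
source {
  Kafka {
    topic = "topic_1"
    bootstrap.servers = "kafka-broker:9092"
    kafka.config = {
      security.protocol = SASL_PLAINTEXT
      sasl.kerberos.service.name = kafka
      sasl.mechanism = GSSAPI
      sasl.jaas.config = "com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true storeKey=true keyTab=\"/etc/security/keytabs/kafka_client.keytab\" principal=\"kafkauser@EXAMPLE.COM\";"
    }
  }
}
```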

Multiple Kafka Source

This example parses Kafka data in different formats from different topics, writes it to the same PostgreSQL table, and performs upsert operations based on the id.
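A hedged sketch of such a job, with two Kafka sources in different formats feeding one PostgreSQL table. Here `result_table_name` is an assumed name from the Source Common Options, and the Jdbc sink options (`generate_sink_sql`, `primary_keys`, etc.) are assumptions about the sink connector, not confirmed names:

```hocon
env {
  job.mode = "STREAMING"
}

source {
  Kafka {
    topic = "topic_canal"
    format = canal_json
    bootstrap.servers = "localhost:9092"
    result_table_name = "kafka_canal"      # assumed common option
  }
  Kafka {
    topic = "topic_debezium"
    format = debezium_json
    bootstrap.servers = "localhost:9092"
    result_table_name = "kafka_debezium"   # assumed common option
  }
}

sink {
  # Both streams upsert into the same table, keyed on id.
  # Option names below are illustrative assumptions for a Jdbc sink.
  Jdbc {
    url = "jdbc:postgresql://localhost:5432/test"
    driver = "org.postgresql.Driver"
    user = "postgres"
    password = "postgres"
    database = "test"
    table = "public.sink_table"
    generate_sink_sql = true
    primary_keys = ["id"]
  }
}
```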
