generated from camel.apache.org/kamelet
gitea_admin edited this page 2026-03-11 14:38:21 +00:00
# Kafka Scram Source
Receive data from Kafka topics through SCRAM login module.
## Metadata
| Property | Value |
|---|---|
| Type | source |
| Group | Kafka |
| Namespace | Kafka |
| Support Level | Preview |
| Provider | Apache Software Foundation |
## Properties
| Name | Type | Required | Default | Description |
|---|---|---|---|---|
| topic | string | ✓ | | Comma-separated list of Kafka topic names |
| bootstrapServers | string | ✓ | | Comma-separated list of Kafka broker URLs |
| securityProtocol | string | | SASL_SSL | Protocol used to communicate with brokers. SASL_PLAINTEXT, PLAINTEXT, SASL_SSL and SSL are supported |
| saslMechanism | string | | SCRAM-SHA-512 | The Simple Authentication and Security Layer (SASL) mechanism used |
| user | string | ✓ | | Username to authenticate to Kafka |
| password | string | ✓ | | Password to authenticate to Kafka |
| autoCommitEnable | boolean | | true | If true, periodically commit to ZooKeeper the offset of messages already fetched by the consumer |
| allowManualCommit | boolean | | false | Whether to allow doing manual commits |
| pollOnError | string | | ERROR_HANDLER | What to do if Kafka threw an exception while polling for new messages. One of: DISCARD, ERROR_HANDLER, RECONNECT, RETRY, STOP |
| autoOffsetReset | string | | latest | What to do when there is no initial offset. One of: latest, earliest, none |
| consumerGroup | string | | | A string that uniquely identifies the group of consumers to which this source belongs |
| deserializeHeaders | boolean | | true | When enabled, the Kamelet source deserializes all message headers to their String representation |
| topicIsPattern | boolean | | false | Whether the topic is a pattern (regular expression). This can be used to subscribe to a dynamic number of topics matching the pattern |
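The properties above are typically supplied through a `KameletBinding` that connects this source to a sink. A minimal sketch follows; the binding name, topic, broker address, credentials, and the `log-sink` destination are placeholder assumptions, not values from this page:

```yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-scram-source-binding   # placeholder name
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-scram-source
    properties:
      # Required properties; all values below are examples
      topic: "my-topic"
      bootstrapServers: "my-broker:9092"
      user: "my-user"
      password: "my-password"
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: log-sink                  # example sink; replace with your own
```

Optional properties such as `saslMechanism` or `autoOffsetReset` can be added under `spec.source.properties` in the same way; unset properties keep the defaults listed in the table.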
## Dependencies
- camel:core
- camel:kafka
- camel:kamelet
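On a cluster with Camel K installed, these dependencies are resolved automatically when the Kamelet is bound; a binding can also be created directly with the `kamel bind` CLI. A hedged sketch, assuming placeholder values and a `channel:mychannel` destination:

```shell
# All property values and the channel name are example placeholders
kamel bind kafka-scram-source \
  -p "source.topic=my-topic" \
  -p "source.bootstrapServers=my-broker:9092" \
  -p "source.user=my-user" \
  -p "source.password=my-password" \
  channel:mychannel
```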