kafka_connect
: Main kafka_connect class.
kafka_connect::config
: Manages the Kafka Connect configuration.
kafka_connect::confluent_repo
: Manages the Confluent package repository.
kafka_connect::confluent_repo::apt
: Manages the Confluent apt package repository.
kafka_connect::confluent_repo::yum
: Manages the Confluent yum package repository.
kafka_connect::install
: Manages the Kafka Connect installation.
kafka_connect::manage_connectors
: Class to manage individual Kafka Connect connectors and connector secrets.
kafka_connect::manage_connectors::connector
: Class to manage individual Kafka Connect connectors.
kafka_connect::manage_connectors::secret
: Class to manage individual Kafka Connect connector secrets.
kafka_connect::service
: Manages the Kafka Connect service.
kc_connector
: Manage running Kafka Connect connectors.
Kafka_connect::Connector
: Validate the individual connector data.
Kafka_connect::Connectors
: Validate the connectors data.
Kafka_connect::HubPlugins
: Validate the Confluent Hub plugins list.
Kafka_connect::LogAppender
: Validate the log4j file appender.
Kafka_connect::Loglevel
: Matches all valid log4j loglevels.
Kafka_connect::Secret
: Validate the individual secret data.
Kafka_connect::Secrets
: Validate the secrets data.
Main kafka_connect class.
A minimal include:

include kafka_connect
A distributed setup with a three-node Kafka cluster, an S3 sink plugin, and a Schema Registry URL:

class { 'kafka_connect':
config_storage_replication_factor => 3,
offset_storage_replication_factor => 3,
status_storage_replication_factor => 3,
bootstrap_servers => [ 'kafka-01:9092', 'kafka-02:9092', 'kafka-03:9092' ],
confluent_hub_plugins => [ 'confluentinc/kafka-connect-s3:10.5.7' ],
value_converter_schema_registry_url => "http://schemaregistry-elb.${facts['networking']['domain']}:8081",
}
Enabling stdout logging and custom log4j configuration:

class { 'kafka_connect':
log4j_enable_stdout => true,
log4j_custom_config_lines => [ 'log4j.logger.io.confluent.connect.elasticsearch=DEBUG' ],
confluent_hub_plugins => [ 'confluentinc/kafka-connect-elasticsearch:latest' ],
}
Managing connectors only:

class { 'kafka_connect':
manage_connectors_only => true,
connector_config_dir => '/opt/kafka-connect/etc',
rest_port => 8084,
enable_delete => true,
}
Standalone mode with a local Kafka broker and ZooKeeper:

class { 'kafka_connect':
config_mode => 'standalone',
run_local_kafka_broker_and_zk => true,
}
The following parameters are available in the kafka_connect class:
manage_connectors_only
manage_confluent_repo
include_java
repo_ensure
repo_enabled
repo_version
package_name
package_ensure
manage_schema_registry_package
schema_registry_package_name
confluent_rest_utils_package_name
confluent_hub_plugin_path
confluent_hub_plugins
confluent_hub_client_package_name
confluent_common_package_name
config_mode
kafka_heap_options
kc_config_dir
config_storage_replication_factor
config_storage_topic
group_id
bootstrap_servers
key_converter
key_converter_schemas_enable
listeners
log4j_file_appender
log4j_appender_file_path
log4j_appender_max_file_size
log4j_appender_max_backup_index
log4j_appender_date_pattern
log4j_enable_stdout
log4j_custom_config_lines
log4j_loglevel_rootlogger
offset_storage_file_filename
offset_flush_interval_ms
offset_storage_topic
offset_storage_replication_factor
offset_storage_partitions
plugin_path
status_storage_topic
status_storage_replication_factor
status_storage_partitions
value_converter
value_converter_schema_registry_url
value_converter_schemas_enable
run_local_kafka_broker_and_zk
service_name
service_ensure
service_enable
service_provider
connectors_absent
connectors_paused
connector_config_dir
owner
group
connector_config_file_mode
connector_secret_file_mode
hostname
rest_port
enable_delete
restart_on_failed_state
manage_connectors_only
Data type: Boolean
Flag for including the connector management class only.
Default value: false
manage_confluent_repo
Data type: Boolean
Flag for including the confluent repo class.
Default value: true
include_java
Data type: Boolean
Flag for including class java.
Default value: false
repo_ensure
Data type: Enum['present', 'absent']
Ensure value for the Confluent package repo resource.
Default value: 'present'
repo_enabled
Data type: Boolean
Enabled value for the Confluent package repo resource.
Default value: true
repo_version
Data type: Pattern[/^(\d+\.\d+|\d+)$/]
Version of the Confluent repo to configure.
Default value: '7.5'
package_name
Data type: String[1]
Name of the main KC package to manage.
Default value: 'confluent-kafka'
package_ensure
Data type: String[1]
State of the package to ensure. Note that this may be used by more than one resource, depending on the setup.
Default value: '7.5.1-1'
manage_schema_registry_package
Data type: Boolean
Flag for managing the Schema Registry package (and REST Utils dependency package).
Default value: true
schema_registry_package_name
Data type: String[1]
Name of the Schema Registry package.
Default value: 'confluent-schema-registry'
confluent_rest_utils_package_name
Data type: String[1]
Name of the Confluent REST Utils package.
Default value: 'confluent-rest-utils'
confluent_hub_plugin_path
Data type: Stdlib::Absolutepath
Installation path for Confluent Hub plugins.
Default value: '/usr/share/confluent-hub-components'
confluent_hub_plugins
Data type: Kafka_connect::HubPlugins
List of Confluent Hub plugins to install. Each should be in the format author/name:semantic-version, e.g. 'acme/fancy-plugin:0.1.0'. Also accepts 'latest' in place of a specific version.
Default value: []
confluent_hub_client_package_name
Data type: String[1]
Name of the Confluent Hub Client package.
Default value: 'confluent-hub-client'
confluent_common_package_name
Data type: String[1]
Name of the Confluent Common package.
Default value: 'confluent-common'
config_mode
Data type: Enum['distributed', 'standalone']
Configuration mode to use for the setup.
Default value: 'distributed'
kafka_heap_options
Data type: String[1]
Value to set for 'KAFKA_HEAP_OPTS' export.
Default value: '-Xms256M -Xmx2G'
kc_config_dir
Data type: Stdlib::Absolutepath
Configuration directory for KC properties files.
Default value: '/etc/kafka'
config_storage_replication_factor
Data type: Integer
Config value to set for 'config.storage.replication.factor'.
Default value: 1
config_storage_topic
Data type: String[1]
Config value to set for 'config.storage.topic'.
Default value: 'connect-configs'
group_id
Data type: String[1]
Config value to set for 'group.id'.
Default value: 'connect-cluster'
bootstrap_servers
Data type: Array[String[1]]
Config value to set for 'bootstrap.servers'.
Default value: ['localhost:9092']
key_converter
Data type: String[1]
Config value to set for 'key.converter'.
Default value: 'org.apache.kafka.connect.json.JsonConverter'
key_converter_schemas_enable
Data type: Boolean
Config value to set for 'key.converter.schemas.enable'.
Default value: true
listeners
Data type: Stdlib::HTTPUrl
Config value to set for 'listeners'.
Default value: 'HTTP://:8083'
log4j_file_appender
Data type: Kafka_connect::LogAppender
Log4j file appender type to use (RollingFileAppender or DailyRollingFileAppender).
Default value: 'RollingFileAppender'
log4j_appender_file_path
Data type: Stdlib::Absolutepath
Config value to set for 'log4j.appender.file.File'.
Default value: '/var/log/confluent/connect.log'
log4j_appender_max_file_size
Data type: String[1]
Config value to set for 'log4j.appender.file.MaxFileSize'. Only used if log4j_file_appender = 'RollingFileAppender'.
Default value: '100MB'
log4j_appender_max_backup_index
Data type: Integer
Config value to set for 'log4j.appender.file.MaxBackupIndex'. Only used if log4j_file_appender = 'RollingFileAppender'.
Default value: 10
log4j_appender_date_pattern
Data type: String[1]
Config value to set for 'log4j.appender.file.DatePattern'. Only used if log4j_file_appender = 'DailyRollingFileAppender'.
Default value: '\'.\'yyyy-MM-dd-HH'
log4j_enable_stdout
Data type: Boolean
Option to enable logging to stdout/console.
Default value: false
log4j_custom_config_lines
Data type: Optional[Array[String[1]]]
Option to provide additional custom logging configuration. Can be used, for example, to adjust the log level for a specific connector type. See: https://docs.confluent.io/platform/current/connect/logging.html#use-the-kconnect-log4j-properties-file
Default value: undef
log4j_loglevel_rootlogger
Data type: Kafka_connect::Loglevel
Config value to set for 'log4j.rootLogger'.
Default value: 'INFO'
offset_storage_file_filename
Data type: String[1]
Config value to set for 'offset.storage.file.filename'. Only used in standalone mode.
Default value: '/tmp/connect.offsets'
offset_flush_interval_ms
Data type: Integer
Config value to set for 'offset.flush.interval.ms'.
Default value: 10000
offset_storage_topic
Data type: String[1]
Config value to set for 'offset.storage.topic'.
Default value: 'connect-offsets'
offset_storage_replication_factor
Data type: Integer
Config value to set for 'offset.storage.replication.factor'.
Default value: 1
offset_storage_partitions
Data type: Integer
Config value to set for 'offset.storage.partitions'.
Default value: 25
plugin_path
Data type: Stdlib::Absolutepath
Config value to set for 'plugin.path'.
Default value: '/usr/share/java,/usr/share/confluent-hub-components'
status_storage_topic
Data type: String[1]
Config value to set for 'status.storage.topic'.
Default value: 'connect-status'
status_storage_replication_factor
Data type: Integer
Config value to set for 'status.storage.replication.factor'.
Default value: 1
status_storage_partitions
Data type: Integer
Config value to set for 'status.storage.partitions'.
Default value: 5
value_converter
Data type: String[1]
Config value to set for 'value.converter'.
Default value: 'org.apache.kafka.connect.json.JsonConverter'
value_converter_schema_registry_url
Data type: Optional[Stdlib::HTTPUrl]
Config value to set for 'value.converter.schema.registry.url', if defined.
Default value: undef
value_converter_schemas_enable
Data type: Boolean
Config value to set for 'value.converter.schemas.enable'.
Default value: true
run_local_kafka_broker_and_zk
Data type: Boolean
Flag for running local Kafka broker and ZooKeeper services. Intended only for use with standalone config mode.
Default value: false
service_name
Data type: String[1]
Name of the service to manage.
Default value: 'confluent-kafka-connect'
service_ensure
Data type: Stdlib::Ensure::Service
State of the service to ensure.
Default value: 'running'
service_enable
Data type: Boolean
Value for enabling the service at boot.
Default value: true
service_provider
Data type: Optional[String[1]]
Backend provider to use for the service resource.
Default value: undef
connectors_absent
Data type: Optional[Array[String[1]]]
List of connectors to ensure absent. Deprecated: use the 'ensure' hash key in the connector data instead.
Default value: undef
connectors_paused
Data type: Optional[Array[String[1]]]
List of connectors to ensure paused. Deprecated: use the 'ensure' hash key in the connector data instead.
Default value: undef
connector_config_dir
Data type: Stdlib::Absolutepath
Configuration directory for connector properties files.
Default value: '/etc/kafka-connect'
owner
Data type: Variant[String[1], Integer]
Owner to set on config files.
Default value: 'cp-kafka-connect'
group
Data type: Variant[String[1], Integer]
Group to set on config files.
Default value: 'confluent'
connector_config_file_mode
Data type: Stdlib::Filemode
Mode to set on connector config files.
Default value: '0640'
connector_secret_file_mode
Data type: Stdlib::Filemode
Mode to set on connector secret files.
Default value: '0600'
hostname
Data type: String[1]
The hostname or IP of the KC service.
Default value: 'localhost'
rest_port
Data type: Stdlib::Port
Port to connect to for the REST API.
Default value: 8083
enable_delete
Data type: Boolean
Enable deletion of running connectors. Required for the provider to actually remove a connector when it is set to absent.
Default value: false
restart_on_failed_state
Data type: Boolean
Allow the provider to automatically restart a connector found in a FAILED state.
Default value: false
Manage running Kafka Connect connectors.
The following properties are available in the kc_connector type.
Valid values: yes, no, unknown
Property to ensure the running config matches the file config.
Default value: yes
Valid values: RUNNING, PAUSED
State of the connector to ensure.
Default value: RUNNING
Valid values: present, absent
The basic property that the resource should be in.
Default value: present
Valid values: RUNNING
State of the connector tasks to ensure. This is only used to catch failed tasks and should not be changed.
Default value: RUNNING
The following parameters are available in the kc_connector type.
Fully qualified path of the config file.
Valid values: true, false, yes, no
Flag to enable delete; required for the remove action.
Default value: false
The hostname or IP of the KC service.
Default value: localhost
namevar
The name of the connector resource you want to manage.
The listening port of the KC service.
Default value: 8083
The specific backend to use for this kc_connector resource. You will seldom need to specify this; Puppet will usually discover the appropriate provider for your platform.
Valid values: true, false, yes, no
Flag to enable auto restart on FAILED connector state.
Default value: false
Kafka_connect::Connector
Validate the individual connector data.
Alias of
Struct[{
Optional['ensure'] => Enum['absent', 'present', 'running', 'paused'],
'name' => String[1],
Optional['config'] => Hash[String[1], String],
}]
Kafka_connect::Connectors
Validate the connectors data.
Alias of
Hash[String[1], Kafka_connect::Connector]
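As an illustration, a connectors hash conforming to these two types might look like the following sketch (the connector name and config settings are hypothetical; any plugin-specific keys would depend on the connector class in use):

```puppet
$connectors = {
  's3-sink' => {                 # hash key: connector name
    'ensure' => 'running',       # one of: absent, present, running, paused
    'name'   => 's3-sink',
    'config' => {                # flat hash of connector settings
      'connector.class' => 'io.confluent.connect.s3.S3SinkConnector',
      'tasks.max'       => '1',
    },
  },
}
```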
Kafka_connect::HubPlugins
Validate the Confluent Hub plugins list.
Alias of
Array[Optional[Pattern[/^\w+\/[a-zA-Z0-9]{1,}[a-zA-Z0-9\-]{0,}:(\d+\.\d+\.\d+|latest)$/]]]
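For illustration, a value satisfying this type (the first entry comes from the class examples above; the second plugin is hypothetical):

```puppet
$plugins = [
  'confluentinc/kafka-connect-s3:10.5.7',  # pinned semantic version
  'acme/fancy-plugin:latest',              # 'latest' accepted in place of a version
]
```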
Kafka_connect::LogAppender
Validate the log4j file appender.
Alias of
Enum['DailyRollingFileAppender', 'RollingFileAppender']
Kafka_connect::Loglevel
Matches all valid log4j loglevels.
Alias of
Enum['TRACE', 'DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL']
Kafka_connect::Secret
Validate the individual secret data.
Alias of
Struct[{
Optional['ensure'] => Enum['absent', 'present', 'file'],
Optional['connectors'] => Array[String[1]],
Optional['key'] => String[1],
Optional['value'] => String[1],
Optional['kv_data'] => Hash[String[1], String[1]],
}]
Kafka_connect::Secrets
Validate the secrets data.
Alias of
Hash[String[1], Kafka_connect::Secret]
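A sketch of a secrets hash matching these two types (the secret name, connector name, keys, and placeholder values are all hypothetical):

```puppet
$secrets = {
  'connector-credentials' => {    # hash key: secret name (hypothetical)
    'ensure'     => 'present',    # one of: absent, present, file
    'connectors' => ['s3-sink'],  # connectors using this secret (hypothetical name)
    'kv_data'    => {             # key/value pairs; placeholders shown
      'ACCESS_KEY' => 'REPLACE_ME',
      'SECRET_KEY' => 'REPLACE_ME',
    },
  },
}
```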