Apache Kafka Integration: Defining Producer Jobs
Producers in Apache Kafka are external applications that write messages to Topics in the target Apache Kafka environment. Each message consists of key-value pairs that carry the message metadata, the content, and so forth.
An Automic Automation Producer Job represents a Producer on your Apache Kafka environment. You define the Automic Automation Producer Job on various pages. On the Producer Job page you define where to write the message in the Apache Kafka environment and the message itself.
Make sure that you have already selected the Apache Kafka Agent when you start defining the parameters on the Producer Job page. To do so, go to the Attributes page and select it from the Agent dropdown list.
To configure an Automic Automation Producer Job you must know certain parameters that are defined in the target Apache Kafka environment. If you do not know them, request them from the team that works with the target cloud solution.
Configuring a Producer Job
Configuring a Producer Job means specifying the location in your Apache Kafka environment where you want to publish the message and the content of the message itself.
Configuration File Path
The path on the Agent machine where the Apache Kafka configuration properties file is stored. The file contains the Apache Kafka configuration details (host URL, SSL connectivity, authorization mechanism properties, broker settings, partition settings, and so on). The same configuration file is used for Consumer and Producer Jobs.
Important! Usually, Automic Automation Jobs require Connection objects to establish the connection to the target cloud platform. This is NOT true for the Apache Kafka integration. The configuration file contains the parameters required to establish the connection between the Job in Automic Automation and the broker in the target Apache Kafka environment.
Example:
- UNIX: /opt/home/kafka/server.properties
- Windows: C:\kafka\server.properties
For more information about the available configuration parameters, please refer to the Apache Kafka official product documentation at https://kafka.apache.org/documentation/#producerconfigs and https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html
Example 1
acks=all
bootstrap.servers=<server>
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
max.poll.records=1
auto.offset.reset=earliest
Example 2: Connecting to a server running over SSL
acks=all
bootstrap.servers=<server>
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
max.poll.records=1
auto.offset.reset=earliest
ssl.protocol=TLSv1.2
security.protocol=SSL
ssl.keystore.location=/root/kafka_ssl/kafka.server.keystore.jks
ssl.keystore.password=<password>
ssl.key.password=<password>
ssl.truststore.location=/root/kafka_ssl/kafka.server.truststore.jks
ssl.truststore.password=<password>
ssl.enabled.protocols=TLSv1.2,TLSv1.1,TLSv1
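The configuration files above use the standard Java .properties format, in which whitespace around the separator is ignored. As a minimal sketch (illustrative only, not part of the product), the following Python function shows how such a file can be read into a dictionary; the broker address used here is a hypothetical stand-in for the `<server>` placeholder:

```python
def parse_properties(text):
    """Parse simple 'key=value' lines of a Java-style .properties string
    into a dict. Comments (#) and blank lines are skipped; whitespace
    around the separator is trimmed, as a Java Properties loader does."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

example = """
acks=all
bootstrap.servers= localhost:9092
max.poll.records= 1
auto.offset.reset= earliest
"""  # localhost:9092 is a hypothetical broker address

config = parse_properties(example)
print(config["acks"])  # all
```

Note that the examples trim cleanly even though some values in the file have a space after the equals sign, which is why both spellings work in practice.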
Topic
The Topic in your Apache Kafka environment to which the Producer Job writes the message.
Partition
Optionally, specify the partition within the topic to which the Job should write the message.
Timestamp
There are three scenarios regarding the timestamp:
- You specify a timestamp here. When this Job executes, the corresponding Producer on Apache Kafka uses this value.
- You leave this field empty and Apache Kafka uses the timestamp behavior specified in the configuration file. (This option is handled automatically by Apache Kafka.)
- You leave this field empty and Apache Kafka uses the time at which the message reached the Partition. (This option is handled automatically by Apache Kafka.)
If the external application that originated the message specifies a timestamp, you must specify it here. It can have the following formats:
- Epoch
- YY/MM/DD/hh/mm/ss
If the field is empty, Apache Kafka uses the timestamp behavior defined in the configuration file.
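The two timestamp formats can be converted with standard library calls. The sketch below is illustrative only; it assumes UTC, since the time zone the Agent applies is not documented here:

```python
from datetime import datetime, timezone

def epoch_to_kafka_ts(epoch_seconds):
    """Render an epoch timestamp in the YY/MM/DD/hh/mm/ss form (UTC assumed)."""
    dt = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
    return dt.strftime("%y/%m/%d/%H/%M/%S")

def kafka_ts_to_epoch(ts):
    """Parse a YY/MM/DD/hh/mm/ss string back to epoch seconds (UTC assumed)."""
    dt = datetime.strptime(ts, "%y/%m/%d/%H/%M/%S").replace(tzinfo=timezone.utc)
    return int(dt.timestamp())

print(epoch_to_kafka_ts(0))                    # 70/01/01/00/00/00
print(kafka_ts_to_epoch("70/01/01/00/00/00"))  # 0
```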
Message Headers
Key-value pairs in either string or JSON format that provide metadata about the message on top of the payload and the key. For example, a message header can specify the type of message (JSON or something else), or it can provide filtering and routing information that helps Apache Kafka route the message to the right Topic. You can also use this field to annotate, audit, and monitor messages as they flow through Apache Kafka.
Example 1:
{ "messagetype":"Automic Event", "origin":"Automic Automation" }
Example 2:
{ "messagetype":"Automic Event", "origin": { "name" : "Automic Automation", "version" : "1.0" } }
Example 3:
{ "{\"type\" : \"messagetype\"}":"Automic Event", "origin": { "name" : "Automic Automation", "version" : "1.0" } }
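On the wire, Kafka clients generally transmit headers as (key, bytes) pairs. As a hedged sketch (UTF-8 encoding assumed; the Agent's actual wire handling is not shown here), a JSON header definition like Example 1 could be flattened as follows, with nested values re-serialized as JSON:

```python
import json

def headers_to_pairs(header_json):
    """Convert a JSON header object (as in the examples above) into the
    (key, bytes) pairs that Kafka client libraries generally expect.
    Non-string values are re-serialized as JSON before encoding."""
    pairs = []
    for key, value in json.loads(header_json).items():
        if not isinstance(value, str):
            value = json.dumps(value)
        pairs.append((key, value.encode("utf-8")))
    return pairs

pairs = headers_to_pairs('{ "messagetype":"Automic Event", "origin":"Automic Automation" }')
print(pairs)  # [('messagetype', b'Automic Event'), ('origin', b'Automic Automation')]
```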
Message Key
(Optional) When you publish a message to an Apache Kafka Topic without a key, the message is distributed among the available Partitions in a round-robin way. The Message Key ensures sequential storage of the messages within a Partition, but not across Partitions.
The key can be in any free form text (string, JSON, XML, and so on).
Important! If you use keys in messages, all messages with the same key will be stored in the same Partition.
Example 1:
Automic
Example 2:
{ "applicationname" : "Automic" }
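The reason messages with the same key land in the same Partition is that the default partitioner hashes the key bytes and takes the result modulo the partition count. Kafka's actual default uses the murmur2 hash; the sketch below substitutes CRC-32 purely to illustrate the determinism:

```python
import zlib

def choose_partition(key, num_partitions):
    """Illustrative key-based partitioner: hash the key bytes and take the
    result modulo the partition count. Kafka's default partitioner uses
    murmur2 rather than CRC-32, but the principle is the same."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

p1 = choose_partition("Automic", 6)
p2 = choose_partition("Automic", 6)
print(p1 == p2)  # True: the same key always maps to the same partition
```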
Message Value
Actual payload of the message. It can be in any free form text (string, JSON, XML, and so on).
Automic Automation lets you prepare the content of the message (Headers, Key and Value). For example, you can define variables on the Variables page of the Job that you can then use in the JSONs or strings that you define here.
Example 1:
"Automic Job Ended successfully"
Example 2:
{ "message" : "Job Ended", "status" : "successful" }
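Because the message content can reference variables defined on the Variables page, a value like Example 2 is typically built by substituting variable placeholders into a template. The following sketch uses Automic-style &NAME# placeholders with hypothetical variable names; the Agent's actual substitution mechanism is not shown here:

```python
import json

def render_value(template, variables):
    """Substitute Automic-style &NAME# placeholders in a message-value
    template. The variable names used here are hypothetical examples."""
    for name, value in variables.items():
        template = template.replace(f"&{name}#", value)
    return template

template = '{ "message" : "&MSG#", "status" : "&STATUS#" }'
value = render_value(template, {"MSG": "Job Ended", "STATUS": "successful"})
print(value)  # { "message" : "Job Ended", "status" : "successful" }
print(json.loads(value)["status"])  # successful
```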
Next Steps
Once you have defined the Producer Job, you can execute and monitor it. You can also add it to other Automic Automation objects, such as Workflows or Schedules, to orchestrate and automate its execution.
For information about Automic Automation Workflows, Schedules, and so forth, see Object Types in the Automic Automation product documentation.