Rule Engine Operational Documentation

The Analytics Rule Engine uses Apache Flink to deliver a stream processing service for the Event Engine feature. This document is aimed at experienced operational teams that require information about the CA Automic implementation of Flink.

Notes:

  • This document is aimed at experienced users of Apache Flink; it presumes that you have basic knowledge of the topics outlined.

This page includes the following:

Why Flink is Used as the Rule Engine

  • The Flink stream processing framework is fast, reliable, and efficient
  • A scalable solution that runs on 1 to n machines
  • Takes load balancing away from the Automation Engine (AE)

See: http://flink.apache.org/introduction.html#features-why-flink

Event Engine Flow

As the Rule Engine is based on Flink, the architecture follows the standard Flink architecture.

(Diagram: Event Engine Flow)

How the Rule Engine Interfaces with Analytics

The interface between Analytics and the Rule Engine is configured using the application.properties file, which is located in the following directory: <Automic>/Automation.Platform/Analytics/backend.

Important! collector.events.enabled is disabled by default (unless you use the one-installer) and must be set to true explicitly.

Example of the default Rule Engine configuration:

#####################
## Events ingestion #
#####################

# Enable/disable event ingestion
collector.events.enabled=true

##########################
## Rule Engine settings ##
##########################

# Flink job manager
flink.host=localhost
#flink.port=6123
#flink.web_port=8081

# Use SSL for connecting to Flink
#flink.use_ssl=false

# Verify SSL certificate
#flink.disable_self_signed_certificates=true

# Job monitoring interval
#flink.monitoring_interval_seconds=60

# Heartbeat interval to check for abandoned jobs
#flink.monitoring_heartbeat_interval_minutes=3
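The settings above follow the standard Java properties format (key=value lines, # for comments), so they are easy to check programmatically. The following is a minimal, illustrative sketch of parsing such a file and verifying that event ingestion is enabled; the parse_properties helper is hypothetical and not part of the Analytics backend, and the default values for the commented-out keys are taken from the example above.

```python
# Minimal sketch (hypothetical helper, not part of the product):
# parse key=value lines from application.properties and verify the
# Rule Engine settings documented above.

def parse_properties(text: str) -> dict:
    """Parse simple key=value lines, skipping blanks and '#' comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Sample content mirroring the default configuration shown above.
sample = """\
# Enable/disable event ingestion
collector.events.enabled=true

# Flink job manager
flink.host=localhost
#flink.port=6123
"""

props = parse_properties(sample)

# collector.events.enabled must be set to true explicitly.
assert props.get("collector.events.enabled") == "true"

# Commented-out keys fall back to the documented defaults.
flink_port = int(props.get("flink.port", "6123"))
web_port = int(props.get("flink.web_port", "8081"))
print(props["flink.host"], flink_port, web_port)
```

Such a check is useful after an installation or upgrade to confirm that event ingestion is enabled before starting the backend.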

Consuming Events

Kafka uses topics to separate different streams of data. The Rule Engine uses the event definition as the corresponding entity: there is one separate Kafka topic per client and event definition.
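The per-client topic naming can be illustrated with a small sketch. The `<client>_events` pattern follows the `99_events` example given in the note further down this page; the helper function itself, and any further per-event-definition suffixing, are assumptions for illustration only, not a documented product API.

```python
# Hypothetical helper illustrating the one-topic-per-client naming
# convention (e.g. client 99 -> "99_events", as in the note below).
# The function name is an assumption, not part of the product.

def event_topic(client: int) -> str:
    """Return the Kafka event topic name for an Automation Engine client."""
    return f"{client}_events"

print(event_topic(99))   # -> 99_events
```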

Executing Rules

  • A rule is not a job and does not contain any JCL
  • Rules are a subtype of Event (EVNT) objects (in Automation Engine terms) so they are more related to file events

How Incoming Data is Transformed with the Rule Engine

The following diagram shows how data is processed from the IA Agent to the Streaming Platform and on to the Rule Engine.

Note: One Streaming Platform (Kafka) event topic per client, for example, 99_events.

(Diagram: How incoming data is transformed with the Rule Engine)

----------: Execute a rule

----------: Ingest external events

----------: Trigger if incoming events match a rule

Enabling Event Processing

See: Enable Event Processing for a Client