Key Concepts: AI Integration in Automic Automation

Traditionally, automation relies on static definitions: "If A happens, do B." AI Agents introduce dynamic decision-making. By integrating Large Language Models (LLMs) with Automic Automation, you can create Workflows that understand natural language, reason through complex failures, and autonomously interact with external systems.

To achieve this, Automic Automation relies on several interconnected components. This topic outlines concepts that are crucial for understanding the Automic Automation AI ecosystem.

[A] [C] [D] [I] [L] [M] [N] [S] [T] [U]

A

AI Agents

A Workflow pattern in which an AI Job uses MCP Tools both to make a decision and to execute a follow-up action. For example: "Analyze this error, and if it is a disk space issue, run the cleanup script."
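The decision-plus-action pattern can be sketched as plain control flow. The following Python snippet is purely illustrative (it is not Automic code); `classify_error` stands in for the AI Job's reasoning step, and the returned actions are hypothetical.

```python
# Illustrative sketch of the AI Agent pattern: a model classifies a
# failure, and the workflow executes a follow-up action based on the
# verdict. classify_error() is a placeholder for an LLM call.

def classify_error(error_text: str) -> str:
    """Placeholder for an AI Job that labels the root cause."""
    if "no space left on device" in error_text.lower():
        return "DISK_SPACE"
    return "UNKNOWN"

def handle_failure(error_text: str) -> str:
    verdict = classify_error(error_text)
    if verdict == "DISK_SPACE":
        return "run cleanup script"     # follow-up action chosen by the agent
    return "escalate to operator"       # fallback when the cause is unclear

print(handle_failure("write failed: No space left on device"))
```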

AI Connection object

A specialized Connection object that centralizes the endpoint for the LLM and the MCP Server. It defines who does the thinking (the LLM) and what resources it has access to (the MCP Servers). This separation of concerns allows you to swap models centrally (for example, upgrading to GPT-5) without editing individual jobs.

For more information, see Defining AI Connection Objects.

AI Job

An executable object that combines a System Prompt (identity), a User Prompt (task), and Tools (capabilities) to perform cognitive tasks within a Workflow.

For more information, see Defining AI Jobs.

Ask AI

A script function used within AE scripts to send a prompt to an LLM at runtime. It returns the AI's response as a string variable for further processing in your script logic.

Automation.AI

This Automic Automation component is a dedicated backend intermediary between Automic Automation and external AI providers (such as Google or OpenAI). It is an AI-platform-agnostic service responsible for crucial functions like conversation handling, prompt construction, and data context management. This design ensures that all conversations and data exchanges remain within a secure internal environment. The Automation.AI component is also responsible for fetching the list of available LLMs and MCP Servers and presenting them to the AWI.

Automation Assistant

A unified Gen AI hub within the AWI. It acts as a digital peer that can answer how-to questions, analyze RunID failures, explain complex error codes, and much more.

For more information, see Understanding the Automation Assistant.

C

Code Assistant

An integrated utility within the Script Editor that helps developers write, debug, and explain AE scripts (or Python/PowerShell) using natural language prompts.

For more information, see Generating and Analyzing Code Using AI.

Context and Memory

The ability of the system to maintain the state of a "conversation" across multiple Job steps using a Conversation ID.

Conversation ID

To enable "memory" in AI interactions, every AI interaction in Automic Automation is assigned a unique Conversation ID. This ID guarantees that the AI can remember previous exchanges, allowing for the creation of multi-step chains where subsequent AI Jobs can build upon the analysis or information from previous Jobs by passing the Conversation ID via variables.

D

Documentation Assistant

A specialized component of the Intelligent Assistant trained specifically on Automic Automation's product documentation. It allows users to ask questions and receive summarized instructions with links to the official manual, reducing the time spent searching through technical documentation.

I

Intelligent Assistant

The Intelligent Assistant serves as the central Generative AI hub within Automic Automation. It is accessible through the AWI and provides users with access to two key assistants: the Automation Assistant and the Documentation Assistant. See Understanding the Automation Assistant and Understanding the Documentation Assistant.

L

Large Language Models (LLMs)

The LLM is the "brain" of the operation. It is the external AI engine responsible for processing natural language, understanding intent, and generating responses. Its role is to reason, summarize and plan. Examples of LLMs are Google Gemini, OpenAI GPT-4o, and so forth.

M

Memory (AI Context)

Memory in the context of Automic Automation's AI refers to the system's ability to retain and utilize information from prior interactions within an ongoing dialog or sequence of tasks. This capability is ensured through the use of a unique Conversation ID, which allows for coherent, multi-step AI processes where later stages can recall the context of earlier ones.

Model Context Protocol (MCP)

Standard LLMs are excellent at processing text, but they are isolated from your environment. They cannot "see" your files, "check" your Job status, or "fix" a problem on their own. MCP solves this by providing the AI with tools.

MCP is an open standard that enables LLMs to interact with external data, systems, and tools. In the context of Automic Automation, MCP is the bridge that turns an AI from a passive chatbot into an active automation agent.

  • Without MCP, you ask, "Why did the job fail?" The AI explains general reasons for failure based on its training data.

  • With MCP, you ask, "Why did the job fail?" The AI uses an MCP Tool to read the specific live error report from Automic Automation, analyzes it, and provides a precise answer based on your actual system data.
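The "with MCP" flow above can be sketched as: the model requests a tool call, the client executes it against live data, and the final answer is grounded in the tool result. The Python snippet below is a hedged illustration only; `get_report` and the report contents are invented stand-ins, not real Automic interfaces.

```python
# Illustrative "with MCP" flow: the answer is grounded in a live tool
# result instead of generic training data.

def get_report(run_id: int) -> str:
    """Stands in for an MCP Tool that fetches the live job report."""
    return f"RunID {run_id}: exit code 13, 'permission denied' on /data/out"

def answer_with_mcp(question: str, run_id: int) -> str:
    report = get_report(run_id)   # tool call returns real system data
    detail = report.split(": ", 1)[1]
    return f"Job failed because: {detail}"

print(answer_with_mcp("Why did the job fail?", 4711))
```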

N

Natural Language

Natural language refers to human languages like English, Spanish, German, Chinese and so forth, which are used for communication. Large Language Models (LLMs) are specifically designed to process, understand the intent of, and generate responses in natural language. This capability allows users to interact with Automic Automation's AI features using conversational interfaces.

S

System Prompt

As part of an AI Job object, the System Prompt defines the "persona" of the AI agent. It provides the initial context and overarching instructions that guide the AI's behavior and response style for a specific task.

T

Tools

An MCP Server is a container or gateway that hosts a specific set of capabilities, the Tools. A Tool is a specific function provided by an MCP Server that the AI can "call" to perform an action. For example, the Automic MCP Server might host Tools related to Automic Automation (listExecutions to find running jobs, executeObject to start an object, getReport to read a log file, and so forth), while a CSV Editor MCP Server hosts Tools for file manipulation. See Defining AI Jobs.
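Conceptually, an MCP Server is a registry of named, callable functions that the AI can dispatch to. The sketch below illustrates that idea in Python; the class, decorator, and tool bodies are assumptions for illustration, borrowing only the tool names mentioned above.

```python
# Conceptual sketch of an MCP Server as a registry of callable Tools.
from typing import Callable

class McpServer:
    def __init__(self) -> None:
        self.tools: dict = {}

    def tool(self, fn: Callable) -> Callable:
        self.tools[fn.__name__] = fn        # register under its name
        return fn

    def call(self, name: str, **kwargs) -> object:
        return self.tools[name](**kwargs)   # dispatch an AI "tool call"

automic = McpServer()

@automic.tool
def listExecutions() -> list:
    return [1001, 1002]                     # placeholder RunIDs

@automic.tool
def getReport(run_id: int) -> str:
    return f"report for {run_id}"           # placeholder report text

print(sorted(automic.tools))
print(automic.call("getReport", run_id=1001))
```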

U

User Prompt

Also part of an AI Job object, the User Prompt specifies the "task" or the direct query that the user wants the AI to perform or answer. It is the specific input or question given to the AI for its current operation.
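How the System Prompt and User Prompt fit together can be shown with the common chat-message shape used by many LLM APIs. This is an assumed, generic payload shape, not the Automic AI Job format; the prompt texts are invented examples.

```python
# Generic sketch: System Prompt (persona) and User Prompt (task)
# combined into one chat-style request.

def build_messages(system_prompt: str, user_prompt: str) -> list:
    return [
        {"role": "system", "content": system_prompt},  # identity / ground rules
        {"role": "user", "content": user_prompt},      # the concrete task
    ]

messages = build_messages(
    "You are an Automic operations analyst. Answer concisely.",
    "Summarize why RunID 4711 failed.",
)
print(messages[0]["role"], "->", messages[1]["role"])
```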