Building AI Agents in Automic Automation

An AI Agent is a software entity designed to perceive its environment, make decisions, and execute actions to achieve specific goals by leveraging Large Language Models (LLMs) for intelligent processing.

In Automic Automation, this concept is realized by integrating LLMs and Model Context Protocol (MCP) servers directly into your Workflows. This is achieved using AI Jobs, which are advanced, object-oriented interfaces that embed artificial intelligence into automated processes.

How it Works

AI Jobs link Workflows to the Automation.AI component, which serves as a dedicated backend intermediary between Automic Automation and one or more LLMs. Automation.AI is an AI-platform-agnostic service that connects to various LLM backends over REST. It is responsible for conversation handling, prompt construction, and data context management. MCP Servers, which can be configured within Automation.AI, act as 'toolboxes' providing specific capabilities (tools) that AI Jobs can use to perform actions, such as reading files or executing objects.

By utilizing AI Jobs within Workflows, you can automate complex decision-making, summarize intricate data sets, and trigger intelligent actions in external systems.

Roles and Responsibilities

These capabilities involve distinct tasks for administrators and for developers. Administrators establish the foundational setup. Developers and object designers then build on that groundwork to create and use AI Jobs within Workflows: defining their prompts and conversation settings and integrating them into automation processes. This division of responsibilities ensures that generative AI features are implemented securely and efficiently, and are tailored to specific automation needs.

Administrator Responsibilities

As an administrator, you are responsible for setting up the foundation for AI integration.

  1. Install and configure Automation.AI

    For on-premises and AAKE environments, you must install and configure this component and ensure network connectivity between it, Automic Automation, and your LLM backends.

    Note: Automic SaaS environments come with this preconfigured.

  2. Configure LLMs and MCP Servers

    Define available models (e.g., Gemini, GPT-4o) and register MCP Servers (for example, google-mcp-toolbox, csv-editor-mcp, or the automic-mcp-server) within the Automation.AI component.

  3. Define AI Connection Objects

    Create AI Connection objects, which serve as a bridge between Automic Automation and the Automation.AI component. These objects link AI Jobs to specific LLMs and MCP Servers.

    1. Select a pre-configured model (LLM) and assign the necessary toolsets (MCP Servers) on the AI Connection page.

    2. Create different AI Connection objects to control which tools and models are available to different teams.

    Note: AI Connection objects do not store authentication credentials or API keys; these are managed securely within the Automation.AI component.
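To make the administrative pieces concrete, registering models and MCP Servers in Automation.AI might conceptually look like the fragment below. This is an illustrative sketch only: the actual configuration format, keys, and file layout are defined by the Automation.AI component, and credentials are managed there rather than shown in any such fragment.

```yaml
# Hypothetical sketch -- not the actual Automation.AI configuration format.
llms:
  - name: gemini
  - name: gpt-4o
mcp_servers:
  - name: google-mcp-toolbox
  - name: csv-editor-mcp
  - name: automic-mcp-server  # exposes Automic actions such as executing objects
```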

Developer/Object Designer Responsibilities

As a developer and object designer, you apply the administrator's setup to build and deploy AI-driven logic.

  1. Define AI Jobs

    AI Jobs provide a robust, object-oriented approach to integrating AI into processes. They act as an interface between your Workflow and the Automation.AI component. These objects include standard Job pages plus specialized AI and AI Prompt pages.

  2. Connect to the AI

    On the AI page, select an AI Connection object and choose the specific tools (from the MCP toolbox) required for the task.

  3. Configure the Prompt

    Use the AI Prompt page to define the logic.

    1. The User Prompt is the specific query or instruction the Job should execute. You can use Automic Automation variables here to inject dynamic data.

    2. The System Prompt defines the persona or context for the AI.

  4. Configure the conversation settings

    Configure whether the Job starts a new session or continues an existing one. Use the Conversation ID variable to maintain context across multiple sequential Jobs in a Workflow.

  5. Use AI Jobs in Workflows

    Incorporate the AI Job into your Workflows and use the model's output stored in variables to drive downstream automation logic.

See also:

Automic Automation's Generative AI Capabilities