Preparations and Considerations for the AI Filter Assistant
The AI Filter Assistant uses generative AI (artificial intelligence) to make filtering easier and more natural for your users by allowing them to enter their filter criteria in natural language rather than going through prescribed steps. As an AAI administrator, you need to toggle the AI Filter Assistant on or off for your AAI installation. You also need to evaluate your organization's compliance considerations when implementing the AI Filter Assistant. This topic provides you with the steps for activating the AI Filter Assistant and the details you or your management needs to make an informed decision about that.
Enabling the AI Filter Assistant
The generative AI feature that powers the AI Filter Assistant is an optional system-wide feature that can be activated in an AAI instance. By default it is deactivated.
As an AAI administrator with rights to the AAI Configuration Tool, you can switch the AI Filter Assistant on or off through the system parameter featureToggle.enableGenAI by setting it to "true" or "false", respectively. By default it is "false" (disabled). You can find the parameter on the Params tab among the Hidden named parameters.
Changes to this toggle parameter go into effect immediately. With the next login or refresh of the browser tab or window, each user will either see the AI Filter Assistant in AAI or no longer see it, depending on the setting.
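The toggle behavior described above can be sketched in a few lines. This is an illustrative example only: the parameter name featureToggle.enableGenAI and its "true"/"false" string values come from this documentation, but the lookup function and the parameter dictionary are hypothetical, not part of the AAI product.

```python
# Illustrative sketch only. The parameter name and its string values
# mirror the documented system parameter; the lookup itself is a
# hypothetical stand-in for the AAI Configuration Tool.
def is_filter_assistant_enabled(system_params: dict) -> bool:
    """Return True when the AI Filter Assistant should be visible.

    The toggle is stored as the string "true" or "false" and
    defaults to "false" (disabled) when it has not been set.
    """
    value = system_params.get("featureToggle.enableGenAI", "false")
    return value.strip().lower() == "true"

# The assistant stays hidden until an administrator sets the
# parameter to "true" in the AAI Configuration Tool.
print(is_filter_assistant_enabled({}))                                     # False
print(is_filter_assistant_enabled({"featureToggle.enableGenAI": "true"}))  # True
```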
Which Data the AI Filter Assistant Accesses
When users open the AI Filter Assistant, they see the following notice about using the feature:
You are interacting with a generative AI feature. AI-generated output may contain errors and unexpected results. Do not include any personal data or confidential data in your prompts and carefully review output before use.
You might need to know what this means and which data the AI Filter Assistant accesses so you can assess whether its use complies with your organization's data privacy policies. The following data is sent to the Generative AI service that powers the AI Filter Assistant:
- The fact that someone is filtering for some data values
- Any lists of valid filter attribute values, such as SLA status values or business area values
- Run data values that AAI creates and stores in its AAI databases, such as the number of runs or number of alerts
- The hierarchy of the attribute values, such as the business area hierarchy
This is true whether the attribute values are defined by AAI, such as SLA statuses, or by your users, such as business area names.
The values that your users define and that are passed on to the Gen AI service include the names of any AAI objects that they create, such as:
- The names of business areas and their hierarchy
- The names of data insights
- The names of schedulers created in AAI
The names of jobs and jobstreams are not sent, because they are not list items in the standard filter in the way that scheduler names or SLA statuses are.
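The data-sharing rules above can be illustrated with a short sketch. This is a hypothetical example, not AAI code: the attribute names and the helper function are invented for illustration. The point it demonstrates is that only list-valued filter attributes (such as SLA statuses, business areas, and scheduler names) would be shared, while job and jobstream names would not.

```python
# Hypothetical illustration of the data-sharing rules described in this
# topic. All field names here are invented for the example.
FILTER_LIST_ATTRIBUTES = {"sla_status", "business_area", "scheduler_name"}

def build_llm_context(filter_metadata: dict) -> dict:
    """Keep only attributes that appear as lists in the standard filter."""
    return {
        key: values
        for key, values in filter_metadata.items()
        if key in FILTER_LIST_ATTRIBUTES
    }

metadata = {
    "sla_status": ["Met", "At Risk", "Missed"],
    "scheduler_name": ["Prod-Scheduler-01"],
    "job_name": ["nightly_etl"],  # job names are never sent
}
print(build_llm_context(metadata))  # job_name is filtered out
```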
LLM Partner Agreements
Gen AI relies on a Large Language Model (LLM) to understand the natural language criteria that users enter in the AI Filter Assistant. AAI cannot provide this kind of machine learning capability directly and has to partner with providers who do. Specifically, Broadcom has partnered with Google to use its LLM platform to interpret entries in the AI Filter Assistant and produce well-formed AAI filter queries.
Our customers' privacy concerns are always at the forefront of the products we provide. Therefore, as a key Google Enterprise customer, we engaged with Google to gain assurance that, while we use their LLM intelligence for our Gen AI, they do not add or store any of our users' data or our proprietary data in their data pool for further LLM training. This data stays solely with AAI and Broadcom and, as such, is treated with the same rigorous data security measures that we have always provided.
If you want to implement the AI Filter Assistant but our agreement with Google does not comply with your organization's privacy and security policies, contact our Support group to discuss alternatives, such as setting up your own Google Cloud Platform instance or LLM. For contact information, see Support.