Google Cloud Composer (GC Composer) - Run DAG Jobs
Automic Automation Run DAG Jobs start and monitor Airflow DAGs on Google Cloud Composer environments. The Run DAG Job that you create in Automic Automation represents the Airflow DAG on Google Cloud Composer, not its individual tasks. When you execute the Automic Automation Run DAG Job, you trigger a run of the Airflow DAG on the target Google Cloud Composer environment.
Defining Automic Automation GC Composer Jobs
An Automic Automation Run DAG Job definition is made up of the following pages:
- Standard pages that are always available, no matter what type of object you are defining
- Additional pages that are always available for executable objects
In addition, an Automic Automation Run DAG Job consists of the following specific pages that we explain in this topic:
- RUN DAG
- Rapid Automation
RUN DAG
On the Run DAG page you enter the parameters that identify the Airflow DAG that the GC Composer Job will control.
- Location

  Region for the Google Cloud Composer environment. Example: us-west1
- Environment

  Name of the Google Cloud Composer environment.
- DAG ID

  Here you select the Airflow DAG in the configured location and environment that this Job will start and monitor.
  - Click the button to open a dialog with the list of all available Airflow DAGs on the Google Cloud Composer environment.
  - To search for a DAG, start typing its name to limit the list to the DAGs that match your input.
  - Once you have found the one you need, click Choose.

  Note: This list is populated only after you have entered the location and name of the Google Cloud Composer environment.
- Transfer DAG task logs

  Specify whether the task logs should be added to the Job report:
  - Never: This is the default. The report contains only a summary of the tasks and their statuses.
  - Always: The Job report contains the logs of every task in the DAG run. Selecting this option may result in very large reports.
  - On error only: The Job report contains the logs of every failed task.
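The three options amount to a simple filter over the DAG's tasks. A minimal sketch of that logic, for illustration only (the function, setting values, and state names are assumptions, not part of the product):

```python
def logs_to_transfer(setting, tasks):
    """Return the task IDs whose logs would go into the Job report.

    setting: one of "never", "always", "on_error_only" (illustrative names).
    tasks:   mapping of task ID to its final state, e.g. "success" or "failed".
    """
    if setting == "never":
        # Default: the report keeps only the summary of tasks and statuses.
        return []
    if setting == "always":
        # Every task log is transferred; reports can become very large.
        return list(tasks)
    if setting == "on_error_only":
        # Only logs of failed tasks are transferred.
        return [task for task, state in tasks.items() if state == "failed"]
    raise ValueError(f"unknown setting: {setting}")
```

For example, with one successful and one failed task, "on_error_only" selects only the failed task's log.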
Rapid Automation
On this page, you specify where to store the Job report, when to generate it, and whether it includes Agent log information.
Job Report
When you configure a Job, you define where to store the Job report and when to generate it in the Job Report section on the platform-specific page.
Where to Store the Report
You have two options. You can select one or both simultaneously:
- Store to: Database means that, when the Job has been executed, the process log available on the target system (on the Agent) is stored in the database.

  When a Job has been executed on an Agent, the corresponding report is stored on the Agent computer. After the Automation Engine has written this data to the database, the report is automatically deleted from the Agent computer. If it cannot be deleted due to an error, the deletion process is not repeated and an error message is displayed.
- Store to: File means that the process log is stored as a file on the target system (on the Agent).
When to Generate the Report
You have two options:
- Generate: Always means that the process log of the operating system is always written.
- Generate: On error only means that the process log is kept only when an error occurs, for example, when the Job is canceled or aborted.
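The two options reduce to a small decision rule. A sketch for illustration only (the function and value names are assumptions, not the product's API):

```python
def keep_report(generate, job_failed):
    """Decide whether the process log is written to the Job report.

    generate:   "always" or "on_error_only" (illustrative values).
    job_failed: True when the Job ended in error, e.g. canceled or aborted.
    """
    if generate == "always":
        return True
    if generate == "on_error_only":
        return job_failed
    raise ValueError(f"unknown option: {generate}")
```

With "on_error_only", the log is kept for a canceled or aborted Job and discarded for a successful one.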
Optional Reports
Select this option to include the Agent log in the Job report. This option is recommended for troubleshooting purposes.
Executing Run DAG Jobs
When you execute the Automic Automation Run DAG Job, the Automic Automation Run DAG task is visible in the list of Tasks in the Process Monitoring perspective. The Airflow DAG reports back the following information to Automic Automation:
- DAG start and end dates
- Duration of the run
- Status
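For illustration only, the information reported back could be modeled like this (the class and field names are assumptions, not the product's schema):

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class DagRunReport:
    """Information an Airflow DAG run reports back, as described above."""
    start_date: datetime
    end_date: datetime
    status: str  # e.g. "success" or "failed"

    @property
    def duration(self):
        """Duration of the run, derived from the start and end dates."""
        return self.end_date - self.start_date
```

A run that starts at 08:00 and ends at 08:05 has a duration of five minutes.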
The remote status of the DAG run on the target system is also reported back to Automic Automation. You can check it as follows:
1. Go to the list of tasks in the Process Monitoring perspective.
2. Find the Automic Automation Run DAG task. For more information, see Filtering Tasks.
3. Select the task and click the Details button to open a pane with information about the task execution.
Available Actions on Run DAG Tasks
You can execute Run DAG Jobs and cancel the corresponding tasks. Canceling means that the Automic Automation Run DAG task is canceled, not the Airflow DAG on the Google Cloud Composer environment.
Run ID
When you execute an Automic Automation Run DAG Job, a DAG Run ID is generated in Airflow. Automic Automation uses this DAG Run ID to build the corresponding Automic Automation runID that is displayed in the Process Monitoring perspective. The Automic Automation runID is built as follows:
<Automic system name>-<Automic job run id>
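The format above simply joins the Automic system name and the job run ID with a hyphen. A trivial sketch (the function name and sample values are illustrative):

```python
def build_run_id(system_name, job_run_id):
    """Build an ID in the documented format:
    <Automic system name>-<Automic job run id>
    """
    return f"{system_name}-{job_run_id}"
```

For example, system name "AUTOMIC" and job run ID "1234567" yield "AUTOMIC-1234567".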
See also:
- For information about Google Cloud Composer-specific topics, refer to the official Cloud Composer documentation.