Creating HDFS Jobs
To run HDFS operations with the RA Hadoop Agent, you need to create HDFS Jobs.
To create an RA Hadoop Agent HDFS Job:
- Add a Job object of type Hadoop > HDFS and, on the Attributes page, select the Agent object of your RA Hadoop Agent solution in the Host field.
- Go to the Hadoop page.
- Select an RA Hadoop Agent Connection object from the Connection field.
- Select an HDFS operation from the Operation field. The available operations are:
- Upload file
- Download file
- Delete file/directory
- Create directory
- Set owner
- Set permissions
- Click Save to save the Job.
The operation you select determines which fields appear on the panel. The sub-topics that follow describe the fields for each operation.
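The HDFS operations listed above correspond closely to calls in the standard WebHDFS REST API. The sketch below shows that mapping for illustration only; the Agent performs these calls internally, and the host name, port, and paths here are assumptions, not values from your environment.

```python
# Illustrative mapping of the six HDFS operations to WebHDFS REST calls.
# The host, port, and example paths are assumptions; the RA Hadoop Agent
# builds and issues the real requests itself.
WEBHDFS_OPS = {
    "upload": ("PUT", "CREATE"),
    "download": ("GET", "OPEN"),
    "delete": ("DELETE", "DELETE"),
    "create_directory": ("PUT", "MKDIRS"),
    "set_owner": ("PUT", "SETOWNER"),
    "set_permissions": ("PUT", "SETPERMISSION"),
}

def webhdfs_url(host, port, path, operation, **params):
    """Return the HTTP method and URL for an HDFS operation."""
    method, op = WEBHDFS_OPS[operation]
    query = "&".join([f"op={op}"] + [f"{k}={v}" for k, v in sorted(params.items())])
    return method, f"http://{host}:{port}/webhdfs/v1{path}?{query}"

# Example: delete a directory recursively
method, url = webhdfs_url("namenode.example.com", 9870, "/tmp/reports",
                          "delete", recursive="true")
```

Running the example yields a `DELETE` request against `/webhdfs/v1/tmp/reports?op=DELETE&recursive=true`, which matches the Delete file/directory operation above.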
Workflow Variables
The following variables are available to the next Job when this Job is included in a Workflow.
| For the: | Use the Variable: |
| --- | --- |
| Standard output | &stdout# |
| Standard error output | &stderr# |
| Status directory | &statusdir# |
For example, an HDFS Download file Job could use these variables to download the standard output (stdout) or error output (stderr) of the previous Job.
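As a rough sketch of how a follow-up Job's field values might resolve, the snippet below substitutes Automic-style `&name#` placeholders into a path template. The variable names come from the table above, but this resolver is a hypothetical illustration, not Agent code, and the status-directory path is an assumption.

```python
import re

def resolve_variables(template, values):
    """Replace Automic-style &name# placeholders with concrete values.
    Hypothetical illustration only, not part of the RA Hadoop Agent."""
    return re.sub(r"&(\w+)#", lambda m: values[m.group(1)], template)

# E.g., a Download file Job pulling the previous Job's stdout file,
# assuming the status directory resolved to /user/automic/status/001:
source = resolve_variables("&statusdir#/stdout",
                           {"statusdir": "/user/automic/status/001"})
```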