Defining S3 Upload File Jobs
This job allows you to upload a file from your local machine into an S3 bucket.
If the file already exists in the destination that you have specified, it is overwritten. If the destination file path that you have defined does not exist in the bucket, the system creates the destination as defined and uploads the relevant file to that destination.
If the file name in the destination file path differs from the one in the source file path, the file is saved using the name that you have defined in the destination file path. The same applies to the file type. If the region and/or the bucket do not exist, the Upload job fails.
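Under the hood, this kind of single-file upload corresponds to a standard S3 PUT operation. The following is a minimal sketch in Python with boto3, not the job's actual implementation; the bucket name, key, and file path are placeholders:

import boto3

s3 = boto3.client("s3")

# S3 has no real folders: writing to the key "reports/2024/text.xml"
# implicitly creates the destination path, and an existing object under
# the same key is silently overwritten, matching the behavior above.
s3.upload_file(
    Filename=r"C:\temp\text.xml",   # source file on the local machine
    Bucket="my-example-bucket",     # placeholder; the bucket must already exist
    Key="reports/2024/text.xml",    # destination file path (may rename the file)
)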
S3 Upload File Job Parameters
In the Upload File Job section, you define the parameters required to run the job on the S3 system from Automic Automation.
Some fields allow you to open a picker dialog from which you can select the file and the bucket. By default, only 200 entries are displayed. If the relevant file or bucket is not among those 200, type its name in the Search field to narrow down the list.
When using regular expressions, make sure you consider the following issues:
-
The Destination File Path field changes to Destination Path. This means that you can no longer define a file pattern and must enter a destination folder instead. If you enter a file pattern in the Destination Path field, all the files in the folder are overwritten with that one file.
-
The Source File Path field changes to Source. Here too, you must define a source folder and not a path to a single file.
-
All the files are copied to the destination bucket using the same name as in the source bucket, as you cannot define a specific file name for each file.
-
If you use only a regex and do not define the Destination Path, the system searches all folders and subfolders, thus increasing the execution time.
-
The AE REST API handles the regex in the background. That means that executions of jobs with regular expressions might take longer. You can use the Query Params to refine the search and reduce the matches, thus significantly improving performance.
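As described in the Use Regex option below, an asterisk (*) stands for any number of characters and a question mark (?) for a single character, which corresponds to glob-style patterns. The following Python sketch, with invented object keys, illustrates that matching behavior; it is not the agent's actual implementation:

import fnmatch

# Invented object keys, for illustration only.
keys = ["MYFILES/report_01.xml", "MYFILES/report_02.xml", "MYFILES/readme.txt"]

# "*" matches any number of characters, "?" exactly one character.
matches = fnmatch.filter(keys, "MYFILES/report_0?.xml")
print(matches)  # ['MYFILES/report_01.xml', 'MYFILES/report_02.xml']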
-
Connection
Select the S3 Connection object containing the relevant information to connect to the Simple Storage Service system.
To search for a Connection object, start typing its name to narrow down the list to the objects that match your input.
-
Region
Defining the region in the job is optional and only relevant for AWS. If you choose not to define it, the job takes the URL defined in the Connection object.
However, if you decide to define the region in the job, make sure that the definition matches the one in the Connection object that you have selected for the job. If the two region definitions do not match, the job execution fails and an error message is logged in both the Agent (PLOG) and the Job (REP) reports (see Monitoring S3 Jobs).
-
Use Regex
This option is not selected by default. Select the checkbox if you want to use a regular expression to upload multiple files at a time.
-
If not selected, you must define the Source File Path and the Destination File Path as described below.
-
When selected, you must define the Source and the Destination Path as described below.
Note: When using regex, an asterisk (*) stands for any number of characters, while a question mark (?) stands for a single character. If the Use Regex checkbox is not selected, any special characters (?, *) are considered standard characters and part of the name string.
-
Source File Path
This option is available only if you have not selected the Use Regex option.
Define the source file name or the path where the source file is located, for example: C:\temp\text.xml.
-
Source
This option is available only if you have selected the Use Regex option.
Define where the relevant files are located in your bucket, for example: MYFILES/*.
-
Bucket Name
Define the name of the bucket to which the file must be uploaded. You can click the browse button to the right of the field to open a picker dialog where you can select the relevant name.
-
Destination File Path
This option is available only if you have not selected the Use Regex option.
Define the destination file name or the path to where the file should be uploaded in the bucket. You can define the path using either <file name> or <folder name>/<file name>.
-
Destination Path
This option is available only if you have selected the Use Regex option.
Define the path to the destination to which you want to upload the files. You can define the path using <folder name>/.
Warning! Make sure you define a folder and not a path to a single file. Otherwise, you overwrite the contents of the folder with a single file.
-
Query Param
Allows you to filter the query and therefore the query response.
Examples
When using a regex, the prefix parameter allows you to optimize the search and narrow down the results. For example, if your bucket contains the following files:
/opt/files/example_04_08.pdf
/opt/files/example_05_08.pdf
/opt/files/example_06_08.pdf
/opt/files/demo_07_08.pdf
/opt/files/demo_08_08.pdf
If you want to check for files starting with example and with a .pdf extension, you can specify .*.pdf in the File Name field and enable the Use Regex option.
You can further specify the query using the following query parameter:
prefix=/opt/files/example
You can also send multiple query parameters using the format <param1>=<value1>&<param2>=<value2>.
For example, you can add the list-type=2 parameter to use version 2 of the AWS API operation:
prefix=/opt/files/example&list-type=2
There is no restriction on the parameters that you can use in a query, as all URI request parameters for the Upload File Job are supported. For more information, refer to the official AWS S3 and GCS documentation.
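The prefix narrowing shown above mirrors the Prefix parameter of the S3 ListObjectsV2 operation. The following boto3 sketch, with a placeholder bucket name, illustrates how a prefix can reduce the candidate set before a regex is applied; the actual job delegates this to the AE REST API:

import re
import boto3

s3 = boto3.client("s3")

# Placeholder bucket; the keys correspond to the example files above.
resp = s3.list_objects_v2(
    Bucket="my-example-bucket",
    Prefix="/opt/files/example",  # equivalent of prefix=/opt/files/example
)

pattern = re.compile(r".*\.pdf")  # the .*.pdf expression from the example
matches = [obj["Key"] for obj in resp.get("Contents", []) if pattern.match(obj["Key"])]
# Only the three example_*.pdf files remain; the demo_*.pdf files are
# already filtered out by the prefix before the regex is evaluated.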
The Pre-Process page allows you to define settings for S3 Jobs using script statements. These statements are processed before the job is executed, see Setting S3 Job Properties Through Scripts.
S3 Failed Operation Parameters
The Failed Operation Parameters section allows you to define whether failed operations should be retried. You can use the following parameters to do so:
-
Retry
Select the checkbox if you want to retry failed operations. Once selected, you can define the following:
-
Retry Count: Define the number of retry attempts.
-
Retry Delay: Define the time in seconds between attempts.
-
Check for Overwrites
Allows you to check whether the destination bucket already contains the files that you want to put in it and, if so, to choose either to overwrite the existing files on the destination or to skip the operation and fail the job.
Note: When you select this option, the system runs an additional job in the background to check the destination bucket, which might affect the duration of the job execution.
Select the checkbox if you want to check whether the files already exist on the destination (see the sketch after this list). If this option is not selected, the system does not check the target bucket before carrying out the operation.
Overwrite Action: Select the checkbox if you want to define an overwrite action:
-
Overwrite existing files: Select this action to overwrite the existing files on the destination.
-
Fail job and skip operation: Select this action to skip the operation and fail the job instead of overwriting files on the destination.
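At the API level, such an existence check can be pictured as a HEAD request on the destination object. The following boto3 sketch uses placeholder names and illustrates the concept only; it is not the background job itself:

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def destination_exists(bucket: str, key: str) -> bool:
    # HEAD the destination object; a 404 error means it does not exist yet.
    try:
        s3.head_object(Bucket=bucket, Key=key)
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "404":
            return False
        raise  # other errors (for example 403) are real failures

if destination_exists("my-example-bucket", "reports/text.xml"):
    # Depending on the selected Overwrite Action, either overwrite the
    # file or skip the operation and fail the job.
    pass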
The following reasons for failure do not allow any recovery. Therefore, they are not considered when retrying failed operations (see the sketch after this list):
-
400 Bad Request
-
401 Unauthorized
-
403 Forbidden (access denied)
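Conceptually, the retry behavior combines Retry Count and Retry Delay with the list of non-recoverable statuses above. The following Python sketch uses an invented operation() placeholder and is not the agent's actual retry logic:

import time

NON_RECOVERABLE = {400, 401, 403}  # statuses that allow no recovery

def run_with_retries(operation, retry_count: int, retry_delay: int) -> int:
    """operation() is a placeholder that returns an HTTP status code."""
    status = None
    for attempt in range(retry_count + 1):
        status = operation()
        if status < 400:
            return status                 # success
        if status in NON_RECOVERABLE:
            break                         # retrying cannot help
        if attempt < retry_count:
            time.sleep(retry_delay)       # Retry Delay, in seconds
    raise RuntimeError(f"operation failed with status {status}")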
AWS S3 Server-Side Encryption Parameters
Amazon S3 encrypts your objects at their destination as it writes them in the respective AWS S3 data center and decrypts them when you access them. You can set a default encryption configuration for your buckets. However, you can also override the default bucket encryption and define a different one per object to be stored in an AWS S3 bucket.
You can only apply one type of server-side encryption to an object at a time.
The server-side encryption options available for AWS S3 Upload File jobs are the following (a sketch illustrating each option follows this list):
Specify Encryption Key: Allows you to define whether you want to use server-side encryption.
Select this checkbox if you want to use it. When selected, the Override Default Bucket Encryption checkbox is displayed.
Override Default Bucket Encryption: Allows you to define whether you want to use the bucket's default encryption configuration.
To use the bucket's default encryption, leave the checkbox unselected.
Select the checkbox if you want to override it and select a different encryption type for the file that you are uploading. When selected, the available Encryption Type options are displayed.
Encryption Type: Select the file encryption type that you want to use to override the bucket's default encryption configuration.
-
SSE-S3: Select this option to use server-side encryption with AWS S3 managed keys.
-
SSE-KMS: Select this option to use server-side encryption with AWS Key Management Service keys.
Encryption ARN: Make sure you enter the ARN relevant for the encryption. You can also click the browse button to the right of the field to open a picker dialog where you can select the relevant ARN from the list.
-
DSSE-KMS: Select this option to use dual-layer server-side encryption with AWS Key Management Service keys.
Encryption ARN: Make sure you enter the ARN relevant for the encryption. You can also click the browse button to the right of the field to open a picker dialog where you can select the relevant ARN from the list.
Note: The keys for the three encryption types mentioned above are maintained on the AWS side: by Amazon S3 for SSE-S3 and by the AWS Key Management Service for SSE-KMS and DSSE-KMS.
-
SSE-C: Select this option to use a custom encryption key.
Note: This option is supported only through the AWS S3 REST API, not through the user interface. You need to maintain the keys used for this encryption type yourself, as they are not maintained in AWS S3.
Make sure you define the following parameters:
-
Customer Algorithm: AES256 is the only supported algorithm.
-
Customer Key: Enter the encryption key that you want to use to execute the job.
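At the REST level, these options correspond to the x-amz-server-side-encryption* request headers. The following boto3 sketch shows one upload per encryption type; the bucket, key, and KMS key ARN are placeholders:

import boto3

s3 = boto3.client("s3")
bucket, key = "my-example-bucket", "reports/text.xml"  # placeholders
kms_arn = "arn:aws:kms:eu-west-1:123456789012:key/example"  # placeholder ARN

# SSE-S3: keys managed by Amazon S3.
s3.put_object(Bucket=bucket, Key=key, Body=b"...",
              ServerSideEncryption="AES256")

# SSE-KMS: keys managed by the AWS Key Management Service.
s3.put_object(Bucket=bucket, Key=key, Body=b"...",
              ServerSideEncryption="aws:kms", SSEKMSKeyId=kms_arn)

# DSSE-KMS: dual-layer server-side encryption with KMS keys.
s3.put_object(Bucket=bucket, Key=key, Body=b"...",
              ServerSideEncryption="aws:kms:dsse", SSEKMSKeyId=kms_arn)

# SSE-C: you supply and maintain the key yourself; AES256 is the only
# supported algorithm. boto3 derives the required key-MD5 header itself.
s3.put_object(Bucket=bucket, Key=key, Body=b"...",
              SSECustomerAlgorithm="AES256",
              SSECustomerKey=b"0" * 32)  # 256-bit key, illustration only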