Property | Description |
---|---|
Connection | Name of the source connection. You can select an existing connection, create a new connection, or define parameter values for the source connection property. If you want to overwrite the source connection properties at run time, select the Allow parameter to be overridden at run time option. |
Source Type | Type of the Open Table source object. You can select a single source object or a parameter as the source type. |
Parameter | A parameter file where you define values that you want to update without the need to edit the task. Select an existing parameter for the source object or click New Parameter to define a new parameter for the source object. The Parameter property appears only if you select parameter as the source type. If you want to overwrite the parameter at run time, select the Allow parameter to be overridden at run time option. When the task runs, the Secure Agent uses the parameters from the file that you specify in the advanced session properties. |
Object | Source object for the mapping. |
Property | Description |
---|---|
Filter | Filter value in a read operation. Click Configure to add conditions that filter records and reduce the number of rows that the Secure Agent reads from the source. |
Sort | Not applicable |
Property | Description |
---|---|
Iceberg Spark Properties | The properties that you want to configure for the Iceberg tables. Enter each property in the following format: <parameter name>=<parameter value>. If you enter more than one property, enter each property on a new line. When you use AWS Glue Catalog with Amazon S3 and the source and target are in different regions, you must specify the Amazon S3 bucket ARN property in the following format: s3.access-points.<bucket name>=<S3 bucket ARN>. For more information about the Amazon S3 bucket ARN, see the Create the ARN for your AWS S3 bucket Knowledge Base article. When you use Hive metastore with Amazon S3 and the source and target are in different regions, you must specify the bucket region property in the following format: BucketRegion=<Amazon-S3-source-bucket-region-name>. When you use REST Catalog with Amazon S3 and the source and target are in different regions, you must specify the bucket region property in the following format: restcatalog.iceberg.s3.client.region=<Amazon-S3-source-bucket-region-name> |
Delta Spark Properties | The properties that you want to configure for the Delta Lake tables. Enter each property in the following format: <parameter name>=<parameter value>. If you enter more than one property, enter each property on a new line. When you use AWS Glue Catalog with Amazon S3 and the source and target are in different regions, you must specify the bucket region property in the following format: BucketRegion=<bucket-region-name> |
Pre-SQL | Pre-SQL queries to run before reading data from Apache Iceberg Open Table formats. Ensure that the SQL queries use valid Spark SQL syntax. You can enter multiple queries separated by semicolons. You must prefix the table name with the <OpenTableCatalog> string and the database name, for example: <OpenTableCatalog>.databasename.tablename. The database name and table name identifiers in the queries must be in lowercase. If a pre-SQL query fails, the mapping also fails. |
Post-SQL | Post-SQL queries to run after reading data from Apache Iceberg Open Table formats. Ensure that the SQL queries use valid Spark SQL syntax. You can enter multiple queries separated by semicolons. You must prefix the table name with the <OpenTableCatalog> string and the database name, for example: <OpenTableCatalog>.databasename.tablename. The database name and table name identifiers in the queries must be in lowercase. If a post-SQL query fails, the mapping also fails. If the mapping fails before the read completes, the post-SQL queries do not run. |
Tracing Level | Sets the amount of detail that appears in the log file. You can choose terse, normal, verbose initialization, or verbose data. Default is normal. |
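As an illustration of the property format described above, the Iceberg Spark Properties for a cross-region AWS Glue Catalog setup might look like the following. The bucket name, AWS account ID, access point name, and region are placeholders, not values from this guide:

```
s3.access-points.my-source-bucket=arn:aws:s3:us-west-2:123456789012:accesspoint/my-access-point
```

With Hive metastore, the equivalent cross-region entry would instead be `BucketRegion=us-west-2`; with REST Catalog, `restcatalog.iceberg.s3.client.region=us-west-2`. Additional properties, if any, each go on their own line in the same `<parameter name>=<parameter value>` format.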
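As a sketch of the Pre-SQL and Post-SQL conventions above, the entries might look like the following. The database name `salesdb` and table name `orders_stage` are hypothetical; note the literal `<OpenTableCatalog>` prefix, the lowercase identifiers, and the semicolon separating the two queries:

```sql
-- Pre-SQL: remove stale staging rows, then refresh the table metadata
DELETE FROM <OpenTableCatalog>.salesdb.orders_stage WHERE load_flag = 'stale';
REFRESH TABLE <OpenTableCatalog>.salesdb.orders_stage
```

Both queries must be valid Spark SQL; if either fails, the mapping fails.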