RDS Oracle Ingestion Target

Amazon RDS for Oracle is a managed relational database service in the cloud. The RDS Oracle emitter allows you to write data to a supported RDS Oracle database.

Target Configuration

Fetch From Target/Upload Schema File

The data source records need to be emitted to an RDS Oracle target table.

If the Gathr application has access to a target table in the RDS Oracle database, choose the Fetch From Target option.

If the Gathr application does not have access to an RDS Oracle target table, choose the Upload Schema File option to map the RDS Oracle table columns to the source columns at design time and confirm the data type of each column. In such cases, you can run the application in a registered environment that has access to all the required resources. At run time, the application will run on the registered cluster of your choice, picking up the configuration values provided during application design.

When you select the Upload Schema File option, a Schema Results section is displayed at the bottom of the configuration page.


You can then download the sample schema, provide the RDS Oracle table column name against each mapping value, and verify the data type.

Once the file is updated, upload it to see a sample of how the records from a source column will be written into the corresponding mapped RDS Oracle column.
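
The exact layout of the sample schema file is defined by Gathr, so treat the following only as a rough sketch of the kind of mapping it captures; the column headers and file name below are assumptions, not the actual format.

```python
import csv

# Hypothetical mapping of source columns to RDS Oracle table columns and data
# types; the real sample schema file downloaded from Gathr may differ.
mapping = [
    {"Source Column": "cust_id",   "RDS Oracle - Table Column Name": "CUSTOMER_ID",   "Data Type": "NUMBER"},
    {"Source Column": "cust_name", "RDS Oracle - Table Column Name": "CUSTOMER_NAME", "Data Type": "VARCHAR2"},
]

with open("rds_oracle_schema_mapping.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=mapping[0].keys())
    writer.writeheader()
    writer.writerows(mapping)
```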

Connection Name

Connections are service identifiers. A connection name can be selected from the list if you have already created and saved connection details for RDS Oracle. Otherwise, create one as explained in the topic RDS Connection →

Use the Test Connection option to make sure that the connection with the RDS Oracle target is established successfully.

A success message states that the connection is available. If the test connection returns an error, edit the connection to resolve the issue before proceeding further.
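
Gathr performs this check for you. If you want to verify connectivity to the RDS Oracle instance independently, a minimal sketch using the python-oracledb package might look like the following; the host, service name, and credentials are placeholders.

```python
import oracledb

# Placeholder DSN for the RDS Oracle instance: host:port/service_name
DSN = "my-rds-instance.abc123.us-east-1.rds.amazonaws.com:1521/ORCL"

try:
    # Thin mode: no Oracle client installation required.
    with oracledb.connect(user="gathr_user", password="secret", dsn=DSN) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT 1 FROM dual")  # lightweight connectivity probe
            print("Connection is available:", cur.fetchone())
except oracledb.Error as exc:
    print("Test connection failed; edit the connection details:", exc)
```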

Table Name

Name of an existing table in the specified database.

Ignore Values

Enable this option to exclude specified values from the data.

Values to Ignore

Enter comma-separated values to exclude from the data. If a record contains any of these values in its columns, even partially, the matching column values will be ignored and set to null.
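
The exact matching behavior is implemented inside Gathr; as a rough illustration of the rule described above (substring matches set the column value to null), consider this sketch:

```python
# Comma-separated "Values to Ignore" as entered in the field (example values).
ignore_values = "N/A,UNKNOWN".split(",")

record = {"city": "UNKNOWN", "status": "active-N/A", "amount": "42"}

# Any column value that contains one of the ignore values, even partially,
# is set to null (None); other values pass through unchanged.
cleaned = {
    col: None if any(v in str(val) for v in ignore_values) else val
    for col, val in record.items()
}
print(cleaned)  # {'city': None, 'status': None, 'amount': '42'}
```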

Add Configuration

Additional properties can be added using this option as key-value pairs.
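
The set of properties that are honored depends on your Gathr version and the underlying Oracle JDBC driver, so the keys below are illustrative assumptions rather than a definitive list.

```python
# Hypothetical additional key-value configuration for the RDS Oracle emitter.
additional_config = {
    "oracle.jdbc.timezoneAsRegion": "false",  # common workaround for ORA-01882
    "defaultRowPrefetch": "100",              # rows prefetched per round trip
}
```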

More Configurations

Enable Batch

Enable this parameter to process messages in batches and improve write performance.

If the Enable Batch field is set to True, an additional field is displayed, as described below:

Batch Size

Batch Size determines how many rows are inserted per round trip, which helps performance with JDBC drivers. This option applies only to writing. The default value is 1000.
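
Gathr applies the batch size internally when writing; conceptually, it corresponds to the batchsize option of the Spark JDBC writer, as in this sketch where the URL, table, and credentials are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rds-oracle-batch-write").getOrCreate()
df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])

# batchsize controls how many rows are sent to Oracle per JDBC round trip.
(df.write
    .format("jdbc")
    .option("url", "jdbc:oracle:thin:@//my-rds-host:1521/ORCL")  # placeholder
    .option("driver", "oracle.jdbc.OracleDriver")
    .option("dbtable", "MY_SCHEMA.MY_TABLE")                     # placeholder
    .option("user", "gathr_user")
    .option("password", "secret")
    .option("batchsize", 1000)
    .mode("append")
    .save())
```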

If the Enable Batch field is set to False, proceed by updating the following fields.


Save Mode

Defines the strategy for handling pre-existing data in the target, with the following options:

  • Append: Adds new data to the existing target table, leaving current records untouched.

  • Overwrite: Completely replaces all current data in the target table with the new data set.

  • Upsert: Modifies existing records in the target table when matches are found and inserts new records if no matches exist (see the sketch after this list).

Note: A primary key is required in the target table to execute the upsert operation.

  • Update: Updates existing rows in the target table based on specified join keys. No new records will be inserted.
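
Gathr carries out the chosen save mode for you; as background, an upsert against Oracle is typically expressed as a MERGE statement keyed on the primary key. The sketch below is illustrative only, using python-oracledb with assumed table and column names.

```python
import oracledb

rows = [(1, "alice"), (2, "bob")]  # (id, name), where id is the primary key

# MERGE updates matching rows and inserts the rest, i.e. an upsert.
merge_sql = """
MERGE INTO my_table t
USING (SELECT :1 AS id, :2 AS name FROM dual) s
ON (t.id = s.id)
WHEN MATCHED THEN UPDATE SET t.name = s.name
WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name)
"""

with oracledb.connect(user="gathr_user", password="secret",
                      dsn="my-rds-host:1521/ORCL") as conn:
    with conn.cursor() as cur:
        cur.executemany(merge_sql, rows)
    conn.commit()
```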

When Save Mode is set to Update, additional configuration fields appear:

Update Type

Select a method to manage records between the current and incoming data:

  • Keep latest with Overwrite: Substitutes existing records with new data, adding a column to record the timestamp of the last modification.

  • Latest data with version: Retains the existing record and adds a new version of the latest modified record, incorporating columns to track start date, end date, and deletion status.

Note: The target table should not have a primary key when executing updates with the Latest data with version option; a primary key will cause the update to fail.

Note: Use incremental data ingestion from the source when using the Update save mode to prevent overwriting the entire dataset.

Join Columns

Specify the key columns used to align incoming source data with existing records in the target database.
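
Gathr builds the update condition from these columns internally; as a rough illustration of how join columns could drive an UPDATE for the Keep latest with Overwrite type, consider this sketch (table and column names are assumptions):

```python
# Join columns identify the existing record; value columns receive the new data.
join_columns = ["customer_id", "region"]
value_columns = ["status", "balance"]

set_clause = ", ".join(f"t.{c} = :{c}" for c in value_columns)
where_clause = " AND ".join(f"t.{c} = :{c}" for c in join_columns)

# Includes a last-modified timestamp, mirroring the extra column described above.
update_sql = (
    f"UPDATE my_table t SET {set_clause}, t.last_modified = SYSTIMESTAMP "
    f"WHERE {where_clause}"
)
print(update_sql)
```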
