Read the schema name dynamically and load to a table
I have a table (let's say BALANCE) that exists in several schemas of a DB2 database.
e.g. Database: COUNTRY
Schemas: Peru, Chile, China
Target table: BALANCE_ALL
I would like to create a single data object BALANCE and reuse it dynamically, passing the schema name in the data flow/job flow, and write the result to the target table BALANCE_ALL.
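For context, the per-schema source query the reusable object needs to issue is the same statement with only the schema qualifier changing. A minimal sketch in Python of that pattern (the schema and table names come from the example above; the column list is left as `*` since the real columns are not given):

```python
# Build the same SELECT for each schema; only the qualifier changes.
SCHEMAS = ["PERU", "CHILE", "CHINA"]

def balance_query(schema: str) -> str:
    # DB2 resolves unquoted identifiers case-insensitively, so upper-case is safe here.
    return f"SELECT * FROM {schema}.BALANCE"

for schema in SCHEMAS:
    print(balance_query(schema))
```

This is exactly the substitution a schema parameter performs: one object, one query template, N qualified table names.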
You can achieve the above scenario with any of the three options below.
Option 1: Enable project parameters in the datapoint and override the required details at execution time; then run the job flow as an instance.
Option 2: Perform an override in the source instance query and pass the DB/schema name as a data flow parameter. At the job flow level, include a looper job to loop through the different DB/schemas; on every iteration the looper returns the next DB/schema value, which is then used at the data flow level.
Note that this option will not work if the host/port differs between schemas. In both of the options above you only need to create a single job flow and data flow.
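The looper mechanics are tool-specific, but the logic it implements is easy to see outside the tool. Below is a hedged Python sketch using the standard-library sqlite3 module as a stand-in for DB2, with ATTACH simulating the three schemas; none of the names beyond those in the example (and the illustrative `account`/`amount` columns) are the tool's actual API:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE BALANCE_ALL (schema_name TEXT, account TEXT, amount REAL)")

schemas = ["peru", "chile", "china"]
for schema in schemas:
    # Simulate one schema per attached database, each holding its own BALANCE table.
    conn.execute(f"ATTACH DATABASE ':memory:' AS {schema}")
    conn.execute(f"CREATE TABLE {schema}.BALANCE (account TEXT, amount REAL)")
    conn.execute(f"INSERT INTO {schema}.BALANCE VALUES ('ACC-1', 100.0)")

# The "looper": one iteration per schema, the same data flow each time,
# with only the schema qualifier changing between iterations.
for schema in schemas:
    conn.execute(
        f"INSERT INTO BALANCE_ALL "
        f"SELECT '{schema}', account, amount FROM {schema}.BALANCE"
    )

rows = conn.execute(
    "SELECT schema_name, account, amount FROM BALANCE_ALL ORDER BY schema_name"
).fetchall()
print(rows)
```

Each pass reuses the identical INSERT ... SELECT, which is why a single data flow plus a schema parameter is sufficient.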
Option 3: Create command tasks in the scheduler and remove the dependency between the command tasks so the different loads run independently. For this you need to parameterize all the datapoint-related properties at the project level.
Please go through the support link to learn more about project parameters.