-
As of now, in the Monitor window we can only check project-level monitor logs. We need to have …
- 4 views
- 0 answers
- 0 votes
-
We have observed that one of our production jobs extracted data from the source but was unable to populate the data in Hive. …
- 98 views
- 1 answer
- 0 votes
-
I have an Oracle source and a Hadoop target, but my jobflows are failing when trying to load the data into the …
- 93 views
- 1 answer
- 0 votes
-
One of the Hadoop jobflows is failing with a vertex issue error during execution. This jobflow is having …
- 125 views
- 1 answer
- 0 votes
-
I have a dataflow in Snowflake native mode, which includes transformations. After the dataflow is …
- 214 views
- 1 answer
- 0 votes
-
We are getting the error “Datetime field overflow” in Diyotta 4.x for most of the jobflows. Consider the following example to …
- 184 views
- 1 answer
- 0 votes
-
In the TPT schema definition section, some attribute sizes in Diyotta 3.7: DEFINE JOB INSERT_TGT_TDNW101_RISE_PRICE_PLAN_REF_STG ( DEFINE SCHEMA TGT_TDNW101_RISE_PRICE_PLAN_REF_STG_SCHEMA ( PRICE_PLAN_CD …
- 269 views
- 1 answer
- 0 votes
-
I am loading data from Hive to Snowflake, but my data has a backslash (\) present in it. Due to this, the …
- 181 views
- 1 answer
- 0 votes
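
A likely culprit for the backslash question above is that Snowflake's `COPY INTO` treats `\` as the default escape character for unenclosed fields, so literal backslashes get swallowed or corrupt the parse. A minimal sketch (assuming a CSV staging step; the function name and data are illustrative, not Diyotta's actual loader) of doubling backslashes before staging:

```python
import csv
import io


def escape_backslashes(rows):
    """Double literal backslashes in string fields so a loader that
    treats backslash as an escape character reads them back intact."""
    return [
        [v.replace("\\", "\\\\") if isinstance(v, str) else v for v in row]
        for row in rows
    ]


# Illustrative rows containing a Windows-style path with backslashes.
rows = [["id1", "C:\\temp\\file.txt"], ["id2", "plain value"]]
buf = io.StringIO()
csv.writer(buf).writerows(escape_backslashes(rows))
print(buf.getvalue())
```

Alternatively, the file format on the Snowflake side can be defined with an explicit `ESCAPE_UNENCLOSED_FIELD = NONE` so backslashes pass through untouched.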
-
How do I clear the control files in the /app/control folder on the agent and controller, which are created …
- 210 views
- 1 answer
- 0 votes
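
For the control-file cleanup above, one common approach is an age-based sweep with `find`. A sketch, with a hypothetical 7-day retention (the question does not say how long Diyotta needs these files, so verify no running jobflow still references them before deleting):

```shell
#!/bin/sh
# Sketch: remove old control files from the agent/controller control folder.
# CONTROL_DIR and the 7-day retention are assumptions for illustration.
CONTROL_DIR="${CONTROL_DIR:-/app/control}"

# Dry run first: list control files older than 7 days.
find "$CONTROL_DIR" -type f -mtime +7 -print

# Once the list looks right, uncomment to actually delete:
# find "$CONTROL_DIR" -type f -mtime +7 -delete
```

Scheduling this via cron on both the agent and controller hosts keeps the folder from growing unbounded.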
-
Diyotta uses Bulk API V1 (Bulk V1) to load Salesforce data, which supports only comma-delimited files. The source file …
- 220 views
- 1 answer
- 0 votes
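
Since the question states that Bulk API V1 accepts only comma-delimited files, a pre-processing step can rewrite the source file with commas before the load. A sketch assuming a pipe-delimited source (the actual source delimiter is elided in the question); the `csv` module quotes any field that itself contains a comma:

```python
import csv
import io


def to_comma_delimited(text, src_delim="|"):
    """Rewrite delimited text as comma-delimited CSV.
    Fields containing commas are quoted so the output stays parseable."""
    out = io.StringIO()
    reader = csv.reader(io.StringIO(text), delimiter=src_delim)
    csv.writer(out).writerows(reader)
    return out.getvalue()


# Illustrative pipe-delimited input with an embedded comma in one field.
print(to_comma_delimited("Id|Name\n001|Acme, Inc.\n"))
```

The same function works for tab- or semicolon-delimited sources by passing a different `src_delim`.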