Diyotta job is failing with error ERROR:Load SQL execution failed with exception :: Connection reset.

I am trying to load data from a Hive table to MSSQL. I have around a million records in the Hive table, but every time I run the dataflow the load fails after about 250k records. The error I see in the log is:

INFO   07/17/2019 13:25:51     TGT_LOAD_COMM_POST     204770              [Load Delimiter : ]
                [Load Jdbc Sql : INSERT INTO INDIRECT_DATA.dbo.LOAD_COMM_POST ( SOURCE,TRANS_SCID,EMP_CID,EMP_NAME,TRANS_NO,MDN,IMEI_ESN,RMS_TRANS_YRMO,RMS_TRANS_DT,ACTIVITY_CATEGORY,SUBCATEGORY,CLIL,QTY,UNITVALUE,WEIGHTED,WALK_IN_IND,SKU,BILLING_TYPE,SUBSCRIBERNBR,BAN,COMPENSATIONDATE,VALUE,MRC,EVENTTYPEID,ROLL_UP_SCID,SERVICE_PLAN_DESC,COMM_DEVICE_CATEGORY,VCGROUP1,SOC,LOAD_TS,STORE_NBR )  VALUES ('$VCOL1','$VCOL2','$VCOL3','$VCOL4','$VCOL5','$VCOL6','$VCOL7','$VCOL8'
                                ,convert(date,replace('$DTCOL9','-','-'),120),'$VCOL10','$VCOL11','$VCOL12',$COL13,$COL14,$COL15,'$VCOL16','$VCOL17','$VCOL18','$VCOL19','$VCOL20'
                                ,convert(date,replace('$DTCOL21','-','-'),120),$COL22,$COL23,'$VCOL24','$VCOL25','$VCOL26','$VCOL27','$VCOL28','$VCOL29'
                                ,convert(datetime,replace(replace('$DTCOL30','-','-'),':',':'),121),'$VCOL31')]

ERROR   07/17/2019 13:27:11     TGT_LOAD_COMM_POST     204770          ERROR:Batch execution terminated with exception :: null

ERROR   07/17/2019 13:27:11     TGT_LOAD_COMM_POST     204770          ERROR:Load SQL execution failed with exception :: Connection reset

INFO   07/17/2019 13:27:11     TGT_LOAD_COMM_POST     204770              All opened resource(s) closed successfully

ERROR   07/17/2019 13:27:11     TGT_LOAD_COMM_POST     204770          Unit Execution completed :[Status:  F ],[ReturnCd:  EFA00017 ],[ReturnMsg:  LoadSQL execution failed:Connection reset ]

Is there a setting change that needs to be made to overcome this?
Beginner Asked on July 24, 2019 in Diyotta Monitor.
1 Answer(s)

As per the snapshot attached, the load type is JDBC for the MSSQL target. Follow the steps below to resolve the issue:

 

1. Try running the job after reducing the Batch Size property value to 50000 (50k) in the load properties.

Batch Size property

2. To reduce data-load latency, try running the job with BCP as the load type.
3. BCP (Bulk Copy Program), a native MSSQL utility, performs bulk inserts rather than row-by-row inserts.
4. Make sure the BCP utility is installed on the Agent before running the job.
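The batched-load idea behind step 1 can be sketched as follows. This is a minimal illustration using a generic Python DB-API connection, not Diyotta's internal loader; the names `conn`, `insert_sql`, and `rows` are placeholders you would supply. Committing after each smaller batch means a mid-load connection reset costs at most one batch instead of the whole load.

```python
def chunked(rows, batch_size):
    """Yield successive batches of at most batch_size rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

def load_in_batches(conn, insert_sql, rows, batch_size=50_000):
    """Insert rows in smaller batches, committing after each one.

    A smaller batch keeps each round trip to the MSSQL target short,
    which reduces the chance of the connection being reset mid-batch.
    """
    cursor = conn.cursor()
    total = 0
    for batch in chunked(rows, batch_size):
        cursor.executemany(insert_sql, batch)
        conn.commit()
        total += len(batch)
    return total
```

With a batch size of 50k, the roughly one million source rows would be sent as about 20 batches instead of one long-running statement stream.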

Expert Answered on July 26, 2019.
