RE: I’m trying to ingest data from Hive to Snowflake, when data has backslash it is throwing an error
I am loading data from Hive into Snowflake, but the data contains backslashes (\), which is causing the Snowflake COPY command to fail.
ERROR: File '@~/tmp/TGT_base_ecomm_dimension_container_current_im_63647_6/TGT_base_ecomm_dimension_container_current_im_63647_6.dat.gz', line 338043, character 87
Row 338043, column "BASE_ECOMM_DIMENSION_CONTAINER_CURRENT_IM"["MANIFEST_TIME":5]
If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.
How do I overcome this error?
As part of the extraction from the Hive source, choose DOUBLE quotes as the text qualifier in the job-level or dataflow-level properties when the extract type is JDBC/HADOOPEXTRACT for the Hive source. The extracted data is then enclosed in double quotes (") during extraction, so embedded special characters such as backslashes are contained within quoted fields.
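On the Snowflake side, the COPY command also needs to know that fields are double-quoted and that backslashes in the data are literal characters rather than escape characters. A minimal sketch of such a COPY statement is below; the table name and stage path are placeholders taken from the error message, and you should adjust the file-format options to match your actual extract (FIELD_OPTIONALLY_ENCLOSED_BY and ESCAPE_UNENCLOSED_FIELD are standard Snowflake file-format options, and ESCAPE_UNENCLOSED_FIELD defaults to backslash, which is one common cause of this kind of failure):

```sql
-- Hypothetical table and stage path; substitute your own.
COPY INTO base_ecomm_dimension_container_current_im
FROM '@~/tmp/TGT_base_ecomm_dimension_container_current_im_63647_6/'
FILE_FORMAT = (
    TYPE = 'CSV'
    FIELD_OPTIONALLY_ENCLOSED_BY = '"'  -- matches the double-quote text qualifier used at extraction
    ESCAPE_UNENCLOSED_FIELD = NONE      -- default is '\'; NONE treats backslashes in the data as literal
)
ON_ERROR = 'ABORT_STATEMENT';           -- or 'SKIP_FILE' / 'CONTINUE', as the error message suggests
```

With ESCAPE_UNENCLOSED_FIELD left at its default, a lone backslash in an unquoted field is interpreted as an escape character, which can shift the parser off the column boundaries and produce errors like the one above.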