Sqoop import error


I am trying to import a MySQL table using sqoop import. Below is the error I am getting.


[root@sandbox ~]# sqoop import \
   --connect "jdbc:mysql://localhost:3306/retail_db" \
   --username=root \
   --password=hadoop \
   --table departments \
   --as-avrodatafile \
   --target-dir=/user/root/departments

ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: The connection property 'zeroDateTimeBehavior' acceptable values are: 'CONVERT_TO_NULL', 'EXCEPTION' or 'ROUND'. 
The value 'convertToNull' is not acceptable.

If anyone knows the reason or a solution for this, please help me out.

Thanks, Rishi

Can you try the following command:



This is how I tried it, and it worked on Windows:

sqoop import \
  --connect "jdbc:mysql://localhost/employees?zeroDateTimeBehavior=CONVERT_TO_NULL" \
  --table tablename \
  --username username \
  --password password \
  -m 1 \
  --target-dir /user/sqoop/tablename \
  --outdir java_files

The value CONVERT_TO_NULL in zeroDateTimeBehavior=CONVERT_TO_NULL comes straight from the error message you got at run time. It works.
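Note that once the connect string contains ? (and possibly &), it should be quoted so the shell does not interpret those characters. As a minimal sketch, a helper like the one below (hypothetical, not part of Sqoop) appends the property to any JDBC URL, using & when the URL already has a query string and ? otherwise:

```shell
# Hypothetical helper (not part of Sqoop): append zeroDateTimeBehavior
# to a JDBC URL, choosing '&' vs '?' depending on whether the URL
# already carries a query string. Default value is CONVERT_TO_NULL.
append_zdtb() {
  url=$1
  value=${2:-CONVERT_TO_NULL}
  case "$url" in
    *\?*) printf '%s&zeroDateTimeBehavior=%s\n' "$url" "$value" ;;
    *)    printf '%s?zeroDateTimeBehavior=%s\n' "$url" "$value" ;;
  esac
}

append_zdtb "jdbc:mysql://localhost:3306/retail_db"
# prints jdbc:mysql://localhost:3306/retail_db?zeroDateTimeBehavior=CONVERT_TO_NULL
```

The resulting URL can then be passed, quoted, to --connect.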


Works fine with "zeroDateTimeBehavior=CONVERT_TO_NULL"


I ran into the same issue after our devops team upgraded MySQL to version 8; after that, all the Sqoop jobs started failing. The command below worked for me:

sqoop import -Dmapreduce.job.queuename=queue \
  --connect "jdbc:mysql://hostname:3306/db_name?zeroDateTimeBehavior=round" \
  -m 1 \
  --driver com.mysql.jdbc.Driver \
  --username user \
  --password pwd \
  --table dim_store \
  --fields-terminated-by "\t" \
  --hive-import \
  --hive-overwrite \
  --hive-table hivedb.hivetable


  • I think the issue is related to the handling of null values. Try setting the params below: --null-string '\\N' \ --null-non-string '\\N' \
  • This is how I tried it. Please correct me in case of any syntax error. However, this code did not run either. [root@sandbox ~]# sqoop import \ > --connect "jdbc:mysql://localhost:3306/retail_db?zeroDateTimeBehavior=convertToNull" \ > --username=root \ > --password=hadoop \ > --table departments \ > --as-avrodatafile \ > --target-dir=/user/root/departments
  • You might want to try the following: [root@sandbox ~]# sqoop import \ > --connect "jdbc:mysql://localhost:3306/retail_db?zeroDateTimeBehavior=round" \ > --username root \ > --password hadoop \ > --table departments \ > --as-avrodatafile \ > --target-dir /user/root/departments Hope this helps.
  • Hi Peter, the solution you mentioned here is accurate. For zeroDateTimeBehavior, the three acceptable values are CONVERT_TO_NULL, EXCEPTION, and ROUND. It was stated clearly in the log, which I should have checked. Thanks a lot for the help.
  • @RishiDeshmukh no worries pal.