Sqoop create hive table ERROR - Encountered IOException running create table job

I am running Sqoop on a CentOS 7 machine that has Hadoop/MapReduce and Hive already installed. I read in a tutorial that when importing data from an RDBMS (SQL Server in my case) to HDFS, I need to run the following command:

sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true  --connect 'jdbc:sqlserver://hostname;database=databasename' --username admin --password admin123  --table tableA
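
By default (assuming no --target-dir override) the imported files land under the importing user's HDFS home directory, so the result can be sanity-checked with something like the following (paths are illustrative):

# adjust the HDFS user and table directory to your setup
hdfs dfs -ls /user/$(whoami)/tableA
hdfs dfs -cat /user/$(whoami)/tableA/part-m-00000 | head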

Everything works perfectly with this step. The next step is creating a Hive table that has the same structure as the RDBMS table (SQL Server in my case), using a Sqoop command:

sqoop create-hive-table --connect 'jdbc:sqlserver://hostname;database=databasename' --username admin --password admin123 --table tableA --hive-table hivetablename --fields-terminated-by ','
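
For context, create-hive-table only generates a CREATE TABLE statement from the source table's metadata and runs it through the Hive CLI; it moves no data. Roughly, it ends up executing something like the following (the column list here is hypothetical, for illustration only):

# illustrative only; the real columns are derived from the SQL Server table
hive -e "CREATE TABLE hivetablename (id INT, name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','"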

However, whenever I run the create-hive-table command above, I get the following error:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
18/04/01 19:37:52 ERROR ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
18/04/01 19:37:52 INFO ql.Driver: Completed executing command(queryId=hadoop_20180401193745_1f3cf07d-ca16-40dd-8f8d-1e426ecd5860); Time taken: 0.212 seconds
18/04/01 19:37:52 INFO conf.HiveConf: Using the default value passed in for log id: 0813b5c9-f374-4920-b8c6-b8541449a6eb
18/04/01 19:37:52 INFO session.SessionState: Resetting thread name to main
18/04/01 19:37:52 INFO conf.HiveConf: Using the default value passed in for log id: 0813b5c9-f374-4920-b8c6-b8541449a6eb
18/04/01 19:37:52 INFO session.SessionState: Deleted directory: /tmp/hive/hadoop/0813b5c9-f374-4920-b8c6-b8541449a6eb on fs with scheme hdfs
18/04/01 19:37:52 INFO session.SessionState: Deleted directory: /tmp/hive/java/hadoop/0813b5c9-f374-4920-b8c6-b8541449a6eb on fs with scheme file
18/04/01 19:37:52 ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Hive CliDriver exited with status=1

I am not a Java expert, but I would like to know if you have any idea what causes this error.

I was facing the same issue, and downgrading my Hive to 1.2.2 made it work. That will solve the issue.

But I am not really sure whether that is acceptable if you want to use Sqoop with Hive 2 only.
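
To confirm which Hive version is actually being picked up before and after the downgrade, you can check with:

hive --version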


Instead of writing two different statements, you can put the whole thing in one statement, which fetches the data from SQL Server and then creates the Hive table too.

sqoop import -Dorg.apache.sqoop.splitter.allow_text_splitter=true --connect 'jdbc:sqlserver://hostname;database=databasename' --username admin --password admin123 --table tableA --hive-import --hive-overwrite --hive-table hivetablename --fields-terminated-by ',' --hive-drop-import-delims --null-string '\\N' --null-non-string '\\N'
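
A note on the extra flags: --hive-drop-import-delims strips \n, \r and \01 from string fields so they cannot break Hive's row and field delimiters, and --null-string / --null-non-string make SQL NULLs arrive as Hive's \N marker. If the import succeeds, a quick sanity check on the Hive side (using the table name from above) would be:

hive -e "SELECT COUNT(*) FROM hivetablename"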


For this, please check the jackson-core, jackson-databind and jackson-annotations jars. They should be the latest versions; this error usually comes from an older version. Place these jars inside the Hive lib and the Sqoop lib. Also check the libthrift jar: it should be the same version in both Hive and HBase, and copy that same jar into the Sqoop lib.
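
A minimal sketch of that check, assuming conventional tarball installs with HIVE_HOME and SQOOP_HOME set (the paths and the backup directory are illustrative):

# compare the Jackson jars shipped with each tool
ls $HIVE_HOME/lib/jackson-*.jar
ls $SQOOP_HOME/lib/jackson-*.jar

# if Sqoop's copies are older, set them aside and take Hive's versions
mkdir -p $SQOOP_HOME/lib/jackson-backup
mv $SQOOP_HOME/lib/jackson-*.jar $SQOOP_HOME/lib/jackson-backup/
cp $HIVE_HOME/lib/jackson-*.jar $SQOOP_HOME/lib/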


Comments
  • Please use the formatting tools to properly edit and format your question/answer. Code within sentences should be formatted as code, very important words as bold, less important ones as italic. Also use lists if necessary.
  • Take a look at HiveServer2 logs. Are you seeing any exceptions when executing the create-hive-table command?
  • I had the same issue and downgraded. Works WONDERFULLY now.
  • Hi friend! Despite running that command, I am getting the same error as before. The job does get through the map phase: 18/04/01 21:26:27 INFO mapreduce.Job: map 0% reduce 0% 18/04/01 21:26:37 INFO mapreduce.Job: map 25% reduce 0% 18/04/01 21:26:41 INFO mapreduce.Job: map 50% reduce 0% 18/04/01 21:26:42 INFO mapreduce.Job: map 100% reduce 0%. However, it keeps reporting those "fasterxml jackson databind" issues. Any further help will be appreciated.