Date type column is getting decremented by 2 days on Sqoop import from SQL Server to Hive

I have one table in the database where insertdate is of 'Date' type. However, when I import the table into Hive using Sqoop, the values in the Hive table come out decremented by two days.

Example

RDBMS --> insertdate='2013-04-01'

Hive --> insertdate='2013-03-30'

I used the below command to import the data:

sqoop import --connect 'jdbc:sqlserver://localhost;username=XXXXX;password=XXXXXXX;database=XXXXXXXXXX' \
 --table tbl_name \
 --warehouse-dir /user/hive/warehouse --m 1 \
 --hive-import --hive-database db_name --hive-overwrite \
 --null-string '\\N' --null-non-string '\\N' --hive-drop-import-delims

The problem is not with Sqoop; it is with the JDBC driver for SQL Server.

Check the related question: dates consistently two days off.

I think you are putting sqljdbc4.jar in /sqoop/lib.

Use sqljdbc41.jar or newer to fix this (sqljdbc41.jar is compiled for Java 7).
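
A minimal sketch of the jar swap, assuming the Sqoop lib directory is /sqoop/lib as above (adjust the path for your install) and that sqljdbc41.jar has already been downloaded from Microsoft:

# remove the old driver so it cannot shadow the new one on the classpath
rm /sqoop/lib/sqljdbc4.jar
# copy in the newer driver (the download location here is hypothetical)
cp /tmp/sqljdbc41.jar /sqoop/lib/

Then re-run the same sqoop import command.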

It's working after adding sqljdbc41.jar.

Solved it in MySQL by using the parameter -D mapreduce.map.java.opts=" -Duser.timezone=GMT", which makes the map-task JVMs run in GMT so date values are not shifted by the local timezone:

sqoop import -D mapreduce.map.java.opts=" -Duser.timezone=GMT" \
 --connect jdbc:mysql://hostname/location \
 --username name -P \
 --table VW_Location_History_For_Hadoop \
 --target-dir /apps/hive/warehouse/test.db/location_h \
 --hive-table test.location_hierarchy \
 --fields-terminated-by "," \
 --hive-import \
 --delete-target-dir \
 --m 1
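
As a quick sanity check after the import, compare a few dates between the source and Hive. This is a hedged sketch: it assumes insertdate exists in this table, that HiveServer2 listens on localhost:10000, and it reuses the hostname/name/location values from the connect string above:

# spot check: both queries should now return the same dates
mysql -h hostname -u name -p -e "SELECT insertdate FROM VW_Location_History_For_Hadoop LIMIT 5;" location
beeline -u jdbc:hive2://localhost:10000 -e "SELECT insertdate FROM test.location_hierarchy LIMIT 5;"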
