Sqoop Import --password-file function not working properly in sqoop 1.4.4
I am using hadoop-1.2.1 and sqoop version is 1.4.4.
I am trying to run the following command:
sqoop import --connect jdbc:mysql://IP:3306/database_name \
  --table clients --target-dir /data/clients \
  --username root --password-file /sqoop.password -m 1
sqoop.password is a file stored on HDFS at /sqoop.password with 400 permissions.
It gives me the following error:
Access denied for user 'root'@'IP' (using password: YES)
Can anyone provide a solution for this? Thanks in advance.
A "\n" gets written into the file when you create the password with vi. Use the approach below instead to avoid the problem:
echo -n "Your_sqoop_password" > sqoop.password
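To confirm the file really has no trailing newline, you can inspect its raw bytes (the filename and password are just the examples from above):

```shell
# Write the password without a trailing newline
echo -n "Your_sqoop_password" > sqoop.password

# Show every byte of the file; a clean file ends with the last
# password character rather than \n
od -c sqoop.password

# The byte count should equal the password length exactly
wc -c < sqoop.password
```

If `od -c` shows a `\n` at the end, Sqoop will treat it as part of the password and authentication will fail.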
Not sure if you are still having this issue. The password file can be in any folder. Try the following syntax and it should work:
As per the sqoop documentation:

You should save the password in a file on the user's home directory with 400 permissions and specify the path to that file using the --password-file argument; this is the preferred method of entering credentials. Sqoop will then read the password from the file and pass it to the MapReduce cluster using secure means without exposing the password in the job configuration. The file containing the password can either be on the Local FS or HDFS.
If I run my sqoop job as the root user, then my password file will be under /user/root/ on HDFS:
sqoop import --connect jdbc:mysql://database.example.com/employees \
  --username venkatesh --password-file /user/root/database.password
For more details you can check the Sqoop documentation.
While creating the password file, use the echo -n option (-n suppresses the trailing newline that echo would otherwise append).
Suppose you have a password "myPassword" and you want to save it to a file sqoop.password. Follow the steps below:
Create the password file using the command
echo -n "myPassword" > sqoop.password
Upload the file to HDFS, as it needs to be present there:
hadoop fs -put sqoop.password /user/keepMyFilesHere
Run the sqoop command:
sqoop list-tables --connect jdbc:mysql://localhost/kpdatabase --username root --password-file /user/karanpreet.singh/sqoop.password
This will definitely work!
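The steps above can be sketched as one sequence (the paths, database, and password are the placeholder examples from this answer; the documentation also recommends 400 permissions, which on HDFS is set with hadoop fs -chmod):

```shell
# 1. Create the password file locally with no trailing newline
echo -n "myPassword" > sqoop.password

# 2. Copy it into HDFS
hadoop fs -put sqoop.password /user/karanpreet.singh/sqoop.password

# 3. Restrict it to owner read-only, per the documentation
hadoop fs -chmod 400 /user/karanpreet.singh/sqoop.password

# 4. Point Sqoop at the HDFS path
sqoop list-tables \
  --connect jdbc:mysql://localhost/kpdatabase \
  --username root \
  --password-file /user/karanpreet.singh/sqoop.password
```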
Check if there is any garbage character in your password file. I was getting the same issue and finally found that the file contained a \n character at the end, which sqoop treated as part of the password string. Create the password file as shown below and then use it:

echo -n "root_password" > password.txt

Put your password in place of root_password.
- Exactly. the password should be naked, no newline.
- Or use vim as per stackoverflow.com/questions/1050640/…, since part of the point of using a password file is to avoid the password appearing wherever command line usages appear in logs or
- Perfect, this fixed my issue.
- Thanks a lot. I had the same problem and got solved with this answer.
- Thanks, and it worked for me as well. I would add that if you want it to look in hdfs, for me I had to use the following:
- Hi Prasad, Didn't work on my end. It is giving the same error.
- I had to place the password file into HDFS to get it to work. sigh.
- Yes, as per the documentation that I have added above: "The file containing the password can either be on the Local FS or HDFS." In my case I have placed the password file in HDFS.
- The subtlety that the password file must be in hadoop is missing from other answers, so this answer is more complete!
- Using echo is not advised for shared environments since it makes the password visible to other users.
- On Linux, ls -l or wc will disclose the file size, and od -c or od -x (character or hex output, as you prefer) will show you the contents.