MySQL to HDFS with Sqoop - Connection Refused: java.net.ConnectException

I am trying to import data from a MySQL database on localhost into HDFS using Sqoop:

sqoop import --connect jdbc:mysql://localhost/sqoop --username root --password 123@ajith --table mysql_sqoop --m 1

Whenever I run it, it fails with a connection refused error. Everything is installed on my local machine.

Here is the error output:

Warning: /usr/lib/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/lib/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/04 14:44:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/04 14:44:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/05/04 14:44:50 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/05/04 14:44:50 INFO tool.CodeGenTool: Beginning code generation
16/05/04 14:44:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_sqoop` AS t LIMIT 1
16/05/04 14:44:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `mysql_sqoop` AS t LIMIT 1
16/05/04 14:44:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hduser_/compile/21bf2271e2878039d8e7c32486f8b7b7/mysql_sqoop.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/05/04 14:44:52 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hduser_/compile/21bf2271e2878039d8e7c32486f8b7b7/mysql_sqoop.jar
16/05/04 14:44:52 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/05/04 14:44:52 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/05/04 14:44:52 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/05/04 14:44:52 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/05/04 14:44:52 INFO mapreduce.ImportJobBase: Beginning import of mysql_sqoop
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/05/04 14:44:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/04 14:44:52 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
16/05/04 14:44:53 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
16/05/04 14:44:53 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
16/05/04 14:44:53 ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
	at org.apache.hadoop.ipc.Client.call(Client.java:1479)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
	at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
	at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
	at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
	at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
	... 40 more

Can you please try this?

sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password 123@ajith --table mysql_sqoop --m 1

I think you need to add the MySQL port (3306) to the JDBC URL explicitly.
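Before rerunning the import, it may help to check what is actually listening. This is my own sketch, not from the thread; the port list is an assumption based on the logs above (MySQL on 3306, the HDFS NameNode on 9000 from the error message, the YARN ResourceManager on 8032 from the RMProxy line):

```python
# Probe a TCP port to see whether anything is listening there
# before blaming Sqoop itself.
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # "Connection refused" means nothing is listening on that port.
    for name, port in [("MySQL", 3306), ("HDFS NameNode", 9000), ("YARN RM", 8032)]:
        state = "open" if port_open("localhost", port) else "CLOSED (connection refused)"
        print(f"{name} on localhost:{port} -> {state}")
```

If MySQL's port is open but 9000 is closed, the problem is on the Hadoop side, not in the JDBC URL.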

Please restart Hadoop and try again:

./stop-dfs.sh

./stop-yarn.sh

./start-dfs.sh

./start-yarn.sh

Check that all the daemons are running using the jps command; a connection refused error can occur when a daemon service has not been started.

To start all services, run start-all.sh (note: it is deprecated, but it will still start everything); if you want to start services separately, use start-dfs.sh and start-yarn.sh instead.
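The jps check can be scripted. This is my own sketch, not from the thread; the daemon list assumes a pseudo-distributed single-node setup like the asker's, and the sample output is hypothetical:

```python
# Given the text that `jps` prints, report which Hadoop daemons are missing.
# Run `jps` yourself and compare its output against this required set.
REQUIRED = {"NameNode", "DataNode", "SecondaryNameNode",
            "ResourceManager", "NodeManager"}

def missing_daemons(jps_output):
    """Return the set of required daemon names absent from `jps` output."""
    # Each jps line is "<pid> <ClassName>"; take the class name.
    running = {line.split()[-1] for line in jps_output.splitlines() if line.strip()}
    return REQUIRED - running

sample = """\
4821 Jps
3310 DataNode
3544 SecondaryNameNode
"""
# NameNode is absent from this sample, which would match the
# "Call to localhost:9000 failed" error in the question.
print(missing_daemons(sample))
```

If NameNode is missing from your own jps output, start-dfs.sh (and the NameNode logs) is where to look next.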

Comments
  • I tried that, but the same error happens again: ERROR tool.ImportTool: Encountered IOException running import job: java.net.ConnectException: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: wiki.apache.org/hadoop/ConnectionRefused
  • Can you run this successfully: hadoop fs -ls hdfs://localhost:9000/ ? It looks like a connection problem to the HDFS NameNode. If not, use the --hadoop-home parameter to specify an explicit address.
  • When I ran hadoop fs -ls hdfs://localhost:9000/, this is the error coming back -> 16/05/05 10:23:00 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable ls: Call From ajith-HP-ENVY-17-Notebook-PC/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: wiki.apache.org/hadoop/ConnectionRefused
  • What is the correct address for accessing your HDFS? You should use that address. Can you list the files in your HDFS, e.g. hadoop fs -ls xxxx?
  • hadoop fs -ls / Found 2 items drwxr-xr-x - hduser_ supergroup 0 2016-05-03 15:44 /system drwxr-xr-x - hduser_ supergroup 0 2016-05-03 15:37 /user
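One possible reading of the last two comments (plain hadoop fs -ls / works, but hdfs://localhost:9000/ is refused) is that fs.defaultFS in core-site.xml does not actually point at localhost:9000, while the Sqoop job is still resolving that address. A typical single-node core-site.xml entry looks like the following; these are generic example values, not the asker's actual file, so compare the port against the one named in the error:

```xml
<!-- core-site.xml (example values; check against your own file) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

After changing core-site.xml, restart HDFS so the NameNode binds to the configured address.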