How to solve "Can't assign requested address: Service 'sparkDriver' failed after 16 retries" when running spark code?

I am learning Spark + Scala with IntelliJ and started with the small piece of code below:

import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    //Create a SparkContext to initialize Spark
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("Word Count")
    val sc = new SparkContext(conf)

    val numbersList = sc.parallelize(1.to(10000).toList)

    println(numbersList)
  }

}

When trying to run it, I get the exception below:

Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)

Process finished with exit code 1

Can anyone suggest what to do?

Seems like you're using an old version of Spark. In your case, try adding this line:

conf.set("spark.driver.bindAddress", "127.0.0.1")

If you use Spark 2.0+, the following should do the trick:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder()
  .appName("Word Count")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()

I think setMaster and setAppName return a new SparkConf object, and the line conf.setMaster("local") will have no effect on the conf variable. So you should try chaining the calls instead:

val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("Word Count")

It seems like the ports Spark is trying to bind to are already in use. Did this issue start happening after you had run Spark successfully a few times? If so, check whether those previously-run Spark processes are still alive and holding onto ports (a simple jps / ps -ef should tell you that). If yes, kill those processes and try again.
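
On a Unix-like machine that check could look something like this (a rough sketch; the grep pattern and PID are placeholders, match whatever your driver process is actually called):

# list running JVM processes with their main class; look for leftover Spark drivers
jps -l

# or search the full process list
ps -ef | grep -i spark

# kill a leftover process by the PID reported above (placeholder)
kill <pid>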

conf.set("spark.driver.bindAddress", "127.0.0.1")

Adding bindAddress worked for me.

Add SPARK_LOCAL_IP to the load-spark-env.sh file located in the spark/bin directory:

export SPARK_LOCAL_IP="127.0.0.1"
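
If you'd rather not edit a script that ships with Spark, the same export should work from conf/spark-env.sh, which load-spark-env.sh sources (assuming a standard Spark layout):

# in $SPARK_HOME/conf/spark-env.sh (create it from spark-env.sh.template if needed)
export SPARK_LOCAL_IP="127.0.0.1"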

Comments
  • try conf.setMaster("local[*]")
  • Notice: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port). The error says it has already tried 16 random free ports!