java.lang.NoSuchMethodError: scala.Predef$.refArrayOps


I have the following class:

import scala.util.{Success, Failure, Try}


class MyClass {

  def openFile(fileName: String): Try[String] = {
    Failure( new Exception("some message"))
  }

  def main(args: Array[String]): Unit = {
    openFile(args.head)
  }

}

Which has the following unit test:

class MyClassTest extends org.scalatest.FunSuite {

  test("pass inexistent file name") {
    val myClass = new MyClass()
    assert(myClass.openFile("./noFile").failed.get.getMessage == "Invalid file name")
  }

}

When I run sbt test I get the following error:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.scalatest.tools.FriendlyParamsTranslator$.translateArguments(FriendlyParamsTranslator.scala:174)
        at org.scalatest.tools.Framework.runner(Framework.scala:918)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:533)
        at sbt.Defaults$$anonfun$createTestRunners$1.apply(Defaults.scala:527)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.immutable.Map$Map1.foreach(Map.scala:109)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.AbstractTraversable.map(Traversable.scala:105)
        at sbt.Defaults$.createTestRunners(Defaults.scala:527)
        at sbt.Defaults$.allTestGroupsTask(Defaults.scala:543)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at sbt.Defaults$$anonfun$testTasks$4.apply(Defaults.scala:410)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:35)
        at scala.Function8$$anonfun$tupled$1.apply(Function8.scala:34)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
[error] (test:executeTests) java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

Build definitions:

version := "1.0"

scalaVersion := "2.12.0"

// https://mvnrepository.com/artifact/org.scalatest/scalatest_2.11
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.0"

I can't figure out what causes this. My class and unit test seem simple enough. Any ideas?

scalatest_2.11 is the ScalaTest build that is compatible only with Scala 2.11.x. Write libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test" (note the %%) instead, so that sbt picks the artifact matching your Scala version automatically, and switch to Scala 2.11.8 until scalatest_2.12 is released (it should be very soon). See http://www.scala-sbt.org/0.13/docs/Cross-Build.html for more.
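Concretely, the build definition from the question would become something like this (a sketch; Scala 2.11.8 is pinned only until a 2.12 ScalaTest artifact is available):

```scala
version := "1.0"

// Must match the binary version of every "_2.xx" dependency below.
scalaVersion := "2.11.8"

// %% appends the Scala binary version ("_2.11") to the artifact name,
// so sbt resolves scalatest_2.11 here, and scalatest_2.12 automatically
// once scalaVersion is bumped to a 2.12.x release.
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
```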


I had an SDK with a different Scala version in Global Libraries (IntelliJ IDEA). File -> Project Structure -> Global Libraries -> remove the SDK -> rebuild. That fixed the exception for me.


I used IntelliJ and simply imported the project again: close the open project and re-import it as a Maven or sbt project. Note: I selected "Import Maven projects automatically". The error disappeared.


In my experience, if you still get errors after matching the ScalaTest and Scala versions in build.sbt, you have to check the actual Scala version running on your machine. You can see it by running $ scala and reading the banner:

Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121). Type in expressions for evaluation. Or try :help.

You need to match that Scala version (e.g. 2.12.1 here) with the one in build.sbt.
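A complementary check is to print the version from inside the program itself, since a NoSuchMethodError depends on the scala-library jar on the runtime classpath rather than on whatever scala binary is installed globally. scala.util.Properties is part of the standard library:

```scala
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Reports the version of the scala-library jar on the classpath,
    // e.g. "2.12.0" -- this is the version that must match your
    // "_2.xx" dependency suffixes.
    println(scala.util.Properties.versionNumberString)
  }
}
```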

There are a few jars that might be the cause of the issue: scala-library-2.11.5.jar suggests you are using Scala 2.11, which is consistent with kafka_2.11-0.8.2.1.jar.

When you use Spark, Hadoop, Scala, and Java together, incompatibilities arise, so you have to pick versions that are compatible with one another. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12, which work together for me.
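Expressed as an sbt build fragment, that kind of version pinning might look like the sketch below (the versions are just the ones mentioned above, not a recommendation, and the spark-sql line is an illustrative extra dependency):

```scala
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // "provided" because the Spark runtime ships these jars itself;
  // the %% suffix keeps them on the same Scala binary version (2.11).
  "org.apache.spark" %% "spark-core" % "2.4.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.4.1" % "provided"
)
```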


It's been a while since I've seen a problem related to mixing Scala versions, but I just ran into one again. I have a weird setup on one computer: what I think happened is that I compiled two libraries with Scala 2.10.x while their build.sbt files indicated the libraries were intended for Scala 2.9.1.



Comments
  • Can you share your build definition as well?
  • I confirmed your class methods work as expected in a standard scala repl. Must be an issue with the sbt build def.
  • ScalaTest is now available for Scala 2.12.
  • Saved my day. Thank you!
  • Awesome! Thanks!
  • Another thing that helped in addition to this: re-importing the Maven module (if you're using Maven). Also check Project Structure -> Problems, which may indicate a reference to an invalid or outdated Scala SDK library. This happened to me after extended troubleshooting and trying different Scala versions.
  • Or check your Scala version; it should match the one in your pom.xml or sbt settings.
  • Important to note: starting with Spark 2.4.2, the default distribution is compiled with Scala 2.12; before that, 2.11 is the default. So if you hit this error and you are using 2.11 dependencies in your project, make sure your Spark installation is also built with 2.11.
  • What is the difference between libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1" and libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"?
  • How can this be handled in maven project?