MLlib dependency error

user3789843 · Dec 12, 2014 · Viewed 8.8k times

I'm trying to build a very simple Scala standalone app using MLlib, but I get the following error when trying to build the program:

Object Mllib is not a member of package org.apache.spark

Then I realized that I have to add MLlib as a dependency, as follows:

version := "1"
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
"org.apache.spark"  %% "spark-core"              % "1.1.0",
"org.apache.spark"  %% "spark-mllib"             % "1.1.0"
)

But here I got an error that says:

unresolved dependency spark-core_2.10.4;1.1.1 : not found

So I had to modify it to:

"org.apache.spark" % "spark-core_2.10" % "1.1.1",

But there is still an error that says:

unresolved dependency spark-mllib;1.1.1 : not found

Does anyone know how to add the MLlib dependency in the .sbt file?

Answer

Holden · Dec 13, 2014

As @lmm pointed out, you can instead include the libraries as:

libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.10" % "1.1.0", "org.apache.spark" % "spark-mllib_2.10" % "1.1.0" )

In sbt, %% appends the Scala version to the artifact name, and you are building with Scala 2.10.4, whereas the Spark artifacts are published against 2.10 in general.
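For reference, dropping these dependency lines into the asker's build file gives a minimal build.sbt along these lines (the name setting is illustrative, not from the question):

name := "simple-mllib-app"

version := "1"

scalaVersion := "2.10.4"

// plain % with an explicit _2.10 suffix pins the Scala binary version in the artifact name
libraryDependencies ++= Seq(
"org.apache.spark"  % "spark-core_2.10"   % "1.1.0",
"org.apache.spark"  % "spark-mllib_2.10"  % "1.1.0"
)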

It should be noted that if you are going to make an assembly jar to deploy your application, you may wish to mark spark-core as provided, e.g.:

libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.10" % "1.1.0" % "provided", "org.apache.spark" % "spark-mllib_2.10" % "1.1.0" )

The spark-core package will already be on the classpath on the executors anyway.
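If you go the assembly route, a minimal sketch with the sbt-assembly plugin would look like the following; the plugin version is an example from that era, not something stated in the answer, so check the plugin's documentation for the release matching your sbt.

// project/plugins.sbt (plugin version is an example)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")

// build.sbt: spark-core marked provided so it is left out of the assembled jar
libraryDependencies ++= Seq(
"org.apache.spark"  % "spark-core_2.10"   % "1.1.0" % "provided",
"org.apache.spark"  % "spark-mllib_2.10"  % "1.1.0"
)

Running sbt assembly then produces a single jar that bundles spark-mllib but not spark-core, which the cluster supplies when you launch the application with spark-submit.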