java - Do I need to install Apache Spark and/or Scala to run a jUnit?
I am new to the Apache Spark framework, and I am trying to set up my first jUnit test as follows:
```java
package com.sample.ccspark;

import com.holdenkarau.spark.testing.SharedJavaSparkContext;
import org.apache.spark.api.java.JavaRDD;
import org.junit.Test;

import java.util.List;

import static java.util.Arrays.asList;
import static org.hamcrest.core.Is.is;
import static org.junit.Assert.assertThat;

public class SimpleTest extends SharedJavaSparkContext {

    @Test
    public void initializationWorks() {
        List<Integer> list = asList(1, 2, 3, 4);
        JavaRDD<Integer> rdd = jsc().parallelize(list);
        assertThat(rdd.count(), is(list.size()));
    }
}
```
with the following dependencies in pom.xml:
```xml
<dependency>
    <groupId>com.holdenkarau</groupId>
    <artifactId>spark-testing-base_2.11</artifactId>
    <version>2.2.0_0.7.2</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.2.0</version>
</dependency>
```
Everything compiles happily, but when running I get the following exception:
```
Exception in thread "dispatcher-event-loop-6" java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcVL$sp
```
I do not have Spark or Scala installed locally yet, and I was under the impression that the testing framework should take care of all dependencies. Is there something missing here?
In the artifact names
```xml
<artifactId>spark-testing-base_2.11</artifactId>
<artifactId>spark-core_2.10</artifactId>
```
the last number is the version of Scala. You should select just one Scala version for both artifacts, since mixing `_2.10` and `_2.11` dependencies on the classpath is what triggers the `NoClassDefFoundError`.
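For example, a pom.xml aligning both dependencies on Scala 2.11 might look like the sketch below (this assumes you want the Scala 2.11 builds; the version numbers are taken from the question, not verified against the latest releases):

```xml
<!-- Both artifact suffixes now agree on Scala 2.11 -->
<dependency>
    <groupId>com.holdenkarau</groupId>
    <artifactId>spark-testing-base_2.11</artifactId>
    <version>2.2.0_0.7.2</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
```

With consistent suffixes, Maven pulls in a single Scala runtime transitively, so no local Spark or Scala installation should be needed to run the test.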