amazon-web-services - EMR Spark working in a Java main, but not in a Java function
I wonder why this works:
public final class JavaSparkPi {
    public static void main(String[] args) throws Exception {
        SparkConf sparkConf = new SparkConf().setMaster("yarn-cluster").setAppName("mySparkApp");
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);
        ArrayList<Integer> list = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            list.add(i);
        }
        JavaRDD<Integer> dataSet = jsc.parallelize(list)
                .map(s -> 2 * s)
                .map(s -> 5 * s);
        int weirdStuff = dataSet.reduce((a, b) -> (a + b) / 2);
        System.out.println("stuff " + weirdStuff);
        jsc.stop();
    }
}
and why this does not:
public final class JavaSparkPi {
    private void startWorkingOnMicroSpark() {
        SparkConf sparkConf = new SparkConf().setMaster("yarn-cluster").setAppName("mySparkApp");
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);
        ArrayList<Integer> list = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            list.add(i);
        }
        JavaRDD<Integer> dataSet = jsc.parallelize(list)
                .map(s -> 2 * s)
                .map(s -> 5 * s);
        int weirdStuff = dataSet.reduce((a, b) -> (a + b) / 2);
        System.out.println("weirdStuff " + weirdStuff);
        jsc.stop();
    }

    public static void main(String[] args) throws Exception {
        JavaSparkPi jsp = new JavaSparkPi();
        jsp.startWorkingOnMicroSpark();
    }
}
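As an aside, it is worth seeing what this pipeline actually computes, independent of Spark. Note that `(a + b) / 2` is neither associative nor commutative, so Spark's `reduce` gives no guarantee about its result across partitions. Below is a minimal local sketch of the same pipeline using plain Java streams (the class name `WeirdStuffLocal` is made up for illustration); the value shown is only what a sequential left-to-right fold produces.

```java
import java.util.stream.IntStream;

// Local (non-Spark) re-creation of the job's pipeline, to show what it computes.
public class WeirdStuffLocal {
    static int compute() {
        return IntStream.range(0, 10)
                .map(s -> 2 * s)   // 0, 2, 4, ..., 18
                .map(s -> 5 * s)   // 0, 10, 20, ..., 90
                // Same reduce as the Spark job. Not associative: a sequential
                // stream folds left-to-right, but Spark may combine per-partition
                // results in a different shape and produce a different value.
                .reduce((a, b) -> (a + b) / 2)
                .orElseThrow();
    }

    public static void main(String[] args) {
        System.out.println("weirdStuff " + compute()); // 80 for a sequential fold
    }
}
```

This is why a Spark `reduce` function should be associative and commutative (e.g. a plain sum); with `(a + b) / 2` the answer can vary with partitioning even when the job succeeds.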
I'm working with Spark on EMR. The only difference I found between the two projects is that one has the Spark part written in main and the other does not. I launched both of them as a Spark app on EMR with the --class JavaSparkPi argument.
Here is the failing status:
Status: FAILED
Reason: Log file: s3://mynewbucket/logs/j-3akszxk7fkmx6/steps/s-2mt0sb910u3te/stderr.gz
Details: Exception in thread "main" org.apache.spark.SparkException: Application application_1501228129826_0003 finished with failed status
JAR location: command-runner.jar
Main class: None
Arguments: spark-submit --deploy-mode cluster --class JavaSparkPi s3://mynewbucket/code/sparkaws.jar
Action on failure: Continue
and here is the successful one:
JAR location: command-runner.jar
Main class: None
Arguments: spark-submit --deploy-mode cluster --class JavaSparkPi s3://mynewbucket/code/sparkaws.jar
Action on failure: Continue
Put the Spark initialization in main:

    SparkConf sparkConf = new SparkConf().setMaster("yarn-cluster").setAppName("mySparkApp");
    JavaSparkContext jsc = new JavaSparkContext(sparkConf);
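One way to apply that fix while keeping the helper method is to create the SparkConf and JavaSparkContext in main and hand the context to the method. Passing jsc as a parameter is my own arrangement, not something the answer spells out; this sketch also assumes the spark-core dependency is on the classpath and is meant to be submitted via spark-submit, not run standalone.

```java
import java.util.ArrayList;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public final class JavaSparkPi {

    // The job logic stays in the helper, but the context comes from main.
    private void startWorkingOnMicroSpark(JavaSparkContext jsc) {
        ArrayList<Integer> list = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            list.add(i);
        }
        JavaRDD<Integer> dataSet = jsc.parallelize(list)
                .map(s -> 2 * s)
                .map(s -> 5 * s);
        int weirdStuff = dataSet.reduce((a, b) -> (a + b) / 2);
        System.out.println("weirdStuff " + weirdStuff);
    }

    public static void main(String[] args) throws Exception {
        // Spark initialization lives in main, as in the working version.
        SparkConf sparkConf = new SparkConf().setMaster("yarn-cluster").setAppName("mySparkApp");
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);
        new JavaSparkPi().startWorkingOnMicroSpark(jsc);
        jsc.stop();
    }
}
```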