Unable to install Apache Spark on DC/OS
I installed DC/OS version 1.9.2 on OpenStack and tried to install Apache Spark on it:
```
$ dcos package install spark
Installing Marathon app for package [spark] version [1.1.0-2.1.1]
Installing CLI subcommand for package [spark] version [1.1.0-2.1.1]
New command available: dcos spark
DC/OS Spark is being installed!
```
But the DC/OS dashboard shows Spark stuck in "Deploying" and its task never runs. The task log shows the following errors:
```
I0728 16:43:36.348244 14038 exec.cpp:162] Version: 1.2.2
I0728 16:43:36.656839 14046 exec.cpp:237] Executor registered on agent abf187f4-ad7d-4ead-9437-5cdba4f77bdc-S1
+ export DISPATCHER_PORT=24238
+ DISPATCHER_PORT=24238
+ export DISPATCHER_UI_PORT=24239
+ DISPATCHER_UI_PORT=24239
+ export SPARK_PROXY_PORT=24240
+ SPARK_PROXY_PORT=24240
+ SCHEME=http
+ OTHER_SCHEME=https
+ [[ '' == true ]]
+ export DISPATCHER_UI_WEB_PROXY_BASE=/service/spark
+ DISPATCHER_UI_WEB_PROXY_BASE=/service/spark
+ grep -v '#https#' /etc/nginx/conf.d/spark.conf.template
+ sed s,#http#,,
+ sed -i 's,<PORT>,24240,' /etc/nginx/conf.d/spark.conf
+ sed -i 's,<DISPATCHER_URL>,http://172.16.129.180:24238,' /etc/nginx/conf.d/spark.conf
+ sed -i 's,<DISPATCHER_UI_URL>,http://172.16.129.180:24239,' /etc/nginx/conf.d/spark.conf
+ sed -i 's,<PROTOCOL>,,' /etc/nginx/conf.d/spark.conf
+ [[ '' == true ]]
+ [[ -f hdfs-site.xml ]]
+ [[ -n '' ]]
+ exec runsvdir -P /etc/service
+ mkdir -p /mnt/mesos/sandbox/nginx
+ mkdir -p /mnt/mesos/sandbox/spark
+ exec svlogd /mnt/mesos/sandbox/nginx
+ exec svlogd /mnt/mesos/sandbox/spark
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
nginx: [emerg] could not build the types_hash, you should increase either types_hash_max_size: 1024 or types_hash_bucket_size: 32
```
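For reference, the repeated `[emerg]` line is nginx itself failing to start: it cannot fit its MIME-type map into its hash tables. In a standalone nginx (outside the DC/OS Spark image, where the config is baked in), this message is normally silenced by enlarging those tables in the `http` block, e.g.:

```nginx
http {
    # Enlarge the hash tables nginx builds for the MIME type map;
    # the values reported in the error (1024 / 32) are too small here.
    types_hash_max_size    2048;
    types_hash_bucket_size 64;
}
```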
How can I get the Spark task running on DC/OS? Thanks.
Check whether you uninstalled a former Spark installation properly. You may have to remove the old ZooKeeper entry for Spark (under /exhibitor).
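A sketch of a clean re-install, using the cleanup tool DC/OS 1.9 documents for this ("janitor"). The role, principal, and ZooKeeper node names below are the Spark package defaults; adjust them if you overrode them at install time:

```shell
# Uninstall the package first; this alone leaves reservations and
# ZooKeeper state behind.
dcos package uninstall spark

# SSH to the leading master and run the framework cleaner to remove
# the leftover reservations and the Spark dispatcher's ZooKeeper entry.
dcos node ssh --master-proxy --leader
docker run mesosphere/janitor /janitor.py \
    -r spark-role -p spark-principal -z spark_mesos_dispatcher
```

These commands only make sense against a live cluster, so treat them as a template rather than something to run verbatim.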
Also check that there are no zombie frameworks in Mesos blocking resources for the new deployment. Kill them with:
```
curl -X POST http://<mesos_master_url>:5050/master/teardown -d 'frameworkId=<frameworkId>'
```
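To find such zombies, the master's `/master/frameworks` endpoint lists every registered framework with an `active` flag. A minimal sketch that filters a made-up sample of that response (the framework ids and names below are invented for illustration) and prints the matching teardown commands:

```python
import json

# Hypothetical excerpt of the JSON returned by
# GET http://<mesos_master_url>:5050/master/frameworks
# (ids and names here are made up for illustration).
sample = json.loads("""
{
  "frameworks": [
    {"id": "f81a-0000", "name": "marathon", "active": true},
    {"id": "f81a-0001", "name": "spark",    "active": false}
  ]
}
""")

# A framework that stays registered but inactive may be a zombie still
# holding resource reservations; remove it via /master/teardown.
zombies = [fw["id"] for fw in sample["frameworks"] if not fw["active"]]

for fid in zombies:
    print("curl -X POST http://<mesos_master_url>:5050/master/teardown"
          " -d 'frameworkId=%s'" % fid)
```

In practice you would fetch the JSON from the master instead of the inlined sample, and review the list before tearing anything down.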