Spark + Python - Ship pytz module dir to Spark executors


I have added a new dependency on one of the Python modules (pytz) in my package, and I am trying to ship that dependency (pytz) to the Spark executors via sc.addPyFile.

e.g. sc.addFile('//lib/python2.7/site-packages/pyodin-0.0.0-py2.7.egg-info')

But it doesn't seem to work. Can someone help me understand how I can ship complete Python modules to the Spark executors using spark-submit?
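For context, this is roughly the pattern I am attempting (a rough sketch; pytz.zip is a hypothetical archive built beforehand by zipping the pytz package directory, and the paths are from my environment):

from pyspark import SparkContext

sc = SparkContext(appName="ship-pytz")

# addPyFile distributes the archive to every executor and puts it on the
# executors' sys.path, so "import pytz" resolves inside tasks.
sc.addPyFile("/tmp/pytz.zip")

def utc_name(_):
    import pytz  # imported on the executor, from the shipped zip
    return pytz.utc.zone

print(sc.parallelize([1]).map(utc_name).collect())  # expect ['UTC']

The same archive could instead be passed at submit time with spark-submit --py-files /tmp/pytz.zip my_job.py, which should be equivalent for this purpose.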

I was referring to the question "shipping python modules in pyspark to other nodes?", whose answers recommend this same approach, but it doesn't seem to work with .egg-info directories.

While merging in the new dependencies, I get the following two directories:

1) /lib/python3.6/site-packages/pytz% ls
exceptions.py  __init__.py  lazy.py  reference.py  tzfile.py  tzinfo.py  zoneinfo

2) /lib/python3.6/site-packages/pytz-2017.2-py3.6.egg-info% ls
dependency_links.txt  PKG-INFO  SOURCES.txt  top_level.txt  zip-safe

Could someone please confirm which of the two paths should be given as the sc.addPyFile() parameter? Any pointers appreciated.
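In case it is useful, here is the sketch I am experimenting with to build an archive from the first path (the pytz package directory itself, not the .egg-info metadata directory); the target filename /tmp/pytz.zip is hypothetical:

import shutil

# Build /tmp/pytz.zip so that the archive's top-level entry is "pytz/";
# root_dir/base_dir keep the site-packages prefix out of the zip, which is
# what "import pytz" needs once the zip is on sys.path.
shutil.make_archive("/tmp/pytz", "zip",
                    root_dir="/lib/python3.6/site-packages",
                    base_dir="pytz")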

