Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share

0 votes
359 views
in Technique by (71.8m points)

python - Pyspark import .py file not working

My goal is to import a custom .py file into my spark application and call some of the functions included inside that file

Here is what I tried:

I have a test file called Test.py which looks as follows:

def func():
    print("Import is working")

Inside my Spark application I do the following (as described in the docs):

sc = SparkContext(conf=conf, pyFiles=['/[AbsolutePathTo]/Test.py'])

I also tried this instead (after the Spark context is created):

sc.addFile("/[AbsolutePathTo]/Test.py")

I even tried the following when submitting my spark application:

./bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.10:1.5.0-M2 --py-files /[AbsolutePath]/Test.py ../Main/Code/app.py

However, I always get a name error:

NameError: name 'func' is not defined

when I call func() inside my app.py (and the same error with 'Test' if I try to call Test.func()).

Finally, I also tried adding the file inside the pyspark shell with the same command as above:

sc.addFile("/[AbsolutePathTo]/Test.py")

Strangely, I get no error there, but I still cannot call func() without hitting the NameError. Also, not sure if it matters, but I'm running Spark locally on one machine.

I really tried everything I could think of, but still cannot get it to work. I am probably missing something very simple. Any help would be appreciated.


1 Reply

0 votes
by (71.8m points)

Alright, actually my question is rather stupid. After doing:

sc.addFile("/[AbsolutePathTo]/Test.py")

I still have to import Test.py as I would a regular Python module:

import Test

then I can call

Test.func()

and it works. I assumed the `import Test` was unnecessary since I add the file to the Spark context, but adding a file only makes it available on the nodes; it does not bind the module name. Thanks mark91 for pointing me in the right direction.

UPDATE 28.10.2017:

As asked in the comments, here are more details on app.py:

from pyspark import SparkContext
from pyspark.conf import SparkConf

conf = SparkConf()
conf.setMaster("local[4]")
conf.setAppName("Spark Stream")
sc = SparkContext(conf=conf)
sc.addFile("Test.py")

import Test

Test.func()
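The same two-step pattern holds for plain Python imports: making a module's file findable (which is what sc.addFile does in the Spark case, here mimicked with sys.path) binds no name until an explicit import runs. A minimal Spark-free sketch, using a throwaway Test.py written to a temp directory:

```python
import os
import sys
import tempfile

# Create a throwaway Test.py, analogous to the file shipped with sc.addFile().
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "Test.py"), "w") as f:
    f.write('def func():\n    return "Import is working"\n')

# Step 1: make the file findable (sc.addFile's role in the Spark case).
sys.path.insert(0, tmpdir)

# The name 'Test' is still unbound -- this is the NameError from the question.
try:
    Test.func()  # noqa: F821
except NameError as e:
    print(e)  # name 'Test' is not defined

# Step 2: an explicit import is still required to bind the name.
import Test
print(Test.func())  # Import is working
```

Note that PySpark also provides sc.addPyFile(path), which is intended specifically for .py dependencies, though the explicit import is required either way.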
