
python - Spark can access Hive table from pyspark but not from spark-submit

So, when running from the pyspark shell, I would type (without specifying any contexts):

df_openings_latest = sqlContext.sql('select * from experian_int_openings_latest_orc')

... and it works fine.

However, when I run my script from spark-submit, like

spark-submit script.py

I put the following in:

from pyspark.sql import SQLContext
from pyspark import SparkConf, SparkContext
conf = SparkConf().setAppName('inc_dd_openings')
sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

df_openings_latest = sqlContext.sql('select * from experian_int_openings_latest_orc')

But it gives me an error:

pyspark.sql.utils.AnalysisException: u'Table not found: experian_int_openings_latest_orc;'

So it doesn't see my table.

What am I doing wrong? Please help.

P.S. The Spark version is 1.6, running on Amazon EMR.


1 Reply


Spark 2.x

The same problem may occur in Spark 2.x if the SparkSession has been created without enabling Hive support.
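For example, a Hive-enabled session would be created like this (a minimal sketch reusing the app name and table from the question):

from pyspark.sql import SparkSession

# enableHiveSupport() connects the session to the Hive metastore,
# so tables registered there become visible to spark.sql(...)
spark = SparkSession.builder \
    .appName('inc_dd_openings') \
    .enableHiveSupport() \
    .getOrCreate()

df_openings_latest = spark.sql('select * from experian_int_openings_latest_orc')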

Spark 1.x

It is pretty simple. When you use the PySpark shell, and Spark has been built with Hive support, the default SQLContext implementation (the one available as sqlContext) is a HiveContext.

In your standalone application you use a plain SQLContext, which doesn't provide Hive capabilities.

Assuming the rest of the configuration is correct, just replace:

from pyspark.sql import SQLContext

sqlContext = SQLContext(sc)

with

from pyspark.sql import HiveContext

sqlContext = HiveContext(sc)
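
Putting it together, your script.py would become something like this (a sketch assuming the rest of your configuration stays the same):

from pyspark.sql import HiveContext
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName('inc_dd_openings')
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)  # HiveContext instead of plain SQLContext

# The Hive-backed context can now resolve the metastore table
df_openings_latest = sqlContext.sql('select * from experian_int_openings_latest_orc')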
