sc is not defined in pyspark

Date: 2017-06-07 15:15:30

Tags: apache-spark pyspark

Calling SparkContext in pyspark returns an error, both from cmd and in Jupyter:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined

I tried:

>>> from pyspark import SparkContext
>>> sc = SparkContext()

but it still shows an error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "c:\spark\python\pyspark\context.py", line 115, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "c:\spark\python\pyspark\context.py", line 275, in _ensure_initialized
    callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app
=PySparkShell, master=local[*]) created by getOrCreate at c:\spark\bin\..\python
\pyspark\shell.py:43

How can I fix this?

1 Answer:

Answer 0 (score: 1)

You probably have another notebook (or the pyspark shell itself) already running a SparkContext; instead of constructing a new one, use SparkContext.getOrCreate(), as sketched below.
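A minimal sketch of that suggestion, assuming you are in the pyspark shell or a Jupyter kernel where a context may already have been created by shell.py; the parallelize/sum call is only an illustrative smoke test, not part of the original answer:

>>> from pyspark import SparkContext
>>> sc = SparkContext.getOrCreate()   # reuses the existing context instead of raising ValueError
>>> sc.parallelize(range(5)).sum()    # quick check that the context works
10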

Regards.
