<p>The following SO question, <a href="https://stackoverflow.com/questions/25934778/how-to-run-script-in-pyspark-and-drop-into-ipython-shell-when-done">How to run script in Pyspark and drop into IPython shell when done?</a>, tells you how to launch a pyspark script:</p>
<pre><code> %run -d myscript.py
</code></pre>
<p>But how do we access the existing Spark context?</p>
<p>Simply creating a new one does not work:</p>
<pre><code> ----> sc = SparkContext("local", 1)
ValueError: Cannot run multiple SparkContexts at once; existing
SparkContext(app=PySparkShell, master=local) created by <module> at
/Library/Python/2.7/site-packages/IPython/utils/py3compat.py:204
</code></pre>
<p>But trying to use the existing one: what is the existing one?</p>
<pre><code>In [50]: for s in filter(lambda x: 'SparkContext' in repr(x[1]) and len(repr(x[1])) < 150, locals().iteritems()):
   ....:     print s
('SparkContext', <class 'pyspark.context.SparkContext'>)
</code></pre>
<p>That is, no variable holds a SparkContext instance; only the class itself appears in <code>locals()</code>.</p>
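<p>For clarity, the filter above matches anything whose <code>repr</code> mentions <code>SparkContext</code>, and that includes the class object itself, not just live instances. A minimal plain-Python sketch of that distinction (no Spark required; <code>SparkContextDemo</code> and <code>ctx</code> are hypothetical names standing in for the real class and an instance):</p>

```python
class SparkContextDemo(object):
    """Stand-in for pyspark.context.SparkContext (hypothetical)."""
    pass

ctx = SparkContextDemo()

# A toy namespace playing the role of locals() in the shell session.
namespace = {'SparkContextDemo': SparkContextDemo, 'ctx': ctx}

# Both the class's repr ("<class ...SparkContextDemo>") and the
# instance's repr ("<...SparkContextDemo object at 0x...>") contain
# the class name, so a repr-based filter cannot tell them apart.
matches = sorted(name for name, obj in namespace.items()
                 if 'SparkContextDemo' in repr(obj))
print(matches)
```

<p>Here both names match; in the shell session above only the class matched, which is why the conclusion is that no instance was bound in <code>locals()</code>.</p>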