pyspark is not working because the session is timing out

Time: 2018-09-17 08:50:55

Tags: pyspark ipython jupyter-notebook jupyterhub

We have Anaconda installed for Python 3 and it is accessible through JupyterHub. We created a new Anaconda environment for Python 2; its kernel shows up in the Jupyter UI, but the Python 2 kernel dies. We did manage to start pyspark on its own: the Spark context initializes, but a simple "show databases" takes a long time to return a result even when the cluster is idle, and once that query returns, the next query does not return for hours. We see the following error in the YARN logs. Any suggestions on how to resolve this would be appreciated.

Container id: container_e48_1536611621510_0248_01_000003
Exit code: 1
Stack trace: ExitCodeException exitCode=1: 
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:604)
    at org.apache.hadoop.util.Shell.run(Shell.java:507)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:789)
    at org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor.launchContainer(LinuxContainerExecutor.java:399)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)

Shell output: main : command provided 1 
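
For context, the queries were run from the pyspark shell roughly like this (a minimal sketch, assuming Spark 2.x where the shell exposes a SparkSession named spark; the database and table names are placeholders):

    # inside the pyspark shell, once the SparkContext/SparkSession has initialized
    spark.sql("show databases").show()   # returns, but only after a long wait on an idle cluster

    # a follow-up query like this then does not return for hours
    spark.sql("select * from some_db.some_table limit 10").show()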

We installed the kernel following the instructions at https://ipython.readthedocs.io/en/stable/install/kernel_install.html
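
For reference, the Python 2 environment and kernel were registered roughly as follows (a sketch based on the linked instructions; the environment name py27 is an assumption):

    conda create -n py27 python=2.7 anaconda
    source activate py27
    python -m ipykernel install --user --name py27 --display-name "Python 2"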

0 Answers:

No answers yet.