Link command hangs when linking a container to itself

Date: 2016-04-10 02:07:09

Tags: docker

I'm trying to use a Docker image I built for Apache Spark, bernieai/docker-spark. I found that when I try to run a script included in the container, Java throws an exception because the container name spark_master cannot be resolved.

The root of the problem is that when I try to run Spark inside my Docker container via the script ./start-master.sh, it throws the following error:

Caused by: java.net.UnknownHostException: spark_master

So I googled the problem and followed the advice here: https://groups.google.com/forum/#!topic/docker-user/d-yuxRlO0yE

The problem appears when I run the command:

docker run -d -t -P --name spark_master --link spark_master:spark_master bernieai/docker-spark

Docker suddenly hangs and the daemon stops responding. There is no error; it just hangs.

Any idea what's wrong? Is there a better way to fix the root cause?

EDIT: Added the Dockerfile

############################################################
# Dockerfile for an Apache Spark Development Environment
# Based on Ubuntu Image
############################################################

FROM ubuntu:latest
MAINTAINER Justin Long <crockpotveggies.com>

ENV SPARK_VERSION      1.6.1
ENV SCALA_VERSION      2.11.7
ENV SPARK_BIN_VERSION  $SPARK_VERSION-bin-hadoop2.6
ENV SPARK_HOME         /usr/local/spark
ENV SCALA_HOME         /usr/local/scala
ENV PATH               $PATH:$SPARK_HOME/bin:$SCALA_HOME/bin

# Update the APT cache
RUN sed -i.bak 's/main$/main universe/' /etc/apt/sources.list
RUN apt-get update
RUN apt-get upgrade -y

# Install and setup project dependencies
RUN apt-get install -y curl wget git
RUN locale-gen en_US en_US.UTF-8

#prepare for Java download
RUN apt-get install -y python-software-properties
RUN apt-get install -y software-properties-common

#grab oracle java (auto accept licence)
RUN add-apt-repository -y ppa:webupd8team/java
RUN apt-get update
RUN echo oracle-java8-installer shared/accepted-oracle-license-v1-1 select true | /usr/bin/debconf-set-selections
RUN apt-get install -y oracle-java8-installer

# Install Scala
RUN wget http://downloads.typesafe.com/scala/$SCALA_VERSION/scala-$SCALA_VERSION.tgz && \
    tar -zxf /scala-$SCALA_VERSION.tgz -C /usr/local/ && \
    ln -s /usr/local/scala-$SCALA_VERSION $SCALA_HOME && \
    rm /scala-$SCALA_VERSION.tgz

# Installing Spark for Hadoop
RUN wget http://d3kbcqa49mib13.cloudfront.net/spark-$SPARK_BIN_VERSION.tgz && \
    tar -zxf /spark-$SPARK_BIN_VERSION.tgz -C /usr/local/ && \
    ln -s /usr/local/spark-$SPARK_BIN_VERSION $SPARK_HOME && \
    rm /spark-$SPARK_BIN_VERSION.tgz

ADD scripts/start-master.sh /start-master.sh
ADD scripts/start-worker /start-worker.sh
ADD scripts/spark-shell.sh  /spark-shell.sh
ADD scripts/spark-defaults.conf /spark-defaults.conf
ADD scripts/remove_alias.sh /remove_alias.sh

ENV SPARK_MASTER_OPTS="-Dspark.driver.port=7001 -Dspark.fileserver.port=7002 -Dspark.broadcast.port=7003 -Dspark.replClassServer.port=7004 -Dspark.blockManager.port=7005 -Dspark.executor.port=7006 -Dspark.ui.port=4040 -Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory"
ENV SPARK_WORKER_OPTS="-Dspark.driver.port=7001 -Dspark.fileserver.port=7002 -Dspark.broadcast.port=7003 -Dspark.replClassServer.port=7004 -Dspark.blockManager.port=7005 -Dspark.executor.port=7006 -Dspark.ui.port=4040 -Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory"

ENV SPARK_MASTER_PORT 7077
ENV SPARK_MASTER_WEBUI_PORT 8080
ENV SPARK_WORKER_PORT 8888
ENV SPARK_WORKER_WEBUI_PORT 8081

EXPOSE 8080 7077 8888 8081 4040 7001 7002 7003 7004 7005 7006
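For reference, the image above can be built locally with a standard `docker build`, assuming the Dockerfile and its `scripts/` directory sit in the current working directory:

```shell
# Build the image and tag it with the name used in the question,
# so the docker run commands below can reference it directly.
docker build -t bernieai/docker-spark .
```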

1 Answer

Answer (score: 2)

Run with the -h flag. It sets the container's hostname to spark_master, so the name resolves without any link:

docker run -it --rm --name spark_master -h spark_master bernieai/docker-spark ./start-master.sh

Here is the output:

starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/logs/spark--org.apache.spark.deploy.master.Master-1-spark_master.out

root@spark_master:/# tail usr/local/spark/logs/spark--org.apache.spark.deploy.master.Master-1-spark_master.out
16/04/10 03:12:04 INFO SecurityManager: Changing modify acls to: root
16/04/10 03:12:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/04/10 03:12:05 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
16/04/10 03:12:05 INFO Master: Starting Spark master at spark://spark_master:7077
16/04/10 03:12:05 INFO Master: Running Spark version 1.6.1
16/04/10 03:12:06 INFO Utils: Successfully started service 'MasterUI' on port 8080.
16/04/10 03:12:06 INFO MasterWebUI: Started MasterWebUI at http://172.17.0.2:8080
16/04/10 03:12:06 INFO Utils: Successfully started service on port 6066.
16/04/10 03:12:06 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
16/04/10 03:12:06 INFO Master: I have been elected leader! New state: ALIVE
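As a side note, `--link` is a legacy Docker feature; a user-defined bridge network gives the same name resolution without any linking, since containers on the same network resolve each other by container name. A sketch of that approach (the network name `spark-net` and the `./start-worker.sh` entry point are assumptions based on the Dockerfile above):

```shell
# Create a user-defined bridge network; containers attached to it
# can resolve one another by container name, so --link is unnecessary.
docker network create spark-net

# Start the master; -h still sets the hostname inside the container.
docker run -d --name spark_master -h spark_master \
    --network spark-net bernieai/docker-spark ./start-master.sh

# A worker on the same network can reach the master by name,
# i.e. at spark://spark_master:7077.
docker run -d --name spark_worker \
    --network spark-net bernieai/docker-spark ./start-worker.sh
```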