Accessing environment variables from Spark workers

Date: 2016-06-19 11:36:02

Tags: apache-spark amazon-ec2 amazon-dynamodb aws-sdk

I have an application that needs to access a DynamoDB table. Each worker establishes its own connection to the database.

I have added AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to the spark-env.sh file on both the master and the workers. I also ran that file with sh to make sure the variables were exported.
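
The entries look like this (the key values are placeholders):

  # spark-env.sh on the master and on every worker
  export AWS_ACCESS_KEY_ID=<your-access-key>
  export AWS_SECRET_ACCESS_KEY=<your-secret-key>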

When the code runs, I always get this error:

Caused by: com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain
    at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:131)
    at com.amazonaws.http.AmazonHttpClient.getCredentialsFromContext(AmazonHttpClient.java:774)
    at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:800)
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:695)
    at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:447)
    at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:409)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:358)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.doInvoke(AmazonDynamoDBClient.java:2051)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.invoke(AmazonDynamoDBClient.java:2021)
    at com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient.describeTable(AmazonDynamoDBClient.java:1299)
    at com.amazon.titan.diskstorage.dynamodb.DynamoDBDelegate.describeTable(DynamoDBDelegate.java:635)
    ... 27 more

It seems the AWS SDK is unable to load the credentials even though they have been exported. What kind of solution should I try?

1 Answer:

Answer 0 (score: 2):

You can use the setExecutorEnv method on SparkConf. E.g.
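
A minimal sketch of that approach, assuming the two variables are exported in the driver's environment (the app name is a placeholder):

  import org.apache.spark.{SparkConf, SparkContext}

  // Forward the driver's credentials into every executor's environment,
  // where the AWS SDK's default provider chain can find them.
  val conf = new SparkConf()
    .setAppName("dynamodb-app") // placeholder
    .setExecutorEnv("AWS_ACCESS_KEY_ID", sys.env("AWS_ACCESS_KEY_ID"))
    .setExecutorEnv("AWS_SECRET_ACCESS_KEY", sys.env("AWS_SECRET_ACCESS_KEY"))
  val sc = new SparkContext(conf)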

Also, from the SparkConf source:

  /**
   * Set an environment variable to be used when launching executors for this application.
   * These variables are stored as properties of the form spark.executorEnv.VAR_NAME
   * (for example spark.executorEnv.PATH) but this method makes them easier to set.
   */
  def setExecutorEnv(variable: String, value: String): SparkConf = {
    set("spark.executorEnv." + variable, value)
  }

There is also an overload for setting several variables at once:

  /**
   * Set multiple environment variables to be used when launching executors.
   * These variables are stored as properties of the form spark.executorEnv.VAR_NAME
   * (for example spark.executorEnv.PATH) but this method makes them easier to set.
   */
  def setExecutorEnv(variables: Seq[(String, String)]): SparkConf = {
    for ((k, v) <- variables) {
      setExecutorEnv(k, v)
    }
    this
  }

You could also consider other options, such as setting Java system properties: SparkConf will pick them up automatically.
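
A sketch of both alternatives: the multi-variable overload above, and system properties, which new SparkConf() reads automatically for any property whose name starts with "spark." (credential values again assumed to come from the driver's environment):

  import org.apache.spark.SparkConf

  // Option 1: set both variables with the Seq overload.
  val conf = new SparkConf()
    .setExecutorEnv(Seq(
      "AWS_ACCESS_KEY_ID"     -> sys.env("AWS_ACCESS_KEY_ID"),
      "AWS_SECRET_ACCESS_KEY" -> sys.env("AWS_SECRET_ACCESS_KEY")))

  // Option 2: set spark.executorEnv.* as JVM system properties before
  // constructing SparkConf; its default constructor loads spark.* properties.
  System.setProperty("spark.executorEnv.AWS_ACCESS_KEY_ID", sys.env("AWS_ACCESS_KEY_ID"))
  System.setProperty("spark.executorEnv.AWS_SECRET_ACCESS_KEY", sys.env("AWS_SECRET_ACCESS_KEY"))
  val confFromProps = new SparkConf()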
