How to set environment variables in AWS EMR from Secrets Manager to be used by PySpark scripts


I am using emr-6.12.0 and trying to set environment variables, which are stored in AWS Secrets Manager, from my bootstrap.sh file.

SECRET_NAME="/myapp/dev/secrets"
SECRETS_JSON=$(aws secretsmanager get-secret-value --secret-id "$SECRET_NAME" --query SecretString --output text)

# Parse the secrets and set them as environment variables
for key in $(echo "$SECRETS_JSON" | jq -r "keys[]"); do
  # --arg passes the key safely instead of interpolating it into the filter
  value=$(echo "$SECRETS_JSON" | jq -r --arg k "$key" '.[$k] // empty')
  echo "$value"
  if [ -n "$value" ]; then
    export "$key"="$value"
  fi
done

I can see these values in the bootstrap log.

But when I try to access them from my PySpark script, the environment variables are not there.

os.environ.get("POSTGRES_URL")  # returns None

for key, value in os.environ.items():
    self.logger.info(f"{key}: {value}")  # my variables are not listed
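One workaround (a sketch under assumptions, not something from this thread): fetch and parse the secret directly inside the PySpark driver with boto3, instead of relying on `export` in a bootstrap action, which only affects that bootstrap shell's own process. The helper and the secret id below are hypothetical.

```python
import json
import os

def export_secrets(secret_string, environ=os.environ):
    """Parse a Secrets Manager SecretString (a JSON object) and export
    each non-empty value as an environment variable."""
    for key, value in json.loads(secret_string).items():
        if value:  # skip empty values, mirroring the bootstrap script
            environ[key] = str(value)

# Inside the PySpark driver the string would come from boto3, e.g.
# (hypothetical secret id; requires secretsmanager:GetSecretValue on
# the cluster's instance profile role):
#   import boto3
#   sm = boto3.client("secretsmanager")
#   resp = sm.get_secret_value(SecretId="/myapp/dev/secrets")
#   export_secrets(resp["SecretString"])
```

Note that this sets the variables only in the driver process; executors would still need them delivered some other way (for example via Spark configuration).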

As I am new to EMR and Spark, please help me understand how to set environment variables from Secrets Manager so that my PySpark scripts can read them.
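For context, a common alternative (an assumption on my part, not something the poster tried): environment variables for Spark on EMR are usually injected through cluster configuration classifications rather than bootstrap-action exports, using the standard Spark-on-YARN properties `spark.yarn.appMasterEnv.[Name]` and `spark.executorEnv.[Name]`. A hypothetical `spark-defaults` fragment with a placeholder value:

```json
[
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.yarn.appMasterEnv.POSTGRES_URL": "jdbc:postgresql://example:5432/app",
      "spark.executorEnv.POSTGRES_URL": "jdbc:postgresql://example:5432/app"
    }
  }
]
```

This still leaves the question of getting the secret value into the configuration at cluster-creation time, which is why fetching the secret at runtime from within the job is often simpler.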

vivek
Asked 5 months ago · 293 views

1 Answer
AWS
Answered 5 months ago
