
Pass env vars in CodeBuild projects


Hi,

I have an AWS CodePipeline with 6 standalone CodeBuild projects running in parallel, each with its own environment variables. That stage is followed by a manual approval step, and after the approval there is one final CodeBuild project. How can I pass the environment variables from those 6 projects to this one CodeBuild project?

Thanks

2 Answers

Hi myronix88,

Please go through the steps below; they should help resolve your issue.

Store Environment Variables in S3:

Each of the 6 parallel CodeBuild projects should store their environment variables in an S3 bucket as a JSON file or as individual files.

Add a Lambda Function:

Use a Lambda function to gather these environment variable files after the manual approval step, combine them into a single JSON file, and upload it back to S3.

Retrieve and Use Environment Variables in the Final CodeBuild Project:

The final CodeBuild project will download this combined JSON file and set the environment variables accordingly.

Step 1: Modify the Parallel CodeBuild Projects

In each of the 6 CodeBuild projects, add a command at the end of the build process to store the environment variables in S3.

For example, you can add the following command in the buildspec file:

phases:
  build:
    commands:
      - echo '{ "ENV_VAR1": "'"$ENV_VAR1"'", "ENV_VAR2": "'"$ENV_VAR2"'" }' > /tmp/env_vars.json
      - aws s3 cp /tmp/env_vars.json s3://your-bucket/env_vars_project_1.json

Repeat this for each CodeBuild project, changing the filename for each project to avoid overwriting.
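One caveat: the echo-based JSON construction above breaks if a variable value contains a double quote or backslash. A safer sketch uses jq to build the JSON, which quotes values correctly (jq is available on CodeBuild standard images, but verify for your image; ENV_VAR1/ENV_VAR2 are placeholder names):

```shell
# Placeholder values standing in for the project's real env vars.
ENV_VAR1='value with "quotes"'
ENV_VAR2='plain'

# jq -n builds the JSON object itself, escaping each value safely.
jq -n --arg ENV_VAR1 "$ENV_VAR1" --arg ENV_VAR2 "$ENV_VAR2" \
  '{ENV_VAR1: $ENV_VAR1, ENV_VAR2: $ENV_VAR2}' > /tmp/env_vars.json
```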

Step 2: Create a Lambda Function

Create a Lambda function to consolidate the environment variables from the 6 files into one.

import json

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    bucket_name = 'your-bucket'
    env_var_files = [
        'env_vars_project_1.json',
        'env_vars_project_2.json',
        'env_vars_project_3.json',
        'env_vars_project_4.json',
        'env_vars_project_5.json',
        'env_vars_project_6.json'
    ]

    # Merge the per-project files into one dict. If two projects export
    # the same variable name, the later file in the list wins.
    combined_env_vars = {}
    for file in env_var_files:
        obj = s3.get_object(Bucket=bucket_name, Key=file)
        env_vars = json.loads(obj['Body'].read().decode('utf-8'))
        combined_env_vars.update(env_vars)

    # Write the merged result to /tmp (the only writable path in Lambda)
    # and upload it for the final CodeBuild project to consume.
    combined_file = '/tmp/combined_env_vars.json'
    with open(combined_file, 'w') as f:
        json.dump(combined_env_vars, f)

    s3.upload_file(combined_file, bucket_name, 'combined_env_vars.json')

    return {
        'statusCode': 200,
        'body': json.dumps('Combined environment variables stored successfully.')
    }

Step 3: Add the Lambda Function to the Pipeline

Add a new action in your CodePipeline to invoke this Lambda function after the manual approval step.
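One caveat worth noting, not shown in the handler above: a Lambda function invoked as a CodePipeline action must report its outcome back with put_job_success_result or put_job_failure_result, otherwise the pipeline action hangs until it times out. A minimal sketch, with the boto3 codepipeline client passed in as a parameter so the logic can be exercised without AWS credentials:

```python
def job_id_from_event(event):
    # CodePipeline passes the job id under the 'CodePipeline.job' key
    # of the Lambda invocation event.
    return event['CodePipeline.job']['id']

def report_result(codepipeline, job_id, error=None):
    """Report a CodePipeline Lambda action's outcome.

    codepipeline: a boto3 'codepipeline' client (injected so the
    function can be tested with a stub outside AWS).
    """
    if error is None:
        codepipeline.put_job_success_result(jobId=job_id)
    else:
        codepipeline.put_job_failure_result(
            jobId=job_id,
            failureDetails={'type': 'JobFailed', 'message': str(error)},
        )
```

In the handler, wrap the consolidation logic in a try/except and call report_result in both branches.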

Step 4: Modify the Final CodeBuild Project

In the final CodeBuild project, add a command to download the combined JSON file from S3 and export the environment variables.

Modify the buildspec file as follows:

phases:
  pre_build:
    commands:
      - aws s3 cp s3://your-bucket/combined_env_vars.json /tmp/combined_env_vars.json
      - export $(jq -r 'to_entries | map("\(.key)=\(.value|tostring)") | .[]' /tmp/combined_env_vars.json)
  build:
    commands:
      - echo "ENV_VAR1 is $ENV_VAR1"
      - echo "ENV_VAR2 is $ENV_VAR2"
      # Add your build commands here
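Note that `export $(...)` word-splits on whitespace, so the export line above breaks if any value contains a space. A sketch that reads key=value pairs line by line instead (assuming a flat JSON object of string values; sample data stands in for the file downloaded from S3):

```shell
# Sample combined file standing in for the one downloaded from S3.
printf '%s' '{"ENV_VAR1": "has spaces", "ENV_VAR2": "plain"}' > /tmp/combined_env_vars.json

# One key=value pair per line; splitting on the first '=' keeps any
# spaces (or further '=' characters) inside the value intact.
jq -r 'to_entries[] | "\(.key)=\(.value)"' /tmp/combined_env_vars.json > /tmp/env_lines
while IFS='=' read -r key value; do
  export "$key=$value"
done < /tmp/env_lines
```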
EXPERT
answered 2 years ago
  • Thanks, but there is an out-of-the-box solution I found.

Accepted Answer

Fixed by using this syntax:

[Image missing: screenshot of the syntax used]
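The screenshot is gone, but the out-of-the-box feature this most likely refers to is CodePipeline action-level variables: a CodeBuild action can declare exported variables in its buildspec, and a downstream action can consume them directly, with no S3 or Lambda glue. A sketch of the upstream buildspec (assumed from context; the variable names are placeholders):

```yaml
# Each upstream CodeBuild action declares the variables it hands off.
version: 0.2
env:
  exported-variables:
    - ENV_VAR1
    - ENV_VAR2
phases:
  build:
    commands:
      - export ENV_VAR1=value1
      - export ENV_VAR2=value2
```

Each upstream action is then given a variable namespace in the pipeline definition (e.g. Project1Vars), and the final CodeBuild action receives the values as environment variables using the `#{Project1Vars.ENV_VAR1}` syntax.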

answered 2 years ago
EXPERT
reviewed a year ago
