
Component with bash script and python code - how to package/deploy


I have created an AWS Greengrass component that consists of:

  • A Python program
  • A bash script, called as a subprocess from within that Python program.

I have it packaged/deployed as follows:

  • The .py file and the .sh script are zipped into an archive
  • The archive is stored in S3 at S3:/path/to/

Snippet from the recipe file is as follows:

"Manifests": [
  {
    "Platform": {
      "os": "linux"
    },
    "Lifecycle": {
      "Run": "python3 -u {artifacts:decompressedPath}/my_archive/ "
    },
    "Artifacts": [
      {
        "URI": "s3:/path/to/ ",
        "Unarchive": "ZIP",
        "Permission": {
          "Read": "ALL",
          "Execute": "ALL"
        }
      }
    ]
  }
]

Within the code, the Python file executes the shell script, assuming that the script file is in the current directory:

process = subprocess.Popen(["./",

The logs report "file not found", and when I run "ls -ltrh" via a subprocess call within the component to debug, it reports 0 files.
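An equivalent way to see what the component process actually sees, as a pure-Python debug sketch (not part of the deployed code), is to print the working directory and its contents:

```python
import os

# Print the working directory the lifecycle command runs in and the files
# visible there; if this is the component's work path rather than the
# directory containing the unzipped artifacts, a relative "./script.sh"
# lookup will fail with "file not found".
print("cwd:", os.getcwd())
print("files here:", os.listdir(os.getcwd()))
```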

My question is -

  1. How do I get the path of the deployed .sh file, relative to the Python script, so that I can invoke it with the "subprocess" call?
asked a year ago · 61 views
1 Answer

Hi GreengrassUser,

Thanks for using Greengrass V2. The file cannot be found under artifacts:decompressedPath from the component process because, when the lifecycle command runs, its working directory is set to the component's work path, i.e. <greengrass_root>/work/<component_name>, as described in the documentation; you can verify this by checking $PWD from your component process. The unzipped files are therefore not in the working directory, so a relative path like "./" will not resolve. To access the file, pass the decompressed path to your Python script, either as an environment variable or as an argument in the Run lifecycle command, using the same {artifacts:decompressedPath}/my_archive/ prefix that you already use to launch the Python script.
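A minimal sketch of that approach, assuming hypothetical file names main.py and run.sh inside the unzipped my_archive folder (the real names are elided in the question):

```python
import os
import subprocess
import sys

# Hypothetical recipe change: pass the artifact directory as an argument, e.g.
#   "Run": "python3 -u {artifacts:decompressedPath}/my_archive/main.py {artifacts:decompressedPath}/my_archive"

def resolve_script(artifact_dir: str, script_name: str) -> str:
    """Build an absolute path to a deployed script, so the component's
    working directory no longer matters."""
    return os.path.join(os.path.abspath(artifact_dir), script_name)

if __name__ == "__main__":
    # Fall back to this file's own directory when no argument is given;
    # that directory also points at the unzipped archive.
    artifact_dir = sys.argv[1] if len(sys.argv) > 1 else os.path.dirname(os.path.abspath(__file__))
    script_path = resolve_script(artifact_dir, "run.sh")  # "run.sh" is a hypothetical name
    process = subprocess.Popen(["bash", script_path])
    process.wait()
```

Calling the script by its absolute path (rather than "./run.sh") is what decouples the subprocess call from the work directory that Greengrass sets for the lifecycle command.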


answered a year ago
