Maybe try uploading in JSON format and see if this change works:
    import json
    import os
    from datetime import date

    import boto3

    # Function to upload file to S3 bucket
    def store_payload_into_bucket(assignment_roles):
        try:
            bucket_name = os.environ['permission_data_bucket']
            today = date.today()
            file_name = f"azure/datapayload_{today}.json"  # platform/assignment_role_data_{datetime}
            table_name = os.environ['dynamodb_id']
            client = boto3.client('s3')
            # assignment_roles = str(assignment_roles)
            # Convert assignment_roles list to a JSON string
            json_data = json.dumps(assignment_roles)
            try:
                # Save the JSON data with content type set to 'application/json'
                client.put_object(Bucket=bucket_name, Key=file_name,
                                  Body=json_data, ContentType='application/json')
                return {
                    'statusCode': 200,
                    'body': {'key': file_name, 'bucket_id': bucket_name}
                }
            except Exception as err:
                return {
                    'statusCode': 500,
                    'body': f'Error: {str(err)}'
                }
        except Exception as err:
            raise err
I think that where you do assignment_roles = str(assignment_roles) you should be doing json.dumps instead of str. While you can convert most data types in Python to a string, in this case it won't encode them as valid JSON.
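To illustrate the point above: str() on a dict produces Python's repr (single quotes), which is not parseable JSON, while json.dumps produces valid JSON. A minimal sketch with a made-up payload:

```python
import json

data = {"role": "admin", "scopes": ["read", "write"]}

# str() gives Python repr with single quotes -- not valid JSON
print(str(data))         # {'role': 'admin', 'scopes': ['read', 'write']}

# json.dumps() gives valid JSON with double quotes
print(json.dumps(data))  # {"role": "admin", "scopes": ["read", "write"]}

# The JSON string round-trips; the str() version would raise
# json.JSONDecodeError if passed to json.loads().
assert json.loads(json.dumps(data)) == data
```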
@Brettski-AWS I am already returning json.dumps-formatted data from the main function to store_payload_into_bucket.
Then you don't need to do a str() on it - you might find that is putting additional quotes around the data. Try downloading the raw file from S3 using the CLI - I suspect you'll notice a slight difference there.

Actually, that worked. I am still wondering why it didn't before, as I was returning json.dumps(data), but anyway, thank you so much for your help.
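The "additional quotes" symptom above is classic double encoding: if the caller already returns a json.dumps string and store_payload_into_bucket dumps (or str()-wraps) it again, S3 ends up holding a JSON *string literal* rather than the array. A minimal sketch with a hypothetical payload:

```python
import json

payload = [{"user": "alice", "role": "reader"}]

once = json.dumps(payload)   # '[{"user": "alice", "role": "reader"}]'
twice = json.dumps(once)     # the whole document wrapped in quotes, inner quotes escaped

# Parsing the singly-encoded body recovers the list...
assert isinstance(json.loads(once), list)

# ...but the doubly-encoded body parses to a string, and needs a second
# json.loads() to get the original data back.
assert isinstance(json.loads(twice), str)
assert json.loads(json.loads(twice)) == payload
```

So the fix in the accepted answer amounts to: serialize exactly once, right before the put_object call.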