1 Answer
The most common cause of this issue is that the execution role is missing the permissions needed to create CloudWatch log groups and log streams.
Here is a permissions snippet from a notebook's execution role whose logs publish successfully:
{
"Action": "logs:PutLogEvents",
"Effect": "Allow",
"Resource": "<<ARN_NO>>",
"Sid": "Logs"
},
{
"Action": [
"logs:DescribeLogStreams",
"logs:CreateLogStream",
"logs:CreateLogGroup"
],
"Effect": "Allow",
"Resource": "<<ARN_NO>>",
"Sid": "Logs2"
}
Please review the IAM statements above and adapt them to your requirements.
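As a minimal sketch, the two statements above can be assembled into a complete policy document like the one below. The `LOG_GROUP_ARN` value is a placeholder (standing in for `<<ARN_NO>>` in the snippet), not a real ARN; substitute the ARN scope appropriate for your account.

```python
import json

# Placeholder ARN -- substitute your region, account ID, and log group scope.
LOG_GROUP_ARN = "arn:aws:logs:REGION:ACCOUNT_ID:log-group:*"

# Full policy document wrapping the two statements from the answer above.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Logs",
            "Effect": "Allow",
            "Action": "logs:PutLogEvents",
            "Resource": LOG_GROUP_ARN,
        },
        {
            "Sid": "Logs2",
            "Effect": "Allow",
            "Action": [
                "logs:DescribeLogStreams",
                "logs:CreateLogStream",
                "logs:CreateLogGroup",
            ],
            "Resource": LOG_GROUP_ARN,
        },
    ],
}

print(json.dumps(policy, indent=2))
```

This document can then be attached to the execution role as an inline or managed policy through the IAM console or CLI.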
answered 3 years ago
Hello @Ram_AWS, thank you for your answer. I thought that would be the cause, but it seems the script inside the processing step is hitting an issue while reading a CSV file: /opt/ml/processing/input/code/processing.py:153: DtypeWarning: Columns (14) have mixed types. Specify dtype option on import or set low_memory=False. df_raw = pd.read_csv(f"{base_dir}/input/raw_df.csv").
It turns out that whenever I try to fix this by adding low_memory=False, the CloudWatch logs are not published. I am still looking for a solution to this CSV reading problem, and I would be glad if you have something to add here.
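For the DtypeWarning itself, an alternative to low_memory=False is to declare an explicit dtype for the offending column, as the warning message suggests. A minimal sketch (the column name "c" and the tiny inline CSV are hypothetical stand-ins for column 14 of raw_df.csv):

```python
import io
import pandas as pd

# A small CSV reproducing a mixed-type column: "c" holds both numbers
# and a string, the situation that triggers DtypeWarning during
# pandas' chunked type inference on large files.
csv_text = "a,b,c\n1,x,10\n2,y,oops\n3,z,30\n"

# Declaring the dtype for the offending column up front makes the read
# deterministic, avoiding both the warning and low_memory=False.
df = pd.read_csv(io.StringIO(csv_text), dtype={"c": str})

print(df["c"].tolist())  # every value in "c" is read as a string
```

With an explicit dtype, pandas no longer has to guess the column's type chunk by chunk, so the warning disappears regardless of file size.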