1 Answer
A common cause of this issue is that the execution role is missing the permissions needed to create CloudWatch log groups and log streams.
Below is a permissions snippet from a notebook's execution role whose logs are publishing successfully:
```json
{
    "Action": "logs:PutLogEvents",
    "Effect": "Allow",
    "Resource": "<<ARN_NO>>",
    "Sid": "Logs"
},
{
    "Action": [
        "logs:DescribeLogStreams",
        "logs:CreateLogStream",
        "logs:CreateLogGroup"
    ],
    "Effect": "Allow",
    "Resource": "<<ARN_NO>>",
    "Sid": "Logs2"
}
```
Please review the IAM statements above and modify them to fit your requirements.
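If it helps, here is a minimal boto3 sketch of attaching these statements to the execution role as an inline policy. The role name, policy name, and the `Resource` scope are assumptions (the original uses a redacted ARN); replace them with your own values and scope the resource down as needed.

```python
import json
import boto3

iam = boto3.client("iam")

# Hypothetical role name -- replace with your pipeline's execution role.
ROLE_NAME = "MySageMakerExecutionRole"

# Assumed Resource scope: a typical SageMaker log-group ARN pattern.
# Narrow this to the specific log groups your jobs write to.
LOGS_ARN = "arn:aws:logs:*:*:log-group:/aws/sagemaker/*"

logs_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Logs",
            "Effect": "Allow",
            "Action": "logs:PutLogEvents",
            "Resource": LOGS_ARN,
        },
        {
            "Sid": "Logs2",
            "Effect": "Allow",
            "Action": [
                "logs:DescribeLogStreams",
                "logs:CreateLogStream",
                "logs:CreateLogGroup",
            ],
            "Resource": LOGS_ARN,
        },
    ],
}

# Attach the statements as an inline policy on the execution role.
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="SageMakerCloudWatchLogs",
    PolicyDocument=json.dumps(logs_policy),
)
```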
Hello @Ram_AWS, thank you for your answer. I thought that this would be the result, but it seems that the script inside the processing step is facing an issue while reading a CSV file:

```
/opt/ml/processing/input/code/processing.py:153: DtypeWarning: Columns (14) have mixed types. Specify dtype option on import or set low_memory=False.
  df_raw = pd.read_csv(f"{base_dir}/input/raw_df.csv")
```

It turns out that whenever I try to solve this by adding `low_memory=False`, the CloudWatch logs are not published. I am still looking for a solution to this CSV reading problem, so I would be glad if you have something to add here.
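For reference, the two usual ways to address that DtypeWarning are sketched below. `base_dir` is from the original snippet; the column name `"col_14"` is a hypothetical stand-in for whatever the column pandas flagged (index 14) is actually called in your file.

```python
import pandas as pd

base_dir = "/opt/ml/processing"

# Option 1: read the whole file before inferring column types.
# Silences the DtypeWarning at the cost of higher memory use.
df_raw = pd.read_csv(f"{base_dir}/input/raw_df.csv", low_memory=False)

# Option 2: pin the dtype of the mixed-type column explicitly.
# "col_14" is an assumed name -- use the real header of the flagged column.
df_raw = pd.read_csv(f"{base_dir}/input/raw_df.csv", dtype={"col_14": str})
```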