You are still missing some permissions needed to write working-directory files. The location is shown under
Glue job > Details tab > Advanced > Temporary path
These are the permissions you would need:
{
    "Action": [
        "s3:Abort*",
        "s3:DeleteObject*",
        "s3:GetBucket*",
        "s3:GetObject*",
        "s3:List*",
        "s3:PutObject",
        "s3:PutObjectLegalHold",
        "s3:PutObjectRetention",
        "s3:PutObjectTagging",
        "s3:PutObjectVersionTagging"
    ],
    "Resource": [
        "arn:aws:s3:::glue-assets-xxxxxxx",
        "arn:aws:s3:::glue-assets-xxxxxxx/*"
    ],
    "Effect": "Allow"
}
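If you manage the job role programmatically, a statement like the one above can be attached as an inline policy with boto3. This is a minimal sketch; the role name, policy name, and bucket name below are placeholders, not values from this thread:

```python
import json


def glue_workdir_policy(bucket: str) -> dict:
    """Build the S3 policy document the Glue Ray job role needs
    for its script / working-directory bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "s3:Abort*",
                    "s3:DeleteObject*",
                    "s3:GetBucket*",
                    "s3:GetObject*",
                    "s3:List*",
                    "s3:PutObject",
                    "s3:PutObjectLegalHold",
                    "s3:PutObjectRetention",
                    "s3:PutObjectTagging",
                    "s3:PutObjectVersionTagging",
                ],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }


# Attach it as an inline policy on the job's role, e.g.:
# import boto3
# boto3.client("iam").put_role_policy(
#     RoleName="my-glue-ray-role",           # hypothetical role name
#     PolicyName="glue-ray-workdir-access",  # hypothetical policy name
#     PolicyDocument=json.dumps(glue_workdir_policy("glue-assets-xxxxxxx")),
# )
```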
Hi, thanks! This helped me get at the issue.
The solution is that all Glue Ray jobs need the Put access above for the location of the script. My job was failing because my stack setup only allowed Get access to the bucket where the Glue scripts are stored.
If your script is stored at s3://{script_base_path}/my_script.py, Glue Ray seems to want to put some metadata objects at
s3://{script_base_path}/jobs/{job_name}/{job_run_id}/job-result/metadata
s3://{script_base_path}/jobs/{job_name}/{job_run_id}/job-result/result
s3://{script_base_path}/jobs/{job_name}/{job_run_id}/job-result/stderr
s3://{script_base_path}/jobs/{job_name}/{job_run_id}/job-result/stdout
at the end of every job run.
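As a sketch, the expected keys can be built from those path components and checked after a run. The helper name and the bucket/prefix values in the commented check are illustrative, not from the thread:

```python
def job_result_keys(script_base_path: str, job_name: str, job_run_id: str) -> list:
    """Object keys Glue Ray appears to write under the script base path
    at the end of a job run (names taken from the observed behaviour above)."""
    base = f"{script_base_path}/jobs/{job_name}/{job_run_id}/job-result"
    return [f"{base}/{name}" for name in ("metadata", "result", "stderr", "stdout")]


# To verify after a run (bucket and prefix are placeholders), e.g.:
# import boto3
# resp = boto3.client("s3").list_objects_v2(
#     Bucket="glue-assets-xxxxxxx",
#     Prefix="scripts/jobs/my-job/jr_0123456789abcdef/job-result/",
# )
# print([obj["Key"] for obj in resp.get("Contents", [])])
```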
This does not happen for other types of Glue jobs.
This location does not seem to be configurable to anything other than the "script base path", as the answer indicated.