4 Answers
Hello
The robots.txt file is only a convention, not something crawlers are required to honor, so it may help, but it cannot force Yandex to stop.
You can create a bucket policy that denies access to an IP address or IP address range.
Here is an example:
https://aws.amazon.com/premiumsupport/knowledge-center/block-s3-traffic-vpc-ip/
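If you still want to ask Yandex to stop crawling before blocking it at the bucket level, a robots.txt object at the root of the bucket could look like the sketch below. `Yandex` is the user-agent token Yandex documents for its crawlers; compliance is voluntary, as noted above.

```
User-agent: Yandex
Disallow: /
```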
Hope this helps.
Rt
answered 5 years ago
Thank you so much.
However, the example you gave blocks all access and then allows only specific IP addresses. I want the opposite: public access, but with specific IP addresses blocked.
Do you know of JSON for a policy that does this?
Thanks again.
answered 5 years ago
Hello
Sorry, that link should have had a bit more info.
A small modification to the other example gives you what you need:
{
    "Version": "2012-10-17",
    "Id": "S3PolicyId1",
    "Statement": [
        {
            "Sid": "IPAllow",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::examplebucket/*",
            "Condition": {
                "IpAddress": {"aws:SourceIp": "54.240.143.24/32"}
            }
        }
    ]
}
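One note on the policy above: a Deny statement on its own blocks the listed IP range, but it does not make the bucket publicly readable; an Allow statement is still needed for everyone else, and in IAM evaluation an explicit Deny always wins over an Allow. A sketch combining both (the bucket name and IP range are the placeholders from the example, not real values):

```
{
    "Version": "2012-10-17",
    "Id": "PublicExceptBlockedIPs",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::examplebucket/*"
        },
        {
            "Sid": "DenyBlockedIPs",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::examplebucket/*",
            "Condition": {
                "IpAddress": {"aws:SourceIp": "54.240.143.24/32"}
            }
        }
    ]
}
```

To block several ranges at once, `aws:SourceIp` also accepts a JSON array of CIDR blocks in the same condition.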
RT
answered 5 years ago