Is there a way to batch-check the existence of specific object versions in AWS S3?


I am writing an R package which needs to check the existence of a specific version of each AWS S3 object in its data store. The version of a given object is the version ID recorded in the local metadata, and the recorded version may or may not be the most current version in the bucket. Currently, the package accomplishes this by sending a HEAD request for each relevant object-version pair.

Is there a more efficient, batched way to do this for each key/version pair? list_object_versions() returns every version of every object of interest (far too many to download efficiently), and neither list_objects() nor list_objects_v2() returns any version IDs at all. It would be great to have something like delete_objects() that, instead of deleting the objects, accepted the supplied key-version pairs and returned the ETag and custom metadata of each one that exists.
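For reference, here is roughly what the per-pair check does today, written as a Python/boto3-style sketch rather than the actual R code (function names like check_versions are mine, not an AWS API); since S3 has no server-side batch HEAD, the best I have found is to issue the HEAD requests concurrently:

```python
# Client-side "batching" of per-version existence checks: one HEAD request
# per (key, version_id) pair, run concurrently. The s3 client is passed in,
# so any boto3-compatible client (or a stub) works.
from concurrent.futures import ThreadPoolExecutor


def head_one(s3, bucket, key, version_id):
    """Return (key, version_id, metadata) if that exact version exists, else None."""
    try:
        resp = s3.head_object(Bucket=bucket, Key=key, VersionId=version_id)
        return key, version_id, {"ETag": resp["ETag"], "Metadata": resp["Metadata"]}
    except s3.exceptions.ClientError as err:
        # A missing key or version surfaces as a 404-style error.
        if err.response["Error"]["Code"] in ("404", "NoSuchKey", "NoSuchVersion"):
            return None
        raise


def check_versions(s3, bucket, pairs, max_workers=16):
    """HEAD each (key, version_id) pair concurrently; keep the pairs that exist."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(lambda p: head_one(s3, bucket, *p), pairs)
    return [r for r in results if r is not None]
```

This cuts wall-clock time but not the request count, which is exactly what I was hoping a batch API would avoid.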

r-user
Asked 6 months ago · Viewed 294 times
1 Answer

Hello, you may have already tried the prefix parameter of the list_object_versions call; it can be used to filter the results down if the prefix is known or common.

CLI example:

$ aws s3api list-object-versions --bucket EXAMPLE-BUCKET --prefix EXAMPLE-PREFIX

If you know the object key and version ID, you can call get-object-attributes directly with the object-attributes parameter. This will return the values specified in object-attributes, along with the LastModified and VersionId values.

CLI example:

$ aws s3api get-object-attributes --bucket EXAMPLE-BUCKET --key EXAMPLE-PREFIX/OBJECT.html --object-attributes "ETag" 
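In SDK terms, looping GetObjectAttributes over the recorded pairs might look like the sketch below. It is in Python/boto3 style (your R package would use the analogous call in its S3 client); the client is passed in, and treating any ClientError as "version missing" is a simplifying assumption you may want to refine:

```python
# Hedged sketch: resolve each recorded (key, version_id) pair to its ETag
# via GetObjectAttributes, or None if the request fails (e.g. the version
# no longer exists). Still one request per pair, but a light one.
def attributes_for_pairs(s3, bucket, pairs):
    """Return {(key, version_id): ETag-or-None} for each recorded pair."""
    out = {}
    for key, version_id in pairs:
        try:
            resp = s3.get_object_attributes(
                Bucket=bucket,
                Key=key,
                VersionId=version_id,
                ObjectAttributes=["ETag"],
            )
            out[(key, version_id)] = resp["ETag"]
        except s3.exceptions.ClientError:
            out[(key, version_id)] = None
    return out
```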

Hope this somewhat helped.

AWS
Gary_S
Answered 6 months ago
