Is there a way to batch-check the existence of specific object versions in AWS S3?


I am writing an R package which needs to check the existence of a specific version of each AWS S3 object in its data store. The version of a given object is the version ID recorded in the local metadata, and the recorded version may or may not be the most current version in the bucket. Currently, the package accomplishes this by sending a HEAD request for each relevant object-version pair.
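(For reference, each of these per-object checks is roughly equivalent to the following CLI call, where the EXAMPLE-* values are placeholders; head-object errors out if the key or version ID does not exist.)

$ aws s3api head-object --bucket EXAMPLE-BUCKET --key EXAMPLE-KEY --version-id EXAMPLE-VERSION-ID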

Is there a more efficient/batched way to do this for each version/object pair? list_object_versions() returns every version of every object of interest, which is way too many versions to download efficiently, and neither list_objects() nor list_objects_v2() returns any version IDs at all. It would be great to have something like delete_objects() that, instead of deleting the objects, accepts the supplied key-version pairs and returns the ETag and custom metadata of each one that exists.

r-user
Asked 6 months ago · 294 views
1 Answer

Hello, you may have already tried the prefix parameter of the list_object_versions call, which can be used to filter down the results if the prefix is known or common.

CLI example:

$ aws s3api list-object-versions --bucket EXAMPLE-BUCKET --prefix EXAMPLE-PREFIX

If you know the object and version ID, you can directly call get-object-attributes with the object-attributes parameter. This will return the values specified in the object-attributes parameter, along with the LastModified and VersionId values.

CLI example:

$ aws s3api get-object-attributes --bucket EXAMPLE-BUCKET --key EXAMPLE-PREFIX/OBJECT.html --version-id EXAMPLE-VERSION-ID --object-attributes "ETag"
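Note that get-object-attributes still accepts only one key and version ID per call, so checking many pairs means one request per pair. A rough sketch of looping over a hypothetical key-version-pairs.csv file (one KEY,VERSION_ID pair per line) would be:

$ while IFS=, read -r KEY VERSION_ID; do
    aws s3api get-object-attributes --bucket EXAMPLE-BUCKET --key "$KEY" \
      --version-id "$VERSION_ID" --object-attributes "ETag" \
      || echo "missing: $KEY @ $VERSION_ID"
  done < key-version-pairs.csv

The call exits non-zero when the key or version does not exist, so the || branch flags the missing pairs.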

Hope this somewhat helped.

AWS
Gary_S
Answered 6 months ago
