For what it's worth, I managed to resolve this after contacting AWS support by adding "--provenance=false" together with "--output type=docker" as arguments to the docker buildx build command (in our case done via the aws-ecr CircleCI orb, using "extra_build_args", but adding "--provenance=false" may help in other build environments too).
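A minimal sketch of the build invocation described above; the image name and tag are hypothetical placeholders, but the two flags are the ones that resolved the issue:

```shell
# --provenance=false disables the BuildKit provenance attestation, which
# otherwise forces an OCI image index; --output type=docker exports the
# result as a Docker-format image. Image name/tag are placeholders.
docker buildx build \
  --provenance=false \
  --output type=docker \
  -t my-sagemaker-image:latest \
  .
```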
Thank you for your question about SageMaker's support for OCI manifests. You're correct that this has been a challenge for some users, particularly when working with MLflow integrations.
Currently, SageMaker's CreateModel operation primarily supports the Docker V2 manifest media type, specifically "application/vnd.docker.distribution.manifest.v2+json". While Amazon Elastic Container Registry (ECR) does support OCI specifications, SageMaker's model creation process is not yet fully compatible with OCI manifests.
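If you want to check which manifest type ECR actually stored for your image before attempting CreateModel, the ECR BatchGetImage API reports it. The repository name and tag below are hypothetical placeholders:

```shell
# Prints the manifest media type ECR holds for the image. SageMaker expects
# application/vnd.docker.distribution.manifest.v2+json; an OCI-built image
# will typically show an OCI media type instead.
aws ecr batch-get-image \
  --repository-name my-model-repo \
  --image-ids imageTag=latest \
  --query 'images[].imageManifestMediaType'
```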
To work around this limitation, you have a couple of options:
- When building your Docker image, specify a build option that forces Docker-format output (for example, "--output type=docker" together with "--provenance=false" on docker buildx, or the equivalent format flag in your build tool) so the image uses a manifest type that SageMaker's CreateModel operation supports. This converts what would otherwise be OCI-format output into a compatible Docker V2 image.
- You could consider rebuilding your images with a build process that explicitly produces a Docker V2 format image. This might involve adjusting your build pipeline or the way MLflow generates the images.
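The second option can be sketched as follows. All names (image tags, account ID, region, repository) are hypothetical placeholders; the key point is that the legacy builder (BuildKit disabled) emits a Docker V2 schema 2 manifest rather than an OCI one:

```shell
# Rebuild with the classic builder so the pushed image carries a
# Docker V2 manifest, then tag and push to ECR. All names are placeholders.
DOCKER_BUILDKIT=0 docker build -t my-model:v2 .
docker tag my-model:v2 \
  123456789012.dkr.ecr.us-east-1.amazonaws.com/my-model:v2
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-model:v2
```

Disabling BuildKit is a blunt instrument (you lose features like build secrets and cache mounts), so preferring the buildx flags from the first option is usually the better trade-off when your tooling supports them.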
It's worth noting that SageMaker is continually evolving, and there's a possibility that full OCI manifest support may be added in the future. However, as of now, you'll need to work within the constraints of the Docker V2 manifest format for SageMaker endpoints.
If this limitation significantly impacts your workflow, it would be valuable to provide feedback to AWS support or through official AWS channels. This can help prioritize potential future enhancements to SageMaker's container image support.
In the meantime, focusing on ensuring your images are in the Docker V2 format should allow you to successfully deploy your models to SageMaker endpoints, even if it requires some additional steps in your workflow.
Sources
Manifest error when staging ML model | AWS re:Post
Container image compatibility - Amazon SageMaker AI

sir... i want to cry rn. this solved my issue. i have the most recent version of docker (see date) and i have tried for over 12 hours now to get my BentoML service deployed on SageMaker. I have tried multiple GPT code editors and gone back and forth with Docker's own documentation AI. THIS IS THE CORRECT ANSWER TO GET A NON-OCI MANIFEST working. thank you.