Hello,
Thank you for using Amazon SageMaker.
At the moment, [production-variant-name]/[instance-id]/data-log is the only log stream that Amazon SageMaker provides for asynchronous endpoints.
I have raised a feature request on your behalf to include the model container logs for async endpoints. While I can't comment on if or when this feature may be released, I'd suggest keeping an eye on our What's New and Blog pages for any new feature announcements.
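If it helps to verify which streams actually exist for a given endpoint, here is a minimal boto3 sketch, assuming the standard /aws/sagemaker/Endpoints/&lt;endpoint-name&gt; log group and a placeholder endpoint name:

```python
import boto3

logs = boto3.client("logs")

# Placeholder endpoint name; SageMaker endpoint logs live under
# the /aws/sagemaker/Endpoints/<endpoint-name> log group.
endpoint_name = "my-async-endpoint"

response = logs.describe_log_streams(
    logGroupName=f"/aws/sagemaker/Endpoints/{endpoint_name}",
    orderBy="LastEventTime",
    descending=True,
)

# Expect [variant]/[instance-id] streams for the container and a
# [variant]/[instance-id]/data-log stream for the async queue layer.
for stream in response["logStreams"]:
    print(stream["logStreamName"])
```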
I have active asynchronous inference endpoints for which both [production-variant-name]/[instance-id] (the endpoint logs) and [production-variant-name]/[instance-id]/data-log (the queue orchestration logs) are present, so I believe the other answer can't be correct... However, my endpoints aren't using the Triton container.
I'd suggest double-checking for possible permissions errors that could be preventing your endpoint from creating the relevant CloudWatch log streams, exploring any other factors that might be hiding logs (e.g. the configured log level, or other Triton settings), and maybe setting the PYTHONUNBUFFERED environment variable and adding some additional print() statements if possible, to be sure.
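For reference, a rough sketch of setting that environment variable on the model's container with boto3 (the model name, role ARN, image URI, and S3 path below are all placeholders):

```python
import boto3

sm = boto3.client("sagemaker")

# All names, ARNs, and URIs here are placeholders; substitute your own.
sm.create_model(
    ModelName="my-async-model-unbuffered",
    ExecutionRoleArn="arn:aws:iam::111122223333:role/MySageMakerExecutionRole",
    PrimaryContainer={
        "Image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference-image:latest",
        "ModelDataUrl": "s3://my-bucket/model/model.tar.gz",
        "Environment": {
            # Force Python to flush stdout/stderr immediately so print()
            # output reaches CloudWatch without buffering delays.
            "PYTHONUNBUFFERED": "TRUE",
        },
    },
)
```

Since an existing model can't be modified in place, you'd then point a new endpoint config at this model and update the endpoint for it to take effect.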
Thanks @Marta_M, @Alex_T, I followed your suggestions. But I'm using the exact same model: invoked as batch Transform via create_transform_job I get the [production-variant-name]/[instance-id]/data-log stream, but not when calling invoke_endpoint_async, and I don't see anywhere obvious where one mode could set the logging level differently from the other.
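For context, the async invocation is essentially the standard call below (the endpoint name and S3 locations are placeholders), and I don't see an obvious parameter there that could change the container's logging behavior:

```python
import boto3

smr = boto3.client("sagemaker-runtime")

# Placeholder endpoint name and S3 input location.
response = smr.invoke_endpoint_async(
    EndpointName="my-async-endpoint",
    InputLocation="s3://my-bucket/async-inputs/payload.json",
    ContentType="application/json",
)

# The call returns immediately; the result is written to OutputLocation
# once the container has processed the request.
print(response["OutputLocation"], response["InferenceId"])
```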
One additional friction point I've encountered is that SageMaker in general doesn't appear to log anything written to stdout/stderr until the endpoint is in service, which means any messages created by, say, an entrypoint.sh script that starts the server in the container aren't captured anywhere that I've seen.