Questions tagged with Amazon SageMaker Pipelines

Amazon SageMaker Pipelines is the first purpose-built, easy-to-use continuous integration and continuous delivery (CI/CD) service for machine learning (ML). With SageMaker Pipelines, you can create, automate, and manage end-to-end ML workflows at scale.


74 results
I am trying to train a SageMaker built-in KMeans model on data stored in RecordIO-Protobuf format, using the Pipe input mode. However, the training job fails with the following error: ``` UnexpectedSt...
1 answer · 0 votes · 22 views · asked 10 days ago
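For context on the question above: a hedged sketch of how a built-in KMeans training job is commonly pointed at RecordIO-Protobuf data in Pipe mode with the SageMaker Python SDK. The bucket prefix, role ARN, and hyperparameter values are placeholders, not details from the question.

```
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
region = session.boto_region_name
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder

# Built-in KMeans container image for this region.
image_uri = sagemaker.image_uris.retrieve("kmeans", region)

kmeans = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    input_mode="Pipe",               # stream the data instead of copying it to disk
    sagemaker_session=session,
)
kmeans.set_hyperparameters(k=10, feature_dim=50, mini_batch_size=500)  # placeholder values

# The channel must declare the protobuf content type, otherwise the algorithm
# can fail while parsing the stream.
train_input = TrainingInput(
    s3_data="s3://my-bucket/kmeans/train/",               # placeholder prefix
    content_type="application/x-recordio-protobuf",
    input_mode="Pipe",
    s3_data_type="S3Prefix",
)

kmeans.fit({"train": train_input})
```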
Hi, I am using a SageMaker TrainingJob and it fails when it tries to upload the model artifact to a bucket that has Object Lock enabled. It throws this error: ClientError: Artifact upload failed:Error 7:...
1 answer · 0 votes · 103 views · AWS · asked 4 months ago
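One workaround sometimes suggested for the Object Lock situation above is to route the training output to a bucket that does not have Object Lock enabled, since the artifact upload may not supply the integrity headers a locked bucket demands. A minimal sketch with a generic estimator; the image URI, role, and bucket names are placeholders.

```
from sagemaker.estimator import Estimator

# Hypothetical estimator configuration: send the model artifact to a bucket
# without S3 Object Lock, leaving the locked bucket for other data.
estimator = Estimator(
    image_uri="<training-image-uri>",                 # placeholder
    role="<execution-role-arn>",                      # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
    # output_path controls where model.tar.gz is uploaded after training.
    output_path="s3://artifacts-bucket-without-object-lock/training-output/",
)
```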
Hello, I am creating the pipeline using the AWS SageMaker Studio visual editor. I have the input file placed in S3 and pointed to the container path "/opt/ml/processing/input", and I am also saving the output i...
1 answer · 0 votes · 85 views · asked 5 months ago
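For the visual-editor question above, this is roughly how the same input/output wiring looks in the SageMaker Python SDK, which can help confirm that the container paths match what the script actually reads and writes. A hedged sketch; the S3 URIs, role, and script name are placeholders.

```
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.processing import ProcessingInput, ProcessingOutput
from sagemaker.workflow.steps import ProcessingStep

processor = SKLearnProcessor(
    framework_version="1.2-1",
    role="<execution-role-arn>",          # placeholder
    instance_type="ml.m5.xlarge",
    instance_count=1,
)

step_process = ProcessingStep(
    name="Preprocess",
    processor=processor,
    code="preprocess.py",                  # local script, uploaded by the SDK
    inputs=[
        ProcessingInput(
            source="s3://my-bucket/input/data.csv",   # placeholder S3 input
            destination="/opt/ml/processing/input",    # path the script reads from
        )
    ],
    outputs=[
        ProcessingOutput(
            source="/opt/ml/processing/output",        # path the script writes to
            destination="s3://my-bucket/output/",      # placeholder S3 output
            output_name="processed",
        )
    ],
)
```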
I am following these docs (https://docs.aws.amazon.com/sagemaker/latest/dg/processing-container-run-scripts.html) to Run a script with my own processing container (I need to download a few custom pack...
1 answer · 0 votes · 296 views · asked 9 months ago
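For the custom-container question above, the usual pattern is to bake the extra packages into the image and hand the script to a ScriptProcessor at run time. A sketch under those assumptions; the ECR URI, role, and paths are placeholders.

```
from sagemaker.processing import ScriptProcessor, ProcessingInput, ProcessingOutput

# Custom image pushed to ECR that already has the extra packages installed
# (e.g. via pip install in its Dockerfile). URI and role are placeholders.
script_processor = ScriptProcessor(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-processing-image:latest",
    command=["python3"],                  # how the injected script is invoked
    role="<execution-role-arn>",
    instance_type="ml.m5.xlarge",
    instance_count=1,
)

script_processor.run(
    code="my_script.py",                  # uploaded to /opt/ml/processing/input/code/
    inputs=[ProcessingInput(source="s3://my-bucket/raw/", destination="/opt/ml/processing/input")],
    outputs=[ProcessingOutput(source="/opt/ml/processing/output")],
)
```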
I am trying to launch a SageMaker pipeline and running into an issue where the container cannot detect the py script that's being launched. Basic setup: - A Docker container that's been registered i...
1 answer · 0 votes · 267 views · asked 9 months ago
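Related to the question above: when a ProcessingStep is given a `code=` argument, the SDK stages the script under /opt/ml/processing/input/code/ and runs it with the processor's `command`, so an image ENTRYPOINT that overrides this invocation is a common reason the script appears "not found". A hedged sketch of the wiring; the image URI, role, and names are placeholders.

```
from sagemaker.processing import ScriptProcessor
from sagemaker.workflow.steps import ProcessingStep
from sagemaker.workflow.pipeline import Pipeline

# Placeholder ECR image; it should not define an ENTRYPOINT that prevents
# SageMaker from running `python3 /opt/ml/processing/input/code/run_job.py`.
processor = ScriptProcessor(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest",
    command=["python3"],
    role="<execution-role-arn>",
    instance_type="ml.m5.xlarge",
    instance_count=1,
)

step = ProcessingStep(
    name="RunScript",
    processor=processor,
    code="run_job.py",   # local path; the SDK uploads it and mounts it in the container
)

pipeline = Pipeline(name="my-pipeline", steps=[step])   # placeholder pipeline name
```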
Hi all, is it possible to incorporate SageMaker Data Wrangler as a step in SageMaker Pipelines? So that every time the SageMaker Pipeline gets triggered, it starts with a SageMaker Data Wrangler job fi...
2 answers · 0 votes · 339 views · AWS · asked a year ago
I am currently working on a k-means clustering algorithm for my dataset. Currently what I have done is to create a preprocess.py that preprocesses my data and stores it in an S3 bucket, and a train step fu...
1 answer · 0 votes · 862 views · asked a year ago
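For the k-means pipeline question above, a common way to chain the preprocessing output into the training step is to reference the processing step's output properties. A hedged sketch that assumes a `step_process` whose preprocess.py wrote an output channel named `train` and a `kmeans` estimator defined earlier; the names and content type are placeholders.

```
from sagemaker.inputs import TrainingInput
from sagemaker.workflow.steps import TrainingStep
from sagemaker.workflow.pipeline import Pipeline

# `step_process` is assumed to be a ProcessingStep with an output named "train";
# `kmeans` is assumed to be the KMeans (or generic) estimator from earlier.
train_data_uri = step_process.properties.ProcessingOutputConfig.Outputs["train"].S3Output.S3Uri

step_train = TrainingStep(
    name="TrainKMeans",
    estimator=kmeans,
    inputs={
        "train": TrainingInput(
            s3_data=train_data_uri,
            content_type="application/x-recordio-protobuf",  # or text/csv, depending on preprocess.py
        )
    },
)

pipeline = Pipeline(name="kmeans-pipeline", steps=[step_process, step_train])
```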
Does SageMaker Pipelines support sending the success/failure status of a pipeline execution? I want to see if I can trigger a notification when the pipeline completes, either with success or failure.
1 answer · 0 votes · 834 views · asked a year ago
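For the notification question above: SageMaker Pipelines publishes execution status changes to EventBridge, so one common approach is a rule that forwards Succeeded/Failed events to an SNS topic (or another notifier of your choice). A sketch with boto3; the rule name, pipeline ARN, and topic ARN are placeholders.

```
import json
import boto3

events = boto3.client("events")

# Match pipeline execution status changes for a specific pipeline; this is the
# detail-type EventBridge uses for SageMaker Pipelines executions.
event_pattern = {
    "source": ["aws.sagemaker"],
    "detail-type": ["SageMaker Model Building Pipeline Execution Status Change"],
    "detail": {
        "currentPipelineExecutionStatus": ["Succeeded", "Failed"],
        "pipelineArn": [{"prefix": "arn:aws:sagemaker:us-east-1:123456789012:pipeline/my-pipeline"}],  # placeholder
    },
}

events.put_rule(
    Name="my-pipeline-status-rule",            # placeholder rule name
    EventPattern=json.dumps(event_pattern),
    State="ENABLED",
)

events.put_targets(
    Rule="my-pipeline-status-rule",
    Targets=[{
        "Id": "notify-sns",
        "Arn": "arn:aws:sns:us-east-1:123456789012:pipeline-notifications",  # placeholder SNS topic
    }],
)
```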
I'm a complete noob with SageMaker, coming here with AzureML experience. I was very comfortable with and liked building ML pipelines with the CLI in AzureML. I've found that SageMaker has a similar pipel...
1 answer · 0 votes · 354 views · asked a year ago
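For the AzureML comparison above, the rough SDK counterpart of driving pipelines from the CLI is to define the pipeline in Python, upsert it, and start an execution (the same operations are also exposed as `aws sagemaker` CLI commands such as create-pipeline and start-pipeline-execution). A sketch assuming a list of already-defined `steps`; the pipeline name and role are placeholders.

```
from sagemaker.workflow.pipeline import Pipeline

# `steps` is assumed to be a list of previously defined pipeline steps.
pipeline = Pipeline(name="my-pipeline", steps=steps)

# Create or update the pipeline definition, then kick off a run.
pipeline.upsert(role_arn="<execution-role-arn>")       # placeholder role
execution = pipeline.start()
execution.wait()                                        # block until the run finishes
print(execution.describe()["PipelineExecutionStatus"])
```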
I am using the below code in a SageMaker pipeline (pipeline.py) to register the model in step_register: model = Model( image_uri=container, model_data=step_training.properties.ModelArtifacts.S3Model...
1 answer · 0 votes · 354 views · asked a year ago
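For the registration question above, one way this is commonly wired with the current SDK is to build the register arguments from the Model and wrap them in a ModelStep. A hedged sketch that assumes `container` and `step_training` exist as in the question and that the pipeline uses a PipelineSession; the package group, instance types, and role are placeholders.

```
from sagemaker.model import Model
from sagemaker.workflow.model_step import ModelStep
from sagemaker.workflow.pipeline_context import PipelineSession

pipeline_session = PipelineSession()

# `container` and `step_training` are assumed to exist, as in the question.
model = Model(
    image_uri=container,
    model_data=step_training.properties.ModelArtifacts.S3ModelArtifacts,
    role="<execution-role-arn>",          # placeholder
    sagemaker_session=pipeline_session,    # so .register() returns step arguments
)

register_args = model.register(
    content_types=["text/csv"],
    response_types=["text/csv"],
    inference_instances=["ml.m5.large"],
    transform_instances=["ml.m5.large"],
    model_package_group_name="my-model-group",   # placeholder group
    approval_status="PendingManualApproval",
)

step_register = ModelStep(name="RegisterModel", step_args=register_args)
```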
I have two already-trained ("bring your own model") PyTorch models, which I want to run sequentially in a PipelineModel deployed as one endpoint. Each individual model can be loaded and deployed runni...
0 answers · 0 votes · 103 views · asked a year ago
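For the serial-inference question above, two working models are usually combined by wrapping them in a PipelineModel and deploying that as a single endpoint. A sketch with placeholder artifacts, inference scripts, and framework versions; note that the two containers need to agree on the intermediate content type passed between them.

```
from sagemaker.pytorch import PyTorchModel
from sagemaker.pipeline import PipelineModel

role = "<execution-role-arn>"                     # placeholder

model_a = PyTorchModel(
    model_data="s3://my-bucket/model-a/model.tar.gz",   # placeholder artifact
    role=role,
    entry_point="inference_a.py",                       # placeholder inference script
    framework_version="2.1",
    py_version="py310",
)

model_b = PyTorchModel(
    model_data="s3://my-bucket/model-b/model.tar.gz",   # placeholder artifact
    role=role,
    entry_point="inference_b.py",
    framework_version="2.1",
    py_version="py310",
)

# The endpoint invokes model_a first and feeds its response to model_b.
pipeline_model = PipelineModel(name="two-stage-pytorch", models=[model_a, model_b], role=role)
pipeline_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
    endpoint_name="two-stage-endpoint",   # placeholder endpoint name
)
```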
A SageMaker Pipeline I built using a Jupyter notebook in SageMaker Studio has a SageMaker Processing Job step. However, the step fails within 5 minutes with the following message: `ClientError: Failed...
2 answers · 0 votes · 718 views · asked a year ago
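For the failing ProcessingStep above, the truncated ClientError is usually spelled out in full in the processing job's FailureReason and its CloudWatch logs. A small boto3 sketch for pulling that out; the job name is a placeholder taken from the pipeline execution's step details in Studio or from list_pipeline_execution_steps().

```
import boto3

sm = boto3.client("sagemaker")

# Placeholder processing job name created by the pipeline execution.
job = sm.describe_processing_job(ProcessingJobName="pipelines-abc123-Preprocess-xyz")

print(job["ProcessingJobStatus"])
print(job.get("FailureReason", "no failure reason recorded"))
print(job["ProcessingInputs"])   # check that every S3 input URI actually exists
```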