All Content tagged with Amazon SageMaker Pipelines
Amazon SageMaker Pipelines is the first purpose-built, easy-to-use continuous integration and continuous delivery (CI/CD) service for machine learning (ML). With SageMaker Pipelines, you can create, automate, and manage end-to-end ML workflows at scale.
74 results
Hi,
I'm working on creating a SageMaker Pipeline with two processing phases in a Jupyter notebook. In the phases, my goal is to process files with a generative AI model.
When I run my pipeline crea...
I am trying to train a SageMaker built-in KMeans model on data stored in RecordIO-Protobuf format, using the Pipe input mode. However, the training job fails with the following error:
```
UnexpectedSt...
```
Hi, I am using a SageMaker TrainingJob and it fails when it tries to upload the model artifact to a bucket that has Object Lock enabled.
It throws this error:
ClientError: Artifact upload failed:Error 7:...
Hello,
I am creating the pipeline using the AWS SageMaker Studio visual editor. I have the input file placed in S3 and pointed to the container path "/opt/ml/processing/input", and the output I am also saving i...
I am following these docs (https://docs.aws.amazon.com/sagemaker/latest/dg/processing-container-run-scripts.html) to run a script with my own processing container (I need to download a few custom pack...
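For context, a minimal processing-container Dockerfile along the lines those docs describe might look like this (a sketch only; the package list and script path are illustrative assumptions, not the asker's actual setup):

```dockerfile
# Sketch of a custom SageMaker Processing image (package names are examples)
FROM python:3.10-slim
RUN pip3 install --no-cache-dir pandas scikit-learn
# SageMaker Processing mounts code passed via a ProcessingInput under
# /opt/ml/processing/input/code; the entrypoint then runs that script.
ENTRYPOINT ["python3", "/opt/ml/processing/input/code/preprocess.py"]
```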
I am trying to launch a SageMaker pipeline and running into an issue where the container cannot detect the .py script that's being launched.
Basic setup:
- A Docker container that's been registered i...
Hi all, is it possible to incorporate SageMaker Data Wrangler as a step in SageMaker Pipelines? So that every time the SageMaker Pipeline gets triggered, it starts with a SageMaker Data Wrangler job fi...
I am currently working on a k-means clustering algorithm for my dataset.
Currently what I have done is to create a preprocess.py that preprocesses my data and stores it in an S3 bucket, and a train step fu...
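As a side note on the preprocessing step itself: a typical preprocess.py normalizes features before k-means, since the algorithm is sensitive to feature scale. A minimal, dependency-free sketch of that idea (the function name and sample data are illustrative, not taken from the question):

```python
from statistics import mean, pstdev

def standardize(column):
    """Zero-mean, unit-variance scaling - a common step before k-means."""
    m = mean(column)
    s = pstdev(column) or 1.0  # guard against constant columns
    return [(x - m) / s for x in column]

scaled = standardize([1.0, 2.0, 3.0])
print(scaled)  # values sum to ~0 after centering
```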
Does SageMaker Pipelines support sending the success/failure of a pipeline execution? I want to see if I can trigger a notification when the pipeline completes, either with success or failure.
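SageMaker Pipelines publishes execution status changes to Amazon EventBridge, so a rule with an event pattern like the following can fire on terminal states (a sketch; the target you attach to the rule, e.g. an SNS topic, is up to you):

```json
{
  "source": ["aws.sagemaker"],
  "detail-type": ["SageMaker Model Building Pipeline Execution Status Change"],
  "detail": {
    "currentPipelineExecutionStatus": ["Succeeded", "Failed"]
  }
}
```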
I'm a complete noob with SageMaker, coming here with AzureML experience. I was very comfortable with and liked building ML pipelines with the CLI in AzureML. I've found that SageMaker has a similar pipel...
I am using the below code in a SageMaker pipeline (pipeline.py) to register the model in step_register:
```
model = Model(
    image_uri=container,
    model_data=step_training.properties.ModelArtifacts.S3Model...
```
I have two already-trained ("bring your own model") PyTorch models, which I want to run sequentially in a PipelineModel deployed as one endpoint. Each individual model can be loaded and deployed runni...