When I run a training job from SageMaker with the XGBoost algorithm, I get the error:
Traceback (most recent call last):
  File "/miniconda3/lib/python3.7/site-packages/sagemaker_containers/_trainer.py", line 84, in train
    entrypoint()
  File "/miniconda3/lib/python3.7/site-packages/sagemaker_xgboost_container/training.py", line 94, in main
    train(framework.training_env())
  File "/miniconda3/lib/python3.7/site-packages/sagemaker_xgboost_container/training.py", line 90, in train
    run_algorithm_mode()
  File "/miniconda3/lib/python3.7/site-packages/sagemaker_xgboost_container/training.py", line 68, in run_algorithm_mode
    checkpoint_config=checkpoint_config
  File "/miniconda3/lib/python3.7/site-packages/sagemaker_xgboost_container/algorithm_mode/train.py", line 121, in sagemaker_train
    validated_train_config = hyperparameters.validate(train_config)
  File "/miniconda3/lib/python3.7/site-packages/sagemaker_algorithm_toolkit/hyperparameter_validation.py", line 280, in validate
    raise exc.UserError("Extraneous hyperparameter found: {}".format(hp))
sagemaker_algorithm_toolkit.exceptions.UserError: Extraneous hyperparameter found: prob_buffer_row
I cannot delete the prob_buffer_row parameter, as it is set by default when the algorithm is defined. Am I doing something wrong, or is this a bug?
How are you creating the training job? If you are using the SageMaker Python SDK, can you share the Estimator definition you are using? Also, which version of the SageMaker XGBoost container are you using? I would suggest testing a different version of the SageMaker XGBoost container and noting whether the behavior changes.
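In the meantime, if the extraneous hyperparameter is being injected on the client side, one possible workaround is to strip it from the hyperparameter dictionary before the job is submitted. This is only a sketch: the dictionary below is hypothetical, and it assumes you are submitting the hyperparameters yourself (for example via boto3's `create_training_job`, which accepts them as a plain dict of strings) rather than having the console add them for you.

```python
# Hypothetical hyperparameters as they might be passed to the training job.
# "prob_buffer_row" is the key the XGBoost container rejects as extraneous;
# the other keys are placeholders for your real settings.
hyperparameters = {
    "objective": "binary:logistic",
    "num_round": "100",
    "prob_buffer_row": "1.0",
}

# Drop the unrecognized key before submitting; pop with a default is a no-op
# if the key is absent, so this is safe to run unconditionally.
hyperparameters.pop("prob_buffer_row", None)

print(hyperparameters)  # "prob_buffer_row" is gone; the rest are untouched
```

If the parameter is instead added by the console or the SDK itself and never appears in your own code, this workaround will not apply, which is why knowing how the job is created matters.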