GitHub / aws/sagemaker-python-sdk issues and pull requests
Labelled with: component: Inference APIs
#4869 - TorchServe inference.py incompatible with default PyTorch inference Transformer for UTF-8 Content Types
Issue - State: open - Opened by dillon-odonovan about 1 year ago
Labels: bug, PyTorch, component: Inference APIs
#4725 - "AttributeError: 'NoneType' object has no attribute 'len'" when deploying inference component
Issue - State: closed - Opened by goriri over 1 year ago
Labels: component: Inference APIs
#4234 - Setting `load_in_8bit=True` in `DJLModel` or `HuggingFaceAccelerateModel` forces you to set `dtype=int8` which is incorrect behaviour
Issue - State: open - Opened by maaquib about 2 years ago
Labels: bug, component: Inference APIs
#4130 - Valid JSONPath failing in QualityCheckStep
Issue - State: closed - Opened by vmatekole about 2 years ago
- 4 comments
Labels: bug, Pending information, component: pipelines, component: Inference APIs
#3997 - Transform job from Model does not pass role
Issue - State: open - Opened by shakedel over 2 years ago
Labels: bug, component: Inference APIs
#3987 - Add container_startup_health_check_timeout arg to Predictor.update_endpoint
Issue - State: open - Opened by stevenpitts over 2 years ago
Labels: type: feature request, component: Inference APIs
#3959 - Cannot schedule model quality job
Issue - State: closed - Opened by Nick-McElroy over 2 years ago
- 3 comments
Labels: bug, component: Inference APIs
#3491 - Make sourcedir.tar.gz and repacked model.tar.gz structure consistent
Issue - State: open - Opened by plienhar almost 3 years ago
Labels: bug, component: Inference APIs
#3470 - Support base_transform_job_name param in [Estimator|Model].transformer()
Issue - State: open - Opened by athewsey almost 3 years ago
Labels: type: feature request, component: Inference APIs
#2952 - SM Elastic Inference Accelerators are not available during inference
Issue - State: closed - Opened by vinayak-shanawad over 3 years ago
- 1 comment
Labels: bug, component: Inference APIs
#2732 - Support on p4d A100 GPUs for inference
Issue - State: closed - Opened by hackgoofer about 4 years ago
- 1 comment
Labels: type: feature request, component: Inference APIs
#2582 - S3UploadFailedError does Not get Classified as botocore.exceptions.ClientError when Client has Bad Permissions
Issue - State: closed - Opened by evakravi about 4 years ago
Labels: bug, component: Inference APIs
#2574 - How to increase inference timeout
Issue - State: closed - Opened by Harathi123 almost 7 years ago
- 22 comments
Labels: type: feature request, component: Inference APIs
#2274 - `run_baseline` API feature parity with `suggest_baseline`
Issue - State: closed - Opened by nigenda-amazon over 4 years ago
- 1 comment
Labels: type: feature request, component: Inference APIs
#1882 - need to be able to set return data size for each request in batch transform
Issue - State: open - Opened by ldong87 about 5 years ago
- 13 comments
Labels: type: feature request, component: Inference APIs
#1023 - write_sparse_tensor_to_spmatrix
Issue - State: closed - Opened by Jomonsugi about 6 years ago
- 6 comments
Labels: type: feature request, component: Inference APIs