GitHub / aws/sagemaker-tensorflow-serving-container: issues and pull requests
#231 - Add deprecation notice
Pull Request - State: closed - Opened by josephevans almost 2 years ago
#230 - fix port range error
Pull Request - State: open - Opened by Chen188 over 2 years ago
#229 - IndexError: list index out of range
Issue - State: open - Opened by Chen188 over 2 years ago
#228 - Feature: Support multiple inference.py files and universal inference.…
Pull Request - State: closed - Opened by sachanub almost 3 years ago - 53 comments
#227 - multi gpu support
Pull Request - State: closed - Opened by waytrue17 about 3 years ago - 11 comments
#226 - local tests are failing with connection refused
Issue - State: open - Opened by rbavery about 3 years ago
#225 - no module named boto3 when attempting to start tfserving 1.14 container
Issue - State: open - Opened by rbavery about 3 years ago - 1 comment
#224 - Swap order of processing json_request to prevent bad jsonlines request.
Pull Request - State: closed - Opened by matherit about 3 years ago - 3 comments
#223 - Docs on how to submit an invoke endpoint request for images, comparison to predict method
Issue - State: open - Opened by rbavery about 3 years ago - 1 comment
#222 - ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (502) from primary with message "<html> <head><title>502 Bad Gateway</title></head>
Issue - State: closed - Opened by jocelynbaduria about 3 years ago - 3 comments
#221 - Nginx timeouts
Pull Request - State: closed - Opened by matherit about 3 years ago - 8 comments
#220 - Add gunicorn timeout
Pull Request - State: closed - Opened by matherit over 3 years ago - 5 comments
#219 - expose gunicorn logging
Pull Request - State: closed - Opened by mseth10 over 3 years ago - 4 comments
#218 - Fix tfs start failure due to null version number
Pull Request - State: closed - Opened by satishpasumarthi over 3 years ago - 8 comments
#217 - Use js_import instead of js_include
Pull Request - State: closed - Opened by satishpasumarthi over 3 years ago - 20 comments
#216 - Tensorflow Serving Error: Tensor name: <> has no shape information
Issue - State: open - Opened by RajarshiBhadra over 3 years ago
#215 - Fix: Update universal scripts path for MME
Pull Request - State: open - Opened by satishpasumarthi over 3 years ago - 3 comments
#214 - fix: Use default model name when model name is None
Pull Request - State: open - Opened by saimidu over 3 years ago - 6 comments
#213 - sagemaker.tensorflow.serving.Model with input_handler is much slower than keras.model on GPU instance
Issue - State: open - Opened by biyer19 over 3 years ago
#212 - Pre-Postprocessing feature seems to not work anymore.
Issue - State: open - Opened by flacout about 4 years ago - 38 comments
#211 - Multi model mode not working, OSError: [Errno 30] Read-only file system: '/opt/ml/models/code'
Issue - State: open - Opened by kushappanavar about 4 years ago
#210 - fix: modify the way port number passing
Pull Request - State: closed - Opened by jinpengqi about 4 years ago - 9 comments
#209 - fix: set default SAGEMAKER_SAFE_PORT_RANGE when it is missing
Pull Request - State: closed - Opened by jinpengqi about 4 years ago - 1 comment
#208 - curl: (56) Recv failure: Connection reset by peer error when trying to do a curl request to http://localhost:8080/invocations
Issue - State: open - Opened by alvarobasi about 4 years ago - 1 comment
#207 - Default handler behaves differently in inference.py and python_service.py
Issue - State: closed - Opened by kylepula-aws over 4 years ago - 1 comment
#206 - ECR images outdated
Issue - State: open - Opened by agoblet over 4 years ago - 1 comment
#205 - Create test_multi_tfs.py
Pull Request - State: closed - Opened by schenqian over 4 years ago - 5 comments
#204 - dummy_test
Pull Request - State: closed - Opened by schenqian over 4 years ago
#203 - Update buildspec.yml
Pull Request - State: closed - Opened by schenqian over 4 years ago
#202 - Create test_1.py
Pull Request - State: closed - Opened by schenqian over 4 years ago - 35 comments
#201 - Add non safe port set test
Pull Request - State: closed - Opened by schenqian over 4 years ago - 3 comments
#200 - Adding gunicorn config file.
Pull Request - State: closed - Opened by samueleresca over 4 years ago - 3 comments
#199 - Add test_sagemaker_safe_port_range_not_set.py
Pull Request - State: closed - Opened by schenqian over 4 years ago - 2 comments
#198 - dummy test
Pull Request - State: closed - Opened by schenqian over 4 years ago - 3 comments
#197 - Created test_sagemaker_safe_port_range_not_set.py
Pull Request - State: closed - Opened by schenqian over 4 years ago - 9 comments
#196 - Create test_sagemaker_safe_port_range_not_set
Pull Request - State: closed - Opened by schenqian over 4 years ago - 31 comments
#195 - set OMP_NUM_THREADS default value to 1
Pull Request - State: closed - Opened by liangma8712 over 4 years ago - 1 comment
#194 - To fix python sdk repo localmode tfs tests failing issue
Pull Request - State: closed - Opened by schenqian over 4 years ago - 2 comments
#193 - Fix TFS Deploy codebuild issue
Pull Request - State: closed - Opened by schenqian over 4 years ago - 4 comments
#192 - Wait tfs before starting gunicorn
Pull Request - State: closed - Opened by liangma8712 over 4 years ago - 9 comments
#191 - Increase Timer
Pull Request - State: closed - Opened by schenqian over 4 years ago - 27 comments
#190 - feature: expose tunable parameters to support multiple tfs
Pull Request - State: closed - Opened by jinpengqi over 4 years ago - 1 comment
#189 - Batch Transform function starts sending image inference requests before model is actually loaded
Issue - State: closed - Opened by tbiker over 4 years ago - 1 comment
#188 - feature: expose tunable parameters to support multiple tfs
Pull Request - State: closed - Opened by jinpengqi over 4 years ago - 1 comment
#187 - applied param exposure patch
Pull Request - State: closed - Opened by garzoand over 4 years ago - 1 comment
#186 - Remove the 1s lock for inference
Pull Request - State: closed - Opened by schenqian over 4 years ago - 4 comments
#185 - Deploy trained TensorFlow 2.0 models using Amazon SageMaker
Issue - State: closed - Opened by mobassir94 over 4 years ago - 1 comment
#184 - More clarity on using Context in pre/post-processing scripts
Issue - State: open - Opened by nathanielrindlaub over 4 years ago - 1 comment
#183 - Multi model tensorflow container not working
Issue - State: open - Opened by ag-labs-sys over 4 years ago - 1 comment
#182 - 'BatchGetImage permission' error when deploying SageMaker endpoint by using Tensorflow base image
Issue - State: closed - Opened by lu-liu-rft over 4 years ago
#181 - boto3 install is missing in 1.14 image
Issue - State: closed - Opened by hin7141 over 4 years ago - 2 comments
Labels: type: bug, contributions welcome
#180 - fix: install boto3
Pull Request - State: closed - Opened by wukann over 4 years ago - 1 comment
#179 - Extending the serve.py with GUNICORN workers and threads
Pull Request - State: closed - Opened by samueleresca over 4 years ago - 2 comments
#178 - Slow response times
Issue - State: open - Opened by samueleresca over 4 years ago - 7 comments
#177 - Filtering by numeric model versions.
Pull Request - State: closed - Opened by samueleresca over 4 years ago - 1 comment
#176 - GovCloud support?
Issue - State: open - Opened by YakDriver over 4 years ago - 1 comment
#175 - build scripts fail to pull down docker image
Issue - State: closed - Opened by nathanielrindlaub over 4 years ago - 2 comments
#173 - fix typo in README.md
Pull Request - State: closed - Opened by delihiros over 4 years ago - 1 comment
#172 - feature: universal requirements.txt and inference.py
Pull Request - State: closed - Opened by chuyang-deng over 4 years ago - 6 comments
#171 - [Batch Transform] TF Serving receives requests before model is loaded
Issue - State: open - Opened by sayradley almost 5 years ago - 2 comments
Labels: type: bug
#170 - [bug] : Model not loading while using existing container image to setup MME on sagemaker
Issue - State: open - Opened by abhi1793 almost 5 years ago - 1 comment
Labels: type: bug
#169 - Multi Model Serving for Tensorflow 2.3.0
Issue - State: closed - Opened by abhi1793 almost 5 years ago - 2 comments
Labels: type: question
#168 - Updating readme with grpc_port field in the context object
Pull Request - State: closed - Opened by samueleresca almost 5 years ago - 1 comment
#167 - Container build not installing/finding dependencies contained in model.tar file
Issue - State: open - Opened by taylorsweet almost 5 years ago - 7 comments
Labels: type: question
#166 - Multi-model endpoint: load new model, unload model, update model
Issue - State: closed - Opened by kevin-yauris almost 5 years ago - 9 comments
Labels: type: documentation
#165 - doc: fix broken link in README
Pull Request - State: closed - Opened by ajaykarpur almost 5 years ago - 2 comments
Labels: priority: low, type: documentation
#164 - Broken Link: Deploying to TensorFlow Serving Endpoints
Issue - State: closed - Opened by seigenbrode almost 5 years ago
Labels: type: documentation
#163 - Adding pip dependencies in the pre/post processing
Issue - State: closed - Opened by samueleresca almost 5 years ago
#162 - Unsupported Media Type: application/x-image
Issue - State: closed - Opened by manantessact almost 5 years ago - 1 comment
#161 - Install other apt lib for serving container.
Issue - State: closed - Opened by jason9075 almost 5 years ago - 2 comments
Labels: related: Docker
#160 - fix: exclude /code directory from model versions
Pull Request - State: closed - Opened by sudilshr almost 5 years ago - 6 comments
#159 - Unable to deploy due to missing libGL
Issue - State: closed - Opened by ckang244 almost 5 years ago - 1 comment
#158 - fix: return information of all models
Pull Request - State: closed - Opened by chuyang-deng almost 5 years ago - 1 comment
#157 - change: Support multiple Accept types
Pull Request - State: closed - Opened by bveeramani almost 5 years ago - 4 comments
#156 - Published tensorflow-inference:2.1-cpu image does not support multi-models
Issue - State: closed - Opened by svpino about 5 years ago - 1 comment
#155 - feature: add model_version_policy to model config
Pull Request - State: closed - Opened by laurenyu about 5 years ago - 5 comments
#154 - Input a csv file for prediction/ batch transform. The output is a json format txt
Issue - State: open - Opened by xush65 about 5 years ago - 15 comments
Labels: type: question
#153 - change: update MME Pre/Post-Processing model and script paths
Pull Request - State: closed - Opened by chuyang-deng about 5 years ago - 5 comments
#152 - fix: increasing max_retry for model availability check
Pull Request - State: closed - Opened by chuyang-deng about 5 years ago - 2 comments
#151 - Add support for passing TensorFlow Serving cli configuration options
Issue - State: open - Opened by AndreiVoinovTR about 5 years ago
Labels: type: enhancement
#150 - doc: update README for multi-model endpoint
Pull Request - State: closed - Opened by chuyang-deng about 5 years ago - 1 comment
#149 - fix: change single quotes to double quotes
Pull Request - State: closed - Opened by chuyang-deng about 5 years ago - 4 comments
#148 - inference.py not loading TF2.0.1GPU
Issue - State: closed - Opened by tekollt about 5 years ago - 2 comments
#147 - Base64 as input
Issue - State: closed - Opened by Adblu about 5 years ago - 1 comment
#145 - How to use a new model to replace the previous one?
Issue - State: closed - Opened by Ageneinair about 5 years ago - 2 comments
Labels: type: question
#144 - Does handler function occupy the compute resource of the container?
Issue - State: closed - Opened by Ageneinair about 5 years ago - 2 comments
Labels: type: question
#142 - sagemaker notebook instance Elastic Inference tensorflow model local deployment
Issue - State: open - Opened by pankajxyz about 5 years ago - 12 comments
Labels: type: question
#140 - multi-model-endpoint support
Pull Request - State: closed - Opened by chuyang-deng about 5 years ago - 14 comments
#138 - how to handle application/x-image ?
Issue - State: open - Opened by Adblu about 5 years ago - 9 comments
Labels: type: question
#137 - Exposing /monitoring/prometheus/metrics
Issue - State: open - Opened by moaradwan about 5 years ago - 3 comments
Labels: type: enhancement
#130 - [CRITICAL] WORKER TIMEOUT
Issue - State: open - Opened by harshit-HashedIn over 5 years ago - 4 comments
Labels: type: question
#127 - Request: EIA support for tensorflow 2.1
Issue - State: closed - Opened by szabadaba over 5 years ago - 2 comments
#121 - Support for multiple model versions
Issue - State: open - Opened by henryhu666 over 5 years ago - 2 comments
Labels: type: enhancement, status: pending release
#118 - How can I forwards effectively numpy arrays to a SageMaker endpoint with inference?
Issue - State: closed - Opened by bessszilard over 5 years ago - 17 comments
Labels: type: question
#114 - tensorflow_model_server 2.0 stuck at "Adding visible gpu devices"
Issue - State: closed - Opened by lucafuji over 5 years ago - 6 comments
#110 - http.client.RemoteDisconnected: Remote end closed connection without response
Issue - State: open - Opened by whatdhack over 5 years ago - 5 comments
Labels: type: question
#109 - Multi-model SM endpoint returns 502
Issue - State: closed - Opened by ilcartographer over 5 years ago - 13 comments
#107 - code/inference.py not utilised in multi model containers
Issue - State: closed - Opened by Freakazo over 5 years ago - 3 comments
Labels: type: bug
#97 - update readme instructions to reflect new ecr image naming conventions
Issue - State: closed - Opened by RZachLamberty over 5 years ago - 3 comments
#94 - MKL - DNN Support
Issue - State: closed - Opened by gautiese over 5 years ago - 2 comments
Labels: type: enhancement