Ecosyste.ms: Issues
An open API service for providing issue and pull request metadata for open source projects.
GitHub / neuralmagic/deepsparse issues and pull requests
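The listing below can also be retrieved programmatically, since ecosyste.ms exposes this metadata through an open API. The short Python sketch that follows only illustrates the idea: the endpoint path, query parameters, and response field names are assumptions made for illustration, not documented guarantees, so check https://issues.ecosyste.ms for the actual routes and schema.

    import requests

    # Assumed base URL and route layout for the ecosyste.ms Issues API --
    # verify against the live API documentation before relying on this.
    BASE_URL = "https://issues.ecosyste.ms/api/v1"
    REPO = "neuralmagic/deepsparse"
    url = f"{BASE_URL}/hosts/GitHub/repositories/{REPO}/issues"

    response = requests.get(url, params={"per_page": 50}, timeout=30)
    response.raise_for_status()

    for item in response.json():
        # Field names ("number", "title", "state", "pull_request") are assumed
        # to mirror what this page displays.
        kind = "Pull Request" if item.get("pull_request") else "Issue"
        print(f"#{item.get('number')} - {item.get('title')} ({kind}, state: {item.get('state')})")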
#1223 - Support Python 3.11
Pull Request - State: closed - Opened by mgoin about 1 year ago
#1222 - [server] pin anyio support to <4.0.0
Pull Request - State: closed - Opened by bfineran about 1 year ago - 2 comments
#1221 - Fixes for openai server example
Pull Request - State: closed - Opened by mgoin about 1 year ago
#1220 - topk topp penalties
Pull Request - State: closed - Opened by horheynm about 1 year ago
#1219 - [Feature Branch][LLM Testing] Create GroundTruthSource objects
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1218 - [TransformersPipeline] Add in and refactor TransformersPipeline args
Pull Request - State: closed - Opened by dsikka over 1 year ago
#1217 - [TextGeneration] max token refactor
Pull Request - State: closed - Opened by dsikka over 1 year ago - 3 comments
#1216 - [Feature Branch][LLM Testing] Full Testing Harness for LLMs
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1215 - Remove dependency on `transformers` for running `debug_analysis` and `benchmark_model` on KV-Cache supported LLMs
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1214 - [Text Generation][Enhancement] If `prompt_processing_sequence_length` == 1, do not initialize multitoken_engine
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1213 - Update max token
Pull Request - State: closed - Opened by dsikka over 1 year ago
#1212 - Automatically analyze in auto-regressive setting
Pull Request - State: closed - Opened by mgoin over 1 year ago
#1211 - [Fix][Eval Downstream] Include prompt logits in the perplexity calculation
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1210 - [Fix][Eval Downstream] Include prompt logits in the perplexity calculation
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1209 - [Fix][Eval Downstream] Include prompt logits in perplexity calculation
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1208 - Increase default text_generation sequence_length to 512
Pull Request - State: closed - Opened by mgoin over 1 year ago
#1207 - Add example for side-by-side chat comparison
Pull Request - State: closed - Opened by mgoin over 1 year ago - 1 comment
#1206 - Add stop argument to TextGenerationInput
Pull Request - State: closed - Opened by rahul-tuli over 1 year ago - 1 comment
#1205 - Conform output of TextGenerationPipelines with HuggingfacePipelines
Pull Request - State: closed - Opened by rahul-tuli over 1 year ago
#1204 - [Feature] Add callback for Text Generation Pipelines
Pull Request - State: closed - Opened by rahul-tuli over 1 year ago
#1203 - [CLIP] Update clip dependencies, README.md
Pull Request - State: closed - Opened by dsikka over 1 year ago
#1202 - [BugFix]: `force_max_tokens` not respected by text-gen pipelines
Pull Request - State: closed - Opened by rahul-tuli over 1 year ago
Labels: bug, mle-team
#1201 - [Text Generation][Fix] Make sure to properly tokenize the input to the non-kv-cache model
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1200 - [BugFix]: Use `single_stream` scheduler with TextGenerationPipeline(s)
Pull Request - State: closed - Opened by rahul-tuli over 1 year ago - 1 comment
#1199 - [Text Generation][Fix] KV Cache mismatch on prompt inference
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1198 - [Text Generation][Fix] Fix non-kv-cache inference
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1197 - Refactor of perplexity computation
Pull Request - State: closed - Opened by anmarques over 1 year ago
#1196 - Add timing for KV cache update
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago - 1 comment
#1195 - [Text-Generation] Set kv cache inputs to empty arrays (size 0) when running internally
Pull Request - State: closed - Opened by bfineran over 1 year ago - 4 comments
#1194 - [TextGeneration Pipeline] argument renaming
Pull Request - State: closed - Opened by horheynm over 1 year ago - 2 comments
#1193 - DeepSparse uses 100% of the CPU.
Issue - State: closed - Opened by ik-ids over 1 year ago - 2 comments
Labels: bug
#1192 - Bug Fix: Make Numpy Array Outputs JSON Serializable for Server
Pull Request - State: closed - Opened by Satrat over 1 year ago
#1191 - [Text Generation] [Fix] Raise error when we use deepsparse engine and `prompt_processing_length == sequence_length`
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago - 3 comments
#1190 - [Text Generation] Optimize the slow `update` method in the KVCacheDecoder
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1189 - Feature/damian/optimized cache
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1188 - allow spaces in task names to match
Pull Request - State: closed - Opened by bfineran over 1 year ago
#1187 - [BugFix] Delay torch import until needed for `deepsparse.transformers.eval_downstream`
Pull Request - State: closed - Opened by rahul-tuli over 1 year ago - 1 comment
Labels: bug
#1186 - Update text_generation.py
Pull Request - State: closed - Opened by rsnm2 over 1 year ago
#1185 - [Perplexity Evaluation] [Fix] Update the attribute name
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1184 - Hotfix 1.5.3 version bump
Pull Request - State: closed - Opened by dhuangnm over 1 year ago
#1183 - DS eval for OPT on WikiText
Pull Request - State: closed - Opened by natuan over 1 year ago - 1 comment
#1182 - Fix text generation
Pull Request - State: closed - Opened by natuan over 1 year ago
#1181 - [CLIP] Validation Script
Pull Request - State: closed - Opened by dsikka over 1 year ago
#1180 - Changes to support pass@k evaluation on the HumanEval dataset
Pull Request - State: open - Opened by shubhra over 1 year ago
#1179 - update mock engine to support cached bools arg
Pull Request - State: closed - Opened by bfineran over 1 year ago - 1 comment
#1178 - [TextGeneration] fixes to support text gen pipeline in server
Pull Request - State: closed - Opened by bfineran over 1 year ago - 1 comment
#1177 - Don't load external data for `default_cached_outputs`
Pull Request - State: closed - Opened by mgoin over 1 year ago
#1176 - Add `trust_remote_code` to tokenizer construction
Pull Request - State: closed - Opened by mgoin over 1 year ago - 1 comment
#1175 - [Text Generation] Turn off the (currently) inefficient external KV cache logic when internal KV cache management enabled
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1174 - Fix small typo in causal mask docstring
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1173 - [Transformers] Add an argument `trust_remote_code` on transformers pipeline initialization.
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1172 - [Text Generation] Support for causal masks, internal KV cache, and initial testing framework
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1171 - Implement OpenAI-compatible server
Pull Request - State: closed - Opened by mgoin over 1 year ago
#1170 - Error when compiling YOLO-NAS model
Issue - State: closed - Opened by Y-T-G over 1 year ago - 11 comments
Labels: bug
#1169 - Generalize disabling batch size across engine interfaces
Pull Request - State: closed - Opened by mgoin over 1 year ago - 1 comment
#1168 - cheat sheet bug fix
Pull Request - State: closed - Opened by InquestGeronimo over 1 year ago
#1167 - Examples README refresh
Pull Request - State: closed - Opened by InquestGeronimo over 1 year ago
#1166 - use max sequence length for tokenization
Pull Request - State: closed - Opened by horheynm over 1 year ago
#1165 - removing sparsestream
Pull Request - State: closed - Opened by InquestGeronimo over 1 year ago
#1164 - ServerUI: update
Pull Request - State: closed - Opened by InquestGeronimo over 1 year ago
#1163 - [Text Generation] Internal KV Cache Support + Initial Testing Framework
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1162 - [Text Generation][Tests] Text Generation Pipeline
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago - 1 comment
#1161 - [Fix] Joining the outputs when running without KV Cache
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1160 - Cloud Run: update deepsparse version
Pull Request - State: closed - Opened by InquestGeronimo over 1 year ago
#1159 - Benchmark UI: update deepsparse version
Pull Request - State: closed - Opened by InquestGeronimo over 1 year ago
#1158 - Sagemaker: update deepsparse version
Pull Request - State: closed - Opened by InquestGeronimo over 1 year ago
#1157 - Onnx torch match
Pull Request - State: closed - Opened by horheynm over 1 year ago
#1156 - Fix Flake 6.1 Quality Errors
Pull Request - State: closed - Opened by Satrat over 1 year ago
#1155 - [Text Generation][Tests] NLDecoderEngine
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1154 - [Text Generation][Tests] DecoderKVCache
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago
#1153 - [Text Generation] KVCacheStorage Implementation
Pull Request - State: closed - Opened by dbogunowicz over 1 year ago - 2 comments
#1152 - [bugfix] qa tests - `KeyError: 'num_cores'`
Pull Request - State: closed - Opened by bfineran over 1 year ago