Ecosyste.ms: Issues
An open API service providing issue and pull request metadata for open source projects.
GitHub / thomasantony/llamacpp-python issues and pull requests
#30 - KeyError: 'transformer.h.0.attn.c_attn.bias'
Issue - State: closed - Opened by NeoFii 6 months ago
#29 - Update llamacpp
Issue - State: open - Opened by iaalm about 1 year ago
#28 - batch inference?
Issue - State: open - Opened by liuxiaohao-xn over 1 year ago
#27 - Update for llama.cpp quantization changes
Issue - State: open - Opened by s7726 over 1 year ago
#26 - Better support for editable install
Issue - State: closed - Opened by JohannesGaessler over 1 year ago - 3 comments
#25 - delete
Issue - State: closed - Opened by djaffer over 1 year ago
#24 - unable to convert
Issue - State: open - Opened by chaltik over 1 year ago
#23 - Are you even tested with code?
Issue - State: closed - Opened by lucasjinreal over 1 year ago - 2 comments
#22 - Not work
Issue - State: open - Opened by lucasjinreal over 1 year ago - 1 comment
#21 - how to set customized tokenizer?
Issue - State: open - Opened by lucasjinreal over 1 year ago
#20 - Update llamacpp and fix get_logits() and get_embeddings()
Pull Request - State: closed - Opened by thomasantony over 1 year ago
#19 - AttributeError: module 'llamacpp' has no attribute 'llama_model_quantize'
Issue - State: closed - Opened by rpfilomeno over 1 year ago - 2 comments
#18 - Implement context swapping for "infinite text generation"
Pull Request - State: closed - Opened by thomasantony over 1 year ago
#17 - ImportError: DLL load failed while importing _pyllamacpp: A DLL initialization routine failed.
Issue - State: closed - Opened by BlueSchnabeltier over 1 year ago - 1 comment
#16 - Update llamacpp and make tests a bit more reliable
Pull Request - State: closed - Opened by thomasantony over 1 year ago - 1 comment
#15 - Updating to latest llama.cpp version
Issue - State: closed - Opened by regstuff over 1 year ago - 1 comment
#14 - Fix sending of parameters like -c and -s to llama.cpp
Pull Request - State: closed - Opened by eiery over 1 year ago - 3 comments
#13 - Segmentation fault for generations larger than ~512 tokens
Issue - State: open - Opened by horenbergerb over 1 year ago - 4 comments
#12 - Create tests and improve LlamaContext example script
Pull Request - State: closed - Opened by adriacabeza over 1 year ago - 2 comments
#11 - Fix argument parsing of `llamacpp-cli` and `llamacpp-chat` tools
Pull Request - State: closed - Opened by mthuurne over 1 year ago - 1 comment
#10 - Is it possible to reset the model context for use as a REST API?
Issue - State: closed - Opened by gjmulder over 1 year ago - 1 comment
#9 - Llama Inference not working
Issue - State: closed - Opened by adriacabeza over 1 year ago - 3 comments
#8 - Update llama.cpp to new version with mmap()
Pull Request - State: closed - Opened by thomasantony over 1 year ago
#7 - problem with llama-convert
Issue - State: closed - Opened by ShudoBlon over 1 year ago - 4 comments
#6 - Script crash with no output
Issue - State: open - Opened by GrahamboJangles over 1 year ago - 1 comment
#5 - Adding new params
Issue - State: closed - Opened by adriacabeza over 1 year ago - 3 comments
#4 - Clean up llamacpp-chat interface
Issue - State: open - Opened by thomasantony over 1 year ago - 1 comment - Labels: enhancement
#3 - Better command-line argument parsing for cli.py and chat.py
Issue - State: open - Opened by thomasantony over 1 year ago - Labels: enhancement, good first issue
#2 - Add more steps into the README
Pull Request - State: open - Opened by nenkoru over 1 year ago - 1 comment
#1 - Make torch an optional dependency and fix the CI workflows
Pull Request - State: closed - Opened by thomasantony over 1 year ago