Ecosyste.ms: Issues

An open API service for providing issue and pull request metadata for open source projects.
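Listings like the one below can be consumed programmatically. A minimal sketch of filtering such metadata, assuming a record shape modeled on the fields shown on this page (number, title, kind, state, author) rather than the documented ecosyste.ms schema:

```python
# Filter issue/PR metadata records shaped like the entries on this page.
# Field names (number, title, kind, state, author) are assumptions modeled
# on the listing, not the official ecosyste.ms API response schema.

def open_items(items):
    """Return '#<number> - <title>' for every entry whose state is 'open'."""
    return [f"#{i['number']} - {i['title']}" for i in items if i["state"] == "open"]

sample = [
    {"number": 158, "title": "update emsdk to 4.0.3",
     "kind": "pull_request", "state": "open", "author": "ngxson"},
    {"number": 157, "title": "Fix a bug with kv_remove, release v2.2.0",
     "kind": "pull_request", "state": "closed", "author": "ngxson"},
]

print(open_items(sample))  # → ['#158 - update emsdk to 4.0.3']
```

In practice the records would come from the service's JSON endpoints; the hardcoded `sample` list here only mirrors the first two entries of the listing for illustration.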

GitHub / ngxson/wllama issues and pull requests

#158 - update emsdk to 4.0.3

Pull Request - State: open - Opened by ngxson 3 days ago

#157 - Fix a bug with kv_remove, release v2.2.0

Pull Request - State: closed - Opened by ngxson 8 days ago

#156 - temporarily apply that viral x2 speedup PR

Pull Request - State: closed - Opened by ngxson 8 days ago

#155 - Remove json.hpp dependency

Pull Request - State: closed - Opened by ngxson 8 days ago

#154 - switch to binary protocol between JS and WASM world (glue.cpp)

Pull Request - State: closed - Opened by ngxson 15 days ago

#153 - Deepseek gguf < 2gb

Issue - State: closed - Opened by Weedshaker 17 days ago - 1 comment

#152 - WebGPU?

Issue - State: open - Opened by hrstoyanov 19 days ago - 3 comments

#151 - add benchmark function, used internally

Pull Request - State: closed - Opened by ngxson 21 days ago

#150 - Sync with upstream source code, add demo for DeepSeek-R1

Pull Request - State: closed - Opened by ngxson 26 days ago

#149 - Unsupported pre tokenizer

Issue - State: closed - Opened by Nithur-M 26 days ago - 2 comments

#148 - Issue with next.js

Issue - State: closed - Opened by Nithur-M about 1 month ago - 1 comment

#147 - sync with upstream llama.cpp source code

Pull Request - State: closed - Opened by ngxson about 1 month ago

#146 - Suggestion: integration with open-webui

Issue - State: open - Opened by gloryknight about 2 months ago

#145 - sync to latest upstream source code

Pull Request - State: closed - Opened by ngxson about 2 months ago

#144 - Failing to load the chat template from ibm-granite/granite-3.1-1b-a400m-instruct

Issue - State: closed - Opened by felladrin about 2 months ago - 1 comment

#142 - v2 build fails when using ModelManager inside React TypeScript project

Issue - State: closed - Opened by mitulagr2 2 months ago - 1 comment

#141 - Build fails with 'No overload matches this call'

Issue - State: closed - Opened by mitulagr2 2 months ago - 2 comments

#140 - add createChatCompletion

Pull Request - State: closed - Opened by ngxson 2 months ago

#139 - Fix load file order

Pull Request - State: closed - Opened by ngxson 3 months ago

#138 - Fix nextjs build

Pull Request - State: closed - Opened by ngxson 3 months ago

#136 - Set location of cached models and enable local loading

Issue - State: open - Opened by toreo-ai 3 months ago - 2 comments

#134 - Update available types for `cache_type_k` and `cache_type_v`

Pull Request - State: closed - Opened by felladrin 3 months ago

#133 - Version 2.0

Pull Request - State: closed - Opened by ngxson 3 months ago

#132 - sync to latest upstream source code

Pull Request - State: closed - Opened by ngxson 3 months ago

#130 - Add WllamaError class, fix llama_decode hangs on long input text

Pull Request - State: closed - Opened by ngxson 4 months ago

#129 - sync to latest upstream source code

Pull Request - State: closed - Opened by ngxson 4 months ago

#128 - papeg.is ready!

Issue - State: open - Opened by flatsiedatsie 4 months ago - 7 comments

#127 - Differences in template application

Issue - State: open - Opened by flatsiedatsie 4 months ago - 6 comments

#125 - sync to latest upstream source code

Pull Request - State: closed - Opened by ngxson 4 months ago

#123 - Error: Module is already initialized

Issue - State: open - Opened by flatsiedatsie 5 months ago

#122 - How to best use allow_offline?

Issue - State: closed - Opened by flatsiedatsie 5 months ago - 1 comment

#120 - cannot find tokenizer merges in model file

Issue - State: open - Opened by flatsiedatsie 5 months ago - 18 comments
Labels: llama.cpp related

#119 - Update to latest llama.cpp source code

Pull Request - State: closed - Opened by ngxson 5 months ago

#118 - decode/encode : do not fail on empty batch

Pull Request - State: closed - Opened by ngxson 5 months ago

#117 - table_index is out of bounds

Issue - State: open - Opened by flatsiedatsie 5 months ago - 2 comments

#116 - RangeError: Array buffer allocation failed

Issue - State: open - Opened by flatsiedatsie 5 months ago

#115 - Firefox: Error in input stream

Issue - State: open - Opened by flatsiedatsie 5 months ago - 4 comments

#114 - [bug]: llama_decode error when send the same prompt twice

Issue - State: closed - Opened by cbh778899 5 months ago - 1 comment

#113 - v1.16.1

Pull Request - State: closed - Opened by ngxson 5 months ago

#112 - Feature: list the available local models from the cache

Issue - State: open - Opened by synw 6 months ago - 2 comments

#111 - Unable to import in React.js

Issue - State: closed - Opened by cbh778899 6 months ago - 6 comments

#110 - Bug: `createCompletion` stuck when it runs out of context

Issue - State: closed - Opened by ngxson 6 months ago
Labels: bug

#109 - ability to use custom cacheManager

Pull Request - State: closed - Opened by ngxson 6 months ago

#108 - [Feature Request] Allow setting our own Cache Manager

Issue - State: closed - Opened by felladrin 6 months ago - 1 comment

#107 - What does `noTEE` do?

Issue - State: closed - Opened by flatsiedatsie 6 months ago - 2 comments

#106 - Phi-3: error loading model hyperparameters

Issue - State: closed - Opened by flatsiedatsie 6 months ago - 5 comments
Labels: llama.cpp related

#105 - [Feature request] LoRA support

Issue - State: open - Opened by OKUA1 7 months ago - 2 comments

#104 - v1.15.0

Pull Request - State: closed - Opened by ngxson 7 months ago

#103 - implement KV cache reuse

Pull Request - State: closed - Opened by ngxson 7 months ago

#102 - Improve main UI example

Pull Request - State: closed - Opened by ngxson 7 months ago

#101 - implement KV cache reuse for completion

Issue - State: closed - Opened by ngxson 7 months ago

#100 - fix log print and `downloadModel`

Pull Request - State: closed - Opened by ngxson 7 months ago

#99 - Add `main` example (chat UI)

Pull Request - State: closed - Opened by ngxson 7 months ago

#98 - Add prettier

Issue - State: closed - Opened by ngxson 7 months ago

#97 - ci: add e2e test

Issue - State: open - Opened by ngxson 7 months ago

#96 - main: initialize main example

Issue - State: closed - Opened by ngxson 7 months ago - 2 comments

#95 - Add `downloadModel` function

Pull Request - State: closed - Opened by ngxson 7 months ago

#94 - v1.14.2

Pull Request - State: closed - Opened by ngxson 7 months ago

#93 - v1.14.1, update to latest upstream source code

Pull Request - State: closed - Opened by ngxson 7 months ago

#92 - v1.14.0

Pull Request - State: closed - Opened by ngxson 7 months ago

#91 - Support llama_encode (WIP)

Pull Request - State: closed - Opened by ngxson 7 months ago - 4 comments

#90 - save ETag metadata, add allowOffline

Pull Request - State: closed - Opened by ngxson 7 months ago - 1 comment

#89 - Add support for control vectors

Issue - State: open - Opened by ngxson 7 months ago
Labels: enhancement

#88 - Force use of the cache if there is no internet connection

Pull Request - State: closed - Opened by flatsiedatsie 8 months ago - 6 comments

#87 - Model caching with new download manager?

Issue - State: closed - Opened by flatsiedatsie 8 months ago - 1 comment
Labels: enhancement

#86 - T5 and Flan-T5 models support (llama_encode)

Issue - State: closed - Opened by felladrin 8 months ago - 1 comment
Labels: enhancement

#85 - v1.13.0

Pull Request - State: closed - Opened by ngxson 8 months ago

#84 - Fix exit() function crash if model is not loaded

Pull Request - State: closed - Opened by flatsiedatsie 8 months ago - 1 comment

#83 - How to cleanly abort downloading a model?

Issue - State: open - Opened by flatsiedatsie 8 months ago - 1 comment
Labels: enhancement

#82 - The mystery of Schrodinger's exit function

Issue - State: closed - Opened by flatsiedatsie 8 months ago - 2 comments

#81 - sync with upstream llama.cpp source code (+gemma2 support)

Pull Request - State: closed - Opened by ngxson 8 months ago

#80 - Improve cache API

Pull Request - State: closed - Opened by ngxson 8 months ago

#79 - Add delete method to cacheManager

Pull Request - State: closed - Opened by flatsiedatsie 8 months ago - 5 comments

#78 - Update README.md

Pull Request - State: closed - Opened by flatsiedatsie 8 months ago - 1 comment

#76 - Failed to build from scratch: llamacpp-wasm-builder, CMake Error (add_executable): Cannot find source file

Issue - State: closed - Opened by flatsiedatsie 8 months ago - 1 comment
Labels: llama.cpp related