Ecosyste.ms: Issues

An open API service providing issue and pull request metadata for open source projects.
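
The per-repository listing below is the sort of data the service returns over HTTP as JSON. As a minimal sketch of fetching it programmatically, assuming a hypothetical endpoint path and response field names modelled on this page (the real routes and fields are documented by ecosyste.ms and may differ):

    import requests  # third-party HTTP client

    # Hypothetical base URL, route, and parameters; the actual
    # ecosyste.ms issues API may differ.
    BASE_URL = "https://issues.ecosyste.ms/api/v1"
    REPO = "ollama%2Follama"  # owner/name, URL-encoded

    resp = requests.get(
        f"{BASE_URL}/hosts/GitHub/repositories/{REPO}/issues",
        params={"page": 1, "per_page": 100},
        timeout=30,
    )
    resp.raise_for_status()

    for item in resp.json():
        # Field names assumed to mirror the entries below:
        # number, title, state, user, comments, labels.
        print(item.get("number"), item.get("state"), item.get("title"))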

GitHub / ollama/ollama issues and pull requests

#6303 - Llama 3.1 405B fix-update

Issue - State: open - Opened by gileneusz about 1 month ago - 2 comments
Labels: model request

#6302 - Stepwise decoding via websocket, replacing server-side GBNF and JSON schema constraints

Issue - State: open - Opened by James4Ever0 about 1 month ago - 1 comment
Labels: feature request

#6301 - ollama list does not show previously downloaded models

Issue - State: open - Opened by ACodingfreak about 1 month ago - 4 comments
Labels: bug

#6300 - ollama creates a new ollama runner folder each container restart

Issue - State: closed - Opened by Minionflo about 1 month ago - 3 comments
Labels: bug, linux, docker

#6299 - Install Ollama with winget

Pull Request - State: open - Opened by nikiluk about 1 month ago

#6298 - Install Ollama with Winget on Windows

Issue - State: open - Opened by nikiluk about 1 month ago
Labels: feature request

#6297 - Models not loading when using ROCm (Radeon VII)

Issue - State: closed - Opened by WannaBeOCer about 1 month ago - 2 comments
Labels: bug, linux, amd

#6296 - Better to add athene70b f16 and q8

Issue - State: open - Opened by Llamadouble999q about 1 month ago - 2 comments
Labels: model request

#6295 - Ability to preload embedding model

Issue - State: closed - Opened by comunidadio about 1 month ago - 2 comments
Labels: feature request

#6294 - AirLLM integration?

Issue - State: open - Opened by blankuserrr about 1 month ago - 1 comment
Labels: feature request

#6293 - "The model you are attempting to pull requires a newer version of Ollama" when Ollama is built from the latest source

Issue - State: closed - Opened by sammcj about 1 month ago - 8 comments
Labels: needs more info

#6291 - Don't hard fail on sparse setup error

Pull Request - State: closed - Opened by dhiltgen about 1 month ago - 1 comment

#6290 - Harden Intel bootstrap for nil pointers

Pull Request - State: closed - Opened by dhiltgen about 1 month ago

#6289 - some models crash on rocm (7900XT)

Issue - State: open - Opened by markg85 about 1 month ago - 10 comments
Labels: bug, linux, amd

#6287 - Intel UHD GPU acceleration

Issue - State: closed - Opened by jomardyan about 1 month ago - 2 comments
Labels: feature request, intel

#6286 - Context window size cannot be changed

Issue - State: open - Opened by mihaelagrigore about 1 month ago - 19 comments
Labels: bug

#6285 - Encountered an error while drawing inference

Issue - State: open - Opened by ma000-simreka about 1 month ago - 1 comment
Labels: bug, nvidia

#6284 - Intel GPU in Docker container crashes

Issue - State: closed - Opened by Minionflo about 1 month ago - 1 comment
Labels: bug, intel

#6282 - AMD integrated graphic on linux kernel 6.9.9+, GTT memory, loading freeze fix

Pull Request - State: open - Opened by MaciejMogilany about 1 month ago - 12 comments

#6281 - docs(tools): add ingest

Pull Request - State: closed - Opened by sammcj about 1 month ago - 1 comment

#6280 - Need qwen2:math !!

Issue - State: open - Opened by jsrdcht about 1 month ago
Labels: feature request

#6279 - feat: Introduce K/V Context Quantisation (vRAM improvements)

Pull Request - State: open - Opened by sammcj about 1 month ago - 63 comments

#6278 - cmd: print proxy info when OLLAMA_DEBUG is true

Pull Request - State: open - Opened by zhangyunhao116 about 1 month ago

#6277 - Ollama Latest (0.3.4) Will not run models

Issue - State: open - Opened by awptechnologies about 1 month ago - 8 comments
Labels: bug, needs more info

#6276 - feat: K/V cache quantisation (massive vRAM improvement!)

Pull Request - State: closed - Opened by sammcj about 1 month ago

#6275 - provide better hashing algorithm

Issue - State: closed - Opened by olumolu about 1 month ago - 1 comment
Labels: feature request

#6274 - Binary files (*.png, *.ico, *.icns) listed as modified upon cloning the repository

Issue - State: open - Opened by PAN-Chuwen about 1 month ago - 2 comments
Labels: bug

#6273 - unsupported content type: unknown

Issue - State: closed - Opened by little1d about 1 month ago - 6 comments
Labels: bug

#6272 - Ollama create manual deployment reports Error: invalid file magic

Issue - State: open - Opened by JaminYan about 1 month ago - 5 comments
Labels: bug

#6270 - ollama does not work continuously

Issue - State: closed - Opened by peanutpaste about 1 month ago - 2 comments
Labels: bug, nvidia, needs more info

#6269 - Please add LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct model

Issue - State: open - Opened by xest about 1 month ago - 8 comments
Labels: model request

#6268 - Cannot get to UI Web page

Issue - State: open - Opened by lamachine about 1 month ago - 6 comments
Labels: bug

#6267 - add openbmb MiniCPM-V-2_6

Issue - State: closed - Opened by insinfo about 1 month ago - 20 comments
Labels: feature request

#6265 - Not a feature request, not a bug, problem with Llama3.1

Issue - State: closed - Opened by airdogvan about 1 month ago - 8 comments
Labels: feature request

#6264 - Parse cpuinfo and set default threads

Pull Request - State: open - Opened by dhiltgen about 1 month ago

#6263 - Pull Command Parsing Not Working

Issue - State: closed - Opened by chadwickhar08 about 1 month ago - 6 comments
Labels: bug, windows

#6262 - Batch embeddings get progressively worse with larger batches

Issue - State: open - Opened by jorgetrejo36 about 1 month ago - 14 comments
Labels: bug

#6261 - Offload a model command

Issue - State: closed - Opened by stavsap about 1 month ago - 5 comments
Labels: feature request

#6260 - llama3.1 memory

Pull Request - State: open - Opened by mxyng about 1 month ago - 3 comments

#6259 - Inference fails with "llama_get_logits_ith: invalid logits id 7, reason: no logits"

Issue - State: closed - Opened by yurivict about 1 month ago - 9 comments
Labels: bug

#6257 - Wrong output with the new Llama3.1 and llama3-groq-tool-use pull

Issue - State: closed - Opened by Goekdeniz-Guelmez about 1 month ago - 22 comments
Labels: bug

#6256 - Why the output of generate api is different with 'num_gpu': 0

Issue - State: closed - Opened by QuanZeng about 1 month ago - 5 comments
Labels: bug

#6255 - Update LLaVA to LLaVA OneVision

Issue - State: open - Opened by alexrah about 1 month ago - 4 comments
Labels: model request

#6254 - Lumina-mGPT support

Issue - State: open - Opened by Amazon90 about 1 month ago - 3 comments
Labels: model request

#6253 - When systemMessage exceeds a certain length, ollama is unable to process it.

Issue - State: open - Opened by billrenhero about 1 month ago - 4 comments
Labels: bug

#6251 - Ollama multiuser scale

Issue - State: open - Opened by jamiabailey about 1 month ago
Labels: feature request, question

#6249 - ollama run llama3.1 command outputs nonsense

Issue - State: closed - Opened by erfan-khalaji about 1 month ago - 4 comments
Labels: bug

#6248 - Error: could not connect to ollama app, is it running?

Issue - State: closed - Opened by zzxgraph about 1 month ago - 20 comments
Labels: bug

#6247 - Store layers inside manifests consistently as values.

Pull Request - State: closed - Opened by jessegross about 1 month ago - 1 comment

#6246 - Modelfile - Customize a prompt

Issue - State: closed - Opened by LucasFreitas88 about 1 month ago - 11 comments
Labels: bug

#6243 - get api models

Pull Request - State: open - Opened by mxyng about 1 month ago

#6241 - Speech Prototype

Pull Request - State: open - Opened by royjhan about 1 month ago - 2 comments

#6240 - Not executed on GPU (AMD RX 6750 GRE)

Issue - State: closed - Opened by 21307369 about 1 month ago - 2 comments
Labels: question

#6239 - support keep_alive in config.json

Issue - State: closed - Opened by avelican about 1 month ago - 2 comments
Labels: feature request

#6238 - Ollama server running out of memory when it didn't in previous version

Issue - State: closed - Opened by MxtAppz about 1 month ago - 7 comments
Labels: bug

#6237 - Ollama Product Stance on Grammar Feature / Outstanding PRs

Issue - State: open - Opened by Kinglord about 1 month ago - 9 comments
Labels: feature request

#6236 - gpu not found in windows

Issue - State: open - Opened by showyoung about 1 month ago - 17 comments
Labels: bug, windows

#6235 - Set *.png and *.ico to be treated as binary files.

Pull Request - State: open - Opened by Nicholas42 about 1 month ago - 5 comments

#6233 - Strange! Each request consumes an additional 2 seconds when I used /api/embed

Issue - State: open - Opened by AlbertXu233 about 1 month ago - 5 comments
Labels: bug

#6232 - Experimental SYCL offload for Intel 13g (Raptor Lake w Xe-LP) not offloading

Issue - State: closed - Opened by byjrack about 1 month ago - 12 comments
Labels: bug

#6230 - Add Generate Embedding for Sparse vector

Issue - State: open - Opened by shashade2012 about 1 month ago - 3 comments
Labels: feature request

#6227 - ollama cannot start on ubuntu 22.04

Issue - State: closed - Opened by garyyang85 about 1 month ago - 9 comments
Labels: bug

#6226 - Error: unexpected EOF:

Issue - State: open - Opened by KangInKoo about 1 month ago - 10 comments
Labels: bug

#6225 - POST "/api/generate" returns 500

Issue - State: closed - Opened by w16645395520 about 1 month ago - 3 comments
Labels: bug

#6224 - Passing result from tool calling to model

Issue - State: open - Opened by tristanMatthias about 1 month ago - 3 comments
Labels: feature request

#6220 - server: parallelize embeddings in API web handler instead of in subprocess runner

Pull Request - State: closed - Opened by jmorganca about 1 month ago - 2 comments

#6218 - fix memory

Pull Request - State: open - Opened by mxyng about 1 month ago

#6211 - Error: max retries exceeded

Issue - State: closed - Opened by igorschlum about 1 month ago - 4 comments
Labels: bug

#6209 - support for llama3 and llama3.1 uncensored

Issue - State: open - Opened by olumolu about 1 month ago - 1 comment
Labels: model request

#6204 - The Quickstart section in README is missing the 'ollama start' command

Issue - State: open - Opened by yurivict about 1 month ago - 20 comments
Labels: bug

#6201 - feat: add support for running ollama on rocm in wsl

Pull Request - State: open - Opened by evshiron about 1 month ago - 1 comment

#6199 - Ollama crashes with Deepseek-Coder-V2-Lite-Instruct

Issue - State: open - Opened by shockme about 1 month ago - 8 comments
Labels: bug

#6198 - Request to Add JAIS 70B Model

Issue - State: open - Opened by umar052001 about 1 month ago - 4 comments
Labels: model request

#6194 - Please add CodeShell to Ollama/library, as llama.cpp already supports it

Issue - State: open - Opened by vimBashMing about 1 month ago - 1 comment
Labels: model request

#6193 - Add New SOTA Models: Palmyra-Med and Palmyra-Fin

Issue - State: closed - Opened by gileneusz about 1 month ago - 5 comments
Labels: model request

#6188 - Allow singular array for CompletionRequest prompt field

Pull Request - State: open - Opened by igor-drozdov about 1 month ago - 2 comments

#6184 - Add InternLM 2.5 family of models

Issue - State: closed - Opened by nviraj about 2 months ago - 4 comments
Labels: model request

#6183 - LINE FEED problems in recent commit

Issue - State: closed - Opened by FellowTraveler about 2 months ago - 5 comments
Labels: bug

#6182 - Catch one more error log

Pull Request - State: closed - Opened by dhiltgen about 2 months ago

#6181 - llama: Wire up native source file dependencies

Pull Request - State: closed - Opened by dhiltgen about 2 months ago

#6177 - run OI with OLLAMA SERVER IN NETWORK

Issue - State: closed - Opened by RM-S2 about 2 months ago - 2 comments
Labels: bug

#6176 - System prompts do not work on the first round.

Issue - State: open - Opened by DirtyKnightForVi about 2 months ago - 25 comments
Labels: bug

#6174 - Unable to run / pull llama3 model

Issue - State: closed - Opened by Maha-vignesh09 about 2 months ago - 7 comments
Labels: bug