[ sid ]
[ Source Package: llama.vim ]
Package: vim-llama.cpp (0.0~git20260131.a1c8e6e-1)
Links for vim-llama.cpp
Debian Resources:
- Bug Reports
- Developer Information
- Debian Changelog
- Copyright File
- Debian Patch Tracker
Download Source Package llama.vim:
- [llama.vim_0.0~git20260131.a1c8e6e-1.dsc]
- [llama.vim_0.0~git20260131.a1c8e6e.orig.tar.xz]
- [llama.vim_0.0~git20260131.a1c8e6e-1.debian.tar.xz]
Maintainers:
- Debian Deep Learning Team (QA Page, Mail Archive)
- Christian Kastner (QA Page)
- Mo Zhou (QA Page)
External Resources:
- Homepage [github.com]
Similar packages:
Local LLM-assisted text completion for vim
This is a vim plugin for LLM-assisted text completion using llama.cpp.
To be useful, you will need to provide your own model, or download an LLM model from https://huggingface.co.
Features:
* Auto-suggest on cursor movement in Insert mode
* Toggle the suggestion manually by pressing Ctrl+F
* Accept a suggestion with Tab
* Accept the first line of a suggestion with Shift+Tab
* Control max text generation time
* Configure scope of context around the cursor
* Ring context with chunks from open and edited files and yanked text
* Supports very large contexts even on low-end hardware via smart context reuse
* Speculative FIM support
* Speculative Decoding support
* Display performance stats
Other Packages Related to vim-llama.cpp
- dep: llama.cpp-tools (>= 7965)
- LLM inference in C/C++ - main utilities
- dep: vim (>= 9.1~)
- Vi IMproved - enhanced vi editor
Download vim-llama.cpp
| Architecture | Package Size | Installed Size | Files |
|---|---|---|---|
| all | 20.8 KiB | 106.0 KiB | [list of files] |
