[ sid ]
[ Source package: llama.vim ]
Package: vim-llama.cpp (0.0~git20260131.a1c8e6e-1)
Links for vim-llama.cpp
Debian Resources:
- Bug Reports
- Developer Information
- Debian Changelog
- Copyright File
- Debian Patch Tracker
Download Source Package llama.vim:
- [llama.vim_0.0~git20260131.a1c8e6e-1.dsc]
- [llama.vim_0.0~git20260131.a1c8e6e.orig.tar.xz]
- [llama.vim_0.0~git20260131.a1c8e6e-1.debian.tar.xz]
Maintainers:
- Debian Deep Learning Team (QA Page, Mail Archive)
- Christian Kastner (QA Page)
- Mo Zhou (QA Page)
External Resources:
- Homepage [github.com]
Local LLM-assisted text completion for vim
This is a vim plugin for LLM-assisted text completion using llama.cpp.
To be useful, you will need to provide your own model, or download an LLM model from https://huggingface.co.
Features:
* Auto-suggest on cursor movement in Insert mode
* Toggle the suggestion manually by pressing Ctrl+F
* Accept a suggestion with Tab
* Accept the first line of a suggestion with Shift+Tab
* Control max text generation time
* Configure scope of context around the cursor
* Ring context with chunks from open and edited files and yanked text
* Supports very large contexts even on low-end hardware via smart context reuse
* Speculative FIM support
* Speculative Decoding support
* Display performance stats
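The features above are driven by a single configuration dictionary. A minimal sketch of a possible setup follows; the `g:llama_config` key names shown here are assumptions based on the plugin's upstream README and may differ between versions, so check the documentation shipped with the package.

```vim
" Hypothetical llama.vim setup (key names assumed from the upstream
" README; verify against the installed version's documentation).
" The plugin queries a running llama-server instance over HTTP.
let g:llama_config = {
    \ 'endpoint':         'http://127.0.0.1:8012/infill',
    \ 'auto_fim':         v:true,
    \ 't_max_prompt_ms':  500,
    \ 't_max_predict_ms': 1000,
    \ 'ring_n_chunks':    16,
    \ }
```

A llama.cpp server (provided by the llama.cpp-tools dependency) must be running with a FIM-capable model and listening on the configured endpoint before suggestions can appear.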
Other Packages Related to vim-llama.cpp
- dep: llama.cpp-tools (>= 7965)
- LLM inference in C/C++ - main utilities
- dep: vim (>= 9.1~)
- Vi IMproved - enhanced vi editor
Download vim-llama.cpp
| Architecture | Package Size | Installed Size | Files |
|---|---|---|---|
| all | 20.8 kB | 106.0 kB | [list of files] |
