Commit graph

350 commits

SHA1        Date                        Message
ff41e71ed5  2026-01-03 22:39:56 +02:00  Ensure mounted directories exist
c0740c39f8  2025-12-30 00:26:35 +02:00  Enable auto updates for the service containers
93f0d00179  2025-12-29 09:38:20 +02:00  Configure testcontainers to use the podman socket
96568a1103  2025-12-29 09:38:16 +02:00  Install testcontainers in the lsp venv for type inference
331954c204  2025-12-26 17:47:56 +02:00  Install a GPU monitor utility
f6544f6dd7  2025-12-26 00:23:44 +02:00  Rely on podman's built-in update mechanism
5716e68abd  2025-12-24 22:28:32 +02:00  Increase the cache TTL for loaded models
6f3c6f0409  2025-12-24 22:18:18 +02:00  Transparently use run0 instead of sudo
5ca5d6386d  2025-12-24 22:17:47 +02:00  Replace sudo with run0 in all scripts
30aa59ddfd  2025-12-24 20:34:34 +02:00  Replace sudo with run0 in scripts that only use it once
e7ff3d3f16  2025-12-24 20:30:24 +02:00  Run service containers in the transient store for speed and safety
9ee480b959  2025-12-24 20:30:24 +02:00  Use a read-only root filesystem in service containers
0df3008fa0  2025-12-24 20:30:24 +02:00  Collect all the flyspell configuration options together
9d2e483917  2025-12-24 20:30:23 +02:00  Detach the Transmission server from the host network
4cba3af502  2025-12-24 20:30:23 +02:00  Integrate PlantUML mode with org-mode code blocks
deb1bfd574  2025-12-24 20:30:19 +02:00  Setup PlantUML mode in Emacs to use the local server
5f99d941ab  2025-12-20 17:09:49 +02:00  Start a local PlantUML server on boot
82cd587d18  2025-12-20 17:09:49 +02:00  Use long option names
de56d1a1f2  2025-12-19 15:58:25 +02:00  Enable tree-sitter-integrated modes
7e4ddb6dab  2025-12-13 23:58:49 +02:00  Use conventional commenting style for Emacs Lisp
b16f6c410e  2025-12-07 00:26:11 +02:00  Pre-download container images
9432164bc3  2025-12-03 19:17:35 +02:00  Add capabilities to the model library
1b7fb23a2e  2025-12-03 18:35:35 +02:00  Automatically update the list of Ollama models
7900661c82  2025-12-03 18:35:34 +02:00  Add Gemma 3 to the model library
883968b8aa  2025-12-03 18:35:34 +02:00  Fix model listing logic
e7bd7fc24f  2025-12-02 22:59:14 +02:00  Enable spellchecking in Emacs buffers
3a951298a6  2025-12-02 22:59:13 +02:00  Correctly set company mode
43ccc833f6  2025-12-02 22:59:13 +02:00  Run the transmission client inside a contained service
e957ac480b  2025-12-02 22:59:12 +02:00  Make the default system prompt more fun
1756c8c802  2025-11-19 01:04:48 +02:00  Switch to a more powerful default model
3de4e0ab1e  2025-11-19 01:04:48 +02:00  Add Devstral to the model library
fd645f60a2  2025-11-18 08:11:26 +02:00  Retrieve model list directly from the Ollama server
90c310302d  2025-11-18 08:11:25 +02:00  Allow sharing configurations between host and containers
06d2a14d09  2025-11-18 08:11:21 +02:00  Add an Emacs Lisp lexical binding cookie
d187c19d71  2025-11-17 22:35:49 +02:00  Enrich the Ollama models list with metadata for known models
7998f20d52  2025-11-17 22:35:49 +02:00  Compose a library of known local models
a019feb7cd  2025-11-17 00:18:08 +02:00  Reenable autoformatting for Emacs Lisp
131511a2f7  2025-11-17 00:18:07 +02:00  Use conventional commenting style for Emacs Lisp
f533b3ef76  2025-11-17 00:18:05 +02:00  Start the Ollama server automatically on boot
89d667542a  2025-11-17 00:18:04 +02:00  Display advanced options in the gptel menu
7699aa4084  2025-11-17 00:18:03 +02:00  Query the Ollama server for the list of installed models
75ed82967c  2025-11-15 17:00:24 +02:00  Isolate the LLM server from the internet
e5087f5023  2025-11-15 17:00:23 +02:00  Revoke unnecessary filesystem permissions for the server
8a29d8da40  2025-11-12 17:45:04 +02:00  Keep the models in a persistent volume
bd5cfa7a65  2025-11-12 17:30:15 +02:00  Personalize chat UX more
304710810c  2025-11-12 17:29:28 +02:00  Make Ollama the default backend for gptel
be95a1eb42  2025-11-12 11:48:32 +02:00  Allow gptel to use the Ollama server
98dc827a10  2025-11-12 11:48:08 +02:00  Wrap long lines in chat interaction for convenience
9147292b66  2025-11-12 11:47:40 +02:00  Synchronize branch and tag metadata by default
40f24304ea  2025-11-12 11:42:22 +02:00  Declare a quadlet file for an Ollama server