Eclipse extension that provides inline completion backed by LLMs (Java; updated Apr 3, 2026)
Docker service stack for a comprehensive local AI experience, running on a typical home hardware setup with a single GPU. DevOps dashboards included. Work in progress.
OllamaLLM desktop app with a Go API. Supports 10 LLMs behind a ChatGPT-like GUI. In its current form it runs as a desktop app / localhost API, but it has the foundation to scale up into a complete web app. See the documentation for more.
OpenAI-compatible proxy that makes local LLM tool calling actually work — schema validation, retry with feedback, model escalation, and context condensing
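The schema-validation-with-retry step such a proxy performs can be sketched as follows. This is a minimal illustration, not the proxy's actual code: the tool schema, function names, and feedback messages are all hypothetical, and real implementations would validate against full JSON Schema rather than this simplified type map.

```python
import json

# Hypothetical tool schema: the proxy checks each tool call's arguments
# against the tool's declared parameters before forwarding the call.
TOOL_SCHEMA = {
    "name": "get_weather",
    "required": ["city"],
    "types": {"city": str, "units": str},
}

def validate_tool_call(raw_args: str, schema: dict):
    """Return (ok, feedback). On failure, feedback describes the problem
    so it can be sent back to the model for a corrected retry."""
    try:
        args = json.loads(raw_args)
    except json.JSONDecodeError as exc:
        return False, f"arguments are not valid JSON: {exc}"
    for key in schema["required"]:
        if key not in args:
            return False, f"missing required parameter '{key}'"
    for key, value in args.items():
        expected = schema["types"].get(key)
        if expected is not None and not isinstance(value, expected):
            return False, f"parameter '{key}' should be {expected.__name__}"
    return True, ""

def call_with_retries(generate, schema, max_attempts=3):
    """generate(feedback) asks the model for a tool call; when validation
    fails, the feedback string is fed into the next attempt."""
    feedback = ""
    for _ in range(max_attempts):
        raw = generate(feedback)
        ok, feedback = validate_tool_call(raw, schema)
        if ok:
            return raw
    return None  # a real proxy could escalate to a stronger model here
```

Model escalation would slot in where the sketch returns `None`: after exhausting retries on a small local model, the proxy re-issues the request to a larger one.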
💻 Enhance your coding experience with Mistral Vibe, an open-source CLI assistant that lets you interact with your projects using natural language.
The open-source agentic coding CLI. Focused on Ollama Cloud Models.
Autonomous dev agent that runs on your hardware. No cloud. No API keys. Ships code while you sleep.