Local LLM Integration in VSCodium - Supported or Any Workarounds? #2585
aaronthefullstackengineer started this conversation in General
Hi everyone!
I'm looking to integrate local LLMs into VSCodium for offline, privacy-focused AI code assistance.
Specifically:
- Does VSCodium support connecting to AI assistants that run locally (e.g., Ollama, LM Studio, Tabby, or other local endpoints)?
- Are there open-source extensions that allow using a local REST API, OpenAI-compatible API, or self-hosted model server? (A sketch of the kind of endpoint I mean follows this list.)
- Are there any restrictions in VSCodium due to Microsoft service removals that affect AI integrations?
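To make the second question concrete, here is roughly the kind of local, OpenAI-compatible call I have in mind. This is only a sketch under my own assumptions: Ollama running on its default port 11434 with its OpenAI-compatible endpoint, and "codellama" as a placeholder model name. LM Studio or any other self-hosted server exposing the same API shape would just need a different base URL.

```typescript
// Sketch: one request to a local OpenAI-compatible chat endpoint.
// Assumptions: Ollama on its default port (11434); "codellama" is a
// placeholder for whatever model has actually been pulled locally.
async function askLocalModel(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama", // placeholder model name
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  });
  if (!response.ok) {
    throw new Error(`Local model server returned ${response.status}`);
  }
  const data = await response.json();
  // OpenAI-style response shape: first choice's message content.
  return data.choices[0].message.content;
}

// Example usage:
askLocalModel("Explain what a VS Code extension host is.")
  .then(console.log)
  .catch(console.error);
```

Nothing leaves the machine here, which is the whole point.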
If it's not natively supported, I'd appreciate:
- Suggested workarounds
- Custom configs, forks, or recommended extensions that make this possible (see the extension sketch after this list for what I'm imagining)
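By a custom extension I mean something like the minimal sketch below. It uses the standard VS Code extension API, which, as far as I understand, VSCodium ships unchanged; the command ID, endpoint URL, and model name are my own placeholders rather than any existing extension's.

```typescript
// Sketch: a custom VSCodium command that sends the current selection
// to a local model server. Uses the standard VS Code extension API;
// endpoint and model name are placeholders (Ollama defaults assumed).
import * as vscode from "vscode";

export function activate(context: vscode.ExtensionContext) {
  const disposable = vscode.commands.registerCommand(
    "localLlm.explainSelection", // hypothetical command ID
    async () => {
      const editor = vscode.window.activeTextEditor;
      if (!editor) {
        return; // no open editor, nothing to explain
      }
      const selection = editor.document.getText(editor.selection);
      const response = await fetch("http://localhost:11434/v1/chat/completions", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "codellama", // placeholder model name
          messages: [
            { role: "user", content: `Explain this code:\n${selection}` },
          ],
          stream: false,
        }),
      });
      const data = await response.json();
      vscode.window.showInformationMessage(data.choices[0].message.content);
    }
  );
  context.subscriptions.push(disposable);
}

export function deactivate() {}
```

If an existing extension already does this cleanly against a local endpoint, I'd rather use that than maintain my own.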
Goal:
Privacy-friendly, AI-assisted coding in VSCodium, without sending code to external servers.
Thanks in advance!