Dual‑Mode Processing
Seamlessly switch between Local (Ollama) and Cloud (GPT‑4o, Claude 3.5, Gemini, Groq).
Dual‑mode AI (Local + Cloud), semantic search, intelligent generation, and contextual explanations — all inside VS Code.
Reflyx is a production‑ready AI coding assistant that aims to exceed the capabilities of Augment Code, Cursor, and Windsurf — with local privacy, cloud performance, semantic search, intelligent code generation, and contextual explanations.
API keys are stored with VS Code SecretStorage, keeping them out of plain‑text settings.
Groq integration delivering 500+ tokens/sec for snappy responses.
Intelligently selects a provider and falls back to alternatives for reliability.
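The fallback idea can be sketched as a simple loop over configured providers: try each in order and return the first successful completion. This is a minimal illustration, not the extension's actual code; the `Provider` type and stub providers are hypothetical.

```typescript
// Hypothetical provider interface; real providers would wrap Ollama, Groq, etc.
type Provider = {
  name: string;
  complete: (prompt: string) => Promise<string>;
};

// Try each provider in order; collect errors and only fail if all fail.
async function completeWithFallback(
  providers: Provider[],
  prompt: string
): Promise<string> {
  const errors: string[] = [];
  for (const p of providers) {
    try {
      return await p.complete(prompt);
    } catch (err) {
      errors.push(`${p.name}: ${String(err)}`);
    }
  }
  throw new Error(`All providers failed:\n${errors.join("\n")}`);
}

// Demo with stubs: the first provider fails, the second answers.
const flaky: Provider = {
  name: "ollama",
  complete: async () => { throw new Error("connection refused"); },
};
const stable: Provider = {
  name: "groq",
  complete: async (p) => `echo: ${p}`,
};

completeWithFallback([flaky, stable], "hi").then((r) => console.log(r));
// → echo: hi
```

A real implementation would also distinguish retryable errors (timeouts, rate limits) from permanent ones (invalid API key) before falling back.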
Context‑aware chat, inline suggestions, and streaming responses.
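Streaming responses can be consumed with an async iterator: tokens arrive one by one and the UI appends them as they come. A minimal sketch, with a fake token stream standing in for a real model response:

```typescript
// Fake token source: yields one word at a time, as a streaming API would.
async function* fakeTokenStream(text: string): AsyncGenerator<string> {
  for (const token of text.split(" ")) {
    yield token + " ";
  }
}

// Consume the stream incrementally; in the extension each token would
// update the chat view instead of being buffered into a string.
async function renderStream(stream: AsyncIterable<string>): Promise<string> {
  let rendered = "";
  for await (const token of stream) {
    rendered += token;
  }
  return rendered.trimEnd();
}

renderStream(fakeTokenStream("streaming tokens feel instant")).then(console.log);
// → streaming tokens feel instant
```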
Index and search your workspace for relevant code instantly.
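Workspace search of this kind typically works by embedding code chunks as vectors and ranking them by cosine similarity to the query embedding. A sketch of that retrieval step, using tiny hand-made vectors in place of real embeddings (which would come from a model such as all-MiniLM-L6-v2); the `Chunk` shape and file names are illustrative only:

```typescript
// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

type Chunk = { file: string; vector: number[] };

// Return the k chunks most similar to the query vector.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.vector) - cosine(query, x.vector))
    .slice(0, k);
}

// Toy index: three chunks with hand-made 3-dimensional "embeddings".
const index: Chunk[] = [
  { file: "auth.ts", vector: [0.9, 0.1, 0.0] },
  { file: "db.ts", vector: [0.1, 0.9, 0.0] },
  { file: "ui.ts", vector: [0.0, 0.2, 0.9] },
];

console.log(topK([1, 0, 0], index, 1)[0].file);
// → auth.ts
```

In production the index would live in a vector database rather than an in-memory array, so similarity search stays fast as the workspace grows.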
```
git clone <repo-url>
cd ai-coding-assistant
python setup.py
cd extension
npm install
npm run compile
npm run package
```

Example settings (VS Code settings.json):

```json
{
  "aiCodingAssistant.modelProvider": "ollama",
  "aiCodingAssistant.embeddingModel": "all-MiniLM-L6-v2",
  "aiCodingAssistant.maxChunkSize": 500,
  "aiCodingAssistant.retrievalCount": 10
}
```

Verify that Ollama is running (`curl http://localhost:11434/api/tags`) and that the vector database is healthy (`curl http://localhost:6333/health`).

PRs welcome — features, fixes, and documentation improvements.
MIT — free for personal and commercial use.