- Antigravity (with Gemini 3 inside -- not free)
- VS Code (with Cline -- FREE with a local model)
- VS Code (with Continue + Ollama -- FREE with a local model)
Antigravity with Gemini 3 (or Claude Sonnet)
- Comprehensive, works at an engineer's level when following instructions to code/edit features
- It is good to work at this engineer level for a few hours, but then the allowance runs out!
Antigravity (or) VS Code, with Cline
- The Cline extension is installable on both VS Code and Antigravity
- Cline extension authentication fails (as of Jan 2026)
OR: TRY AGAIN LATER
VS Code with Continue+Ollama
- The Continue extension UI tells you to download Ollama & models, so it is clear how to get started.
- Ollama ships with Vulkan (see the Ollama installation dir), which means it can use the GPU (download GPU-Z and run it alongside Ollama to see whether the GPU is actually used). However, a GPU with 4GB VRAM is not enough for Llama 3.1 8B (the model suggested by Continue).
- Results:
- Slow processing of prompts
- Low comprehension, e.g. package.json only has a 'dev' script but the agent suggests 'npm run start'
- It feels like you would be better off coding by yourself.
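For reference, pointing Continue at a local Ollama model is a small config change. A sketch of `~/.continue/config.json`, assuming the JSON config format and the `llama3.1:8b` Ollama tag (newer Continue versions use `config.yaml` instead, and the exact model name may differ on your machine):

```json
{
  "models": [
    {
      "title": "Llama 3.1 8B (local)",
      "provider": "ollama",
      "model": "llama3.1:8b"
    }
  ]
}
```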
Suggestions for anyone who wants to run vibe coding locally:
- Buy a GPU with plenty of VRAM; it does not need to be the latest generation.
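A back-of-envelope calculation shows why 4GB VRAM falls short for an 8B model. This is a rough sketch; the 0.5 bytes/parameter (approximating 4-bit Q4 quantization) and the flat 1GB overhead for the KV cache and runtime buffers are my assumptions, not measured values:

```python
def est_vram_gb(params_billions, bytes_per_param=0.5, overhead_gb=1.0):
    """Rough VRAM estimate: quantized weight size plus a flat overhead.

    bytes_per_param=0.5 approximates 4-bit (Q4) quantization;
    overhead_gb is a crude allowance for KV cache and runtime buffers.
    """
    return params_billions * bytes_per_param + overhead_gb

# An 8B model at Q4 wants roughly 5 GB -- already over a 4 GB card.
print(round(est_vram_gb(8), 1))
```

Under these assumptions, an 8B model needs about 5GB, so part of it spills to system RAM on a 4GB card, which matches the slow prompt processing observed above.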