Local AI models powered by Ollama!
We have some exciting news to share with you about the latest integration in our suite of desktop apps. Continue reading to find out what we have been working on!
Hello there, we hope you are well!
With the latest version of our desktop applications (v0.0.7), we have integrated local Ollama models!
Open-source Large Language Models (LLMs) paired with Ollama offer developers a powerful, flexible, and privacy-focused solution for coding assistance.
Here's why you should consider using these local LLMs in our desktop apps:
Local execution: Run models on your own machine, ensuring data privacy and offline access (see the example just after this list).
Cost-effective: Avoid subscription fees associated with cloud-based AI coding assistants.
Wide language support: Access models trained on various programming languages and frameworks.
Community-driven: Benefit from continuous improvements and updates from the open-source community.
Easy setup: Our desktop apps simplify model management and deployment, thanks to our direct integration with Ollama.
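As a quick illustration of that local execution (a minimal sketch, not a feature of the apps themselves): Ollama serves models over a local HTTP API on localhost:11434, so prompts and completions never leave your machine. Assuming you already have Ollama running and the llama3 model pulled (setup steps below), you can try:

```bash
# Send a prompt to a locally running model; the request stays on your machine
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Write a function that reverses a string in Python",
  "stream": false
}'
```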
Getting started
To use the new local AI features, please ensure you are using the latest version of our desktop apps (v0.0.7).
Starting a local Ollama Model
Select Local from the LLM Provider dropdown in the chat settings screen.

Download or Start Ollama!
(a) If you don’t have Ollama set up, simply click Download.
(b) If you already have Ollama on your system, run the following command in a terminal:
```bash
ollama serve
```
You will then be shown the available models screen.
If you don’t have any models installed on your system, search for a model you want to use on ollama.com and run the following command in a terminal:
```bash
ollama pull model_name
```
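For example, to download Llama 3 (one of the models listed on ollama.com; substitute whichever model you chose):

```bash
# Download the llama3 model from the Ollama registry
ollama pull llama3
```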
After the Ollama model has been downloaded to your system, click Start Ollama, or run the following command in a terminal:
```bash
ollama serve
```
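If you'd like to confirm the server is up and see which models it can serve, you can query Ollama's local HTTP API (it listens on localhost:11434 by default):

```bash
# List the models installed on this machine
curl http://localhost:11434/api/tags
```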
Select the Ollama model you have downloaded from the dropdown.
A. Navigate to the model selection. B. Select the model you wish to use.
Start Chatting with your very own Local AI model!
All AI features in our desktop apps are compatible with local LLMs.
Using multiple AI models in the same chat
Thanks to our amazing dev team ❤️, you can use a combination of local and cloud models in the same chat, with full chat context carried across your messages.
This provides you with a professional AI coding experience, enhancing your debugging, refactoring, and code snippet sharing capabilities!

Try Ollama today in our desktop applications
codesnippets.ai
With Love ❤️
Code Snippets AI Team