A Complete Guide to Running Your First AI Model with Ollama
- Author: adityo
- Date:
- Tag: Ollama
For many people, artificial intelligence (AI) sounds complicated and only accessible through online services like ChatGPT or Copilot. But did you know that you can actually run AI directly on your own laptop or PC?
With Ollama, that becomes possible. Ollama is an open-source application that allows you to run various AI models locally, without always relying on an internet connection.
This guide will walk you through the steps to run your very first AI model using Ollama.
1. Make Sure Ollama is Installed
Before you can use Ollama, you need to install it.
- Windows: Download the .exe installer from ollama.com → run the installer.
- macOS: Download the .pkg installer from the official site → follow the installation process.
- Linux: Run this command in the terminal:
Bash
curl -fsSL https://ollama.com/install.sh | sh
Once installed, open Terminal/Command Prompt and check:
ollama --version
If you see a version number, the installation was successful.
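On a successful install, the command prints a short version string. The exact number depends on the release you installed, so yours will differ from this example:
Bash
# example output (your number will differ)
ollama version is 0.5.7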
2. Download Your First AI Model
Ollama provides many open-source AI models. Some popular ones are:
- Llama 2 – Great for chatting and general experiments.
- Mistral – Lightweight and runs well on most laptops.
- Gemma / Phi – Small models, resource-friendly.
- DeepSeek / Nomic – DeepSeek is geared toward coding and reasoning tasks; Nomic provides an embedding model useful for search and retrieval experiments.
To download and run Llama 2:
ollama run llama2
If the model isn’t available locally, Ollama will automatically download it.
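If you only want to download a model now and chat with it later, the pull command fetches it without opening a session:
Bash
ollama pull llama2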
3. Start Interacting with the AI
Once the model is running, you can type in questions or commands directly. For example:
> What is Artificial Intelligence?
The AI will respond directly in the terminal.
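You can keep asking follow-up questions in the same session. The chat prompt also accepts a few built-in commands: typing /? lists them, and /bye ends the session. An exchange looks roughly like this (the answer itself will vary by model):
>>> What is Artificial Intelligence?
(the model streams its answer here)
>>> /bye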
4. Run Other Models
If you’d like to try different models, just replace the model name. For example:
ollama run mistral
ollama run gemma
You can also configure advanced options (like response length or style), but for beginners, the default setup is more than enough.
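When you do want to adjust behavior, one common way is a Modelfile, Ollama's small configuration format. The sketch below is only an illustration: the temperature value, the system prompt, and the resulting model name my-assistant are arbitrary choices for this example.
Modelfile
FROM llama2
PARAMETER temperature 0.7
SYSTEM "You are a concise assistant that answers in plain English."
Build a local model from it and run it with:
Bash
ollama create my-assistant -f Modelfile
ollama run my-assistant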
5. Tips for a Smooth Experience
- Start with lightweight models if your laptop/PC has limited RAM.
- Make sure your internet connection is stable when downloading models for the first time.
- Once downloaded, models can be used offline.
- To see a list of installed models, use:
Bash
ollama list
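Models can occupy several gigabytes of disk space, so it is also handy to know how to delete one you no longer need (replace llama2 with the model you want to remove):
Bash
ollama rm llama2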
Conclusion
With Ollama, running AI on your personal computer is no longer a challenge. Just install the app, download a model, and start chatting with AI directly from your terminal.
This is a great first step to exploring AI further—whether for learning, experimenting, or building your own applications.
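If building applications is where you are headed, note that Ollama also serves a local HTTP API (on port 11434 by default) while it is running. Here is a minimal sketch of calling it with curl; the prompt text is only an example, and "stream": false asks for the complete answer in a single response:
Bash
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Explain what a local AI model is in one sentence.",
  "stream": false
}'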