# Ollama SDK Workshop Setup Guide

This document contains instructions for setting up your environment for the Ollama SDK workshop.

## Prerequisites

Before starting the workshop, ensure you have the following installed on your system:
- Node.js (v16+) - Download Node.js
  - Verify installation:

    ```shell
    node --version
    ```

- Ollama - Download Ollama
  - Verify installation:

    ```shell
    ollama --version
    ```

- Git - Download Git
  - Verify installation:

    ```shell
    git --version
    ```

- A code editor - VS Code recommended
- Terminal or command prompt
## Environment Setup
### 1. Clone the Workshop Repository
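The clone command itself was not preserved in this guide; a typical invocation looks like the following, where the repository URL is a placeholder (use the URL provided by the workshop organizers):

```shell
# Placeholder URL - substitute the real workshop repository
git clone https://github.com/your-org/ollama-sdk-workshop.git
cd ollama-sdk-workshop
```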
### 2. Install Dependencies
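Assuming the workshop project uses npm (a `package.json` in the repository root), the usual step is:

```shell
# Run from the repository root; installs everything listed in package.json
npm install
```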
### 3. Ensure Ollama is Running

Start the Ollama service:

- macOS/Linux:

  ```shell
  ollama serve
  ```

- Windows: Ollama should run automatically as a service after installation

To confirm the service is up, open http://localhost:11434 in your browser. You should see a simple "Ollama is running" message.
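The same check can be run from a terminal:

```shell
# Prints "Ollama is running" when the service is up on the default port
curl http://localhost:11434
```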
### 4. Download Required Models

For the workshop, we recommend having these models available:

### 5. Install the Ollama SDK CLI
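The model list and the CLI install command were not preserved in this guide. The following is a sketch covering steps 4 and 5, in which the model names and the npm package name are assumptions, not taken from the workshop materials:

```shell
# Step 4: pull models (names are assumptions - substitute the models
# listed by the workshop organizers)
ollama pull llama3.2
ollama pull nomic-embed-text

# Step 5: install the SDK CLI globally (package name is an assumption)
npm install -g ollama-sdk
```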
## Workshop Directory Structure

After setup, your workshop directory will look like this:

## Create Documents for Semantic Search
The semantic search example requires text documents. Create a sample documents directory:
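A minimal sketch, assuming plain `.txt` files (the file names and contents below are illustrative, not from the workshop materials):

```shell
# Create a documents/ folder with a couple of small sample files
mkdir -p documents

cat > documents/ollama.txt << 'EOF'
Ollama runs large language models locally on your own machine.
EOF

cat > documents/search.txt << 'EOF'
Semantic search matches documents by meaning rather than exact keywords.
EOF
```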
## Verify Setup

Run the setup verification script. It checks that:

- Ollama is running
- Required models are available
- The Ollama SDK is properly installed
- Example files are accessible
