Code Your Way Through Ancient Secrets
Download and install Ollama from https://ollama.ai. On Linux and macOS you can install it in one step with the official script (Windows users can download the installer from the same site instead):

curl -fsSL https://ollama.ai/install.sh | sh
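Before going further, it is worth confirming that the ollama CLI actually ended up on your PATH. A minimal check (the echoed messages here are illustrative, not Ollama's own output):

```shell
#!/bin/sh
# Check whether the ollama CLI is available on PATH.
if command -v ollama >/dev/null 2>&1; then
  MSG="ollama found at $(command -v ollama)"
else
  MSG="ollama not found - re-run the install script"
fi
echo "$MSG"
```

If the command is not found after installation, open a new terminal first, since the installer may have modified your shell profile.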
To allow this website (https://jtm.io/codepedagogy/) to connect to your local Ollama server, you need to set the OLLAMA_ORIGINS environment variable before starting the server. Pick the variant below that matches your operating system and shell:
Windows (Command Prompt, current session only):

set OLLAMA_ORIGINS=https://jtm.io/codepedagogy/,http://localhost:*
ollama serve
Windows (PowerShell, current session only):

$env:OLLAMA_ORIGINS="https://jtm.io/codepedagogy/,http://localhost:*"
ollama serve
Windows (permanent): press Win + R, type sysdm.cpl, and press Enter. Under the Advanced tab, click Environment Variables and add a new variable:

Name: OLLAMA_ORIGINS
Value: https://jtm.io/codepedagogy/,http://localhost:*

Then open a new terminal and run:

ollama serve
Linux/macOS (current session only):

export OLLAMA_ORIGINS="https://jtm.io/codepedagogy/,http://localhost:*"
ollama serve
Linux (permanent, for bash):

echo 'export OLLAMA_ORIGINS="https://jtm.io/codepedagogy/,http://localhost:*"' >> ~/.bashrc
source ~/.bashrc
ollama serve
macOS (current login session, including apps launched from the Dock; cleared on reboot):

launchctl setenv OLLAMA_ORIGINS "https://jtm.io/codepedagogy/,http://localhost:*"
ollama serve
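Whichever method you use, the server reads OLLAMA_ORIGINS from its environment when it starts, so the variable must be visible in the same shell that runs ollama serve. A quick way to confirm, shown here with the export form from above:

```shell
#!/bin/sh
# Set the allowlist exactly as above, then confirm the shell sees it.
export OLLAMA_ORIGINS="https://jtm.io/codepedagogy/,http://localhost:*"
echo "OLLAMA_ORIGINS is: $OLLAMA_ORIGINS"
# If this prints an empty value, ollama serve will fall back to its
# default origins and the website's requests will be blocked by CORS.
```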
Download a model suitable for coding assistance. Any one of these will work:

ollama pull codellama:7b        # good for coding, ~4 GB
ollama pull llama3.1:8b         # general purpose, ~5 GB
ollama pull qwen2.5-coder:7b    # excellent for coding, ~4 GB
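After pulling, you can verify the model was downloaded with ollama list. A guarded sketch that also works on machines where the CLI is not installed yet (the fallback message is illustrative):

```shell
#!/bin/sh
# Show downloaded models if the CLI is installed; otherwise say so.
if command -v ollama >/dev/null 2>&1; then
  OUT=$(ollama list 2>&1)
else
  OUT="ollama CLI not installed yet"
fi
echo "$OUT"
```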
Once Ollama is running with CORS configured, the website can connect to it. To stop the server, press Ctrl+C in the terminal. To download additional models later, run ollama pull model-name with the model you want.
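With the server up, you can sanity-check from the command line that it is reachable and that it sends a CORS header for this site. This is a sketch that assumes Ollama's default port 11434 and its /api/tags endpoint; if the server is not running, it simply reports that:

```shell
#!/bin/sh
# Probe the local Ollama server and look for the CORS response header.
ORIGIN="https://jtm.io/codepedagogy/"
if HEADERS=$(curl -s -D - -o /dev/null --max-time 2 \
      -H "Origin: $ORIGIN" http://localhost:11434/api/tags); then
  if printf '%s' "$HEADERS" | grep -qi "access-control-allow-origin"; then
    RESULT="CORS header present - the website should be able to connect"
  else
    RESULT="server reachable but no CORS header - check OLLAMA_ORIGINS"
  fi
else
  RESULT="server not reachable - is ollama serve running?"
fi
echo "$RESULT"
```

If you see the "no CORS header" message, double-check that OLLAMA_ORIGINS was set in the same environment where ollama serve was started, then restart the server.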