# Ollama text generation
## What is Ollama?
Ollama is a free program that lets you run AI models directly on your computer. This means:
- No internet required - Works offline
- No monthly fees - Completely free to use
- Private - Your data stays on your computer
- Fast - No waiting for online services
Think of it like having your own personal AI generator living on your computer!
## Step 1: Download and Install Ollama
- Go to [ollama.com](https://ollama.com/)
- Click the download button and select your system (Windows, Mac, or Linux)
- Run the downloaded file and follow the installation steps
- That's it! Ollama is now installed.
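
If you want to confirm the installation worked, you can open a terminal and ask Ollama for its version (the exact number depends on when you installed it):

```
# Prints the installed Ollama version; "command not found" means the install
# did not finish or the terminal needs to be reopened
ollama --version
```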
## Step 2: Download an AI Model
After installing Ollama, you need to download an AI model (think of it as the "brain" for your AI):
- Open Command Prompt (Windows) or Terminal (Mac/Linux):
  - Windows: Press `Windows key + R`, type `cmd`, press Enter
  - Mac: Press `Cmd + Space`, type `Terminal`, press Enter
  - Linux: Press `Ctrl + Alt + T`
- Type the command and press Enter:

  ```
  ollama run llama3.2
  ```

- Wait for the download - this might take 5-30 minutes depending on your internet speed
- You'll see a `>>>` prompt when it's ready - type `/bye` to close it. You can verify the download with the quick check after this list
- Visit [ollama.com/search](https://ollama.com/search) to see what other models you can download
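
Once the download finishes, you can confirm the model is installed by listing the models Ollama knows about (names, sizes, and dates will differ on your machine):

```
# Lists locally installed models; llama3.2 should appear in the NAME column
ollama list
```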
## Step 3: Allow Ollama to be accessed from the web
If you are running FMG locally, you don't need this step. But if you want Ollama to be available from the web version of FMG, you need to set the `OLLAMA_ORIGINS` environment variable. A quick way to verify the setting is shown after the platform-specific steps below.
### For Windows Users
- Press `Windows key + R`
- Type `sysdm.cpl` and press Enter
- Click the "Environment Variables..." button
- Click "New..." and enter:
  - Variable name: `OLLAMA_ORIGINS`
  - Variable value: `https://azgaar.github.io,https://*afmg.netlify.app,http://127.0.0.1:5501`
- Click "OK" on everything
- Restart the Ollama server if it's running. You may need to restart your computer as well
### For Mac Users
- Open Terminal
- Type this command:

  ```
  nano ~/.zshrc
  ```

- Add these two lines at the end:

  ```
  export OLLAMA_HOST="0.0.0.0"
  export OLLAMA_ORIGINS="https://azgaar.github.io,https://*afmg.netlify.app,http://127.0.0.1:5501"
  ```

- Save and exit:
  - Press `Ctrl + X`
  - Press `Y` to confirm
  - Press `Enter` to save
- Apply the changes:

  ```
  source ~/.zshrc
  ```

- Restart your computer
### For Linux Users
- Open Terminal
- Type these commands one by one:

  ```
  echo 'export OLLAMA_HOST="0.0.0.0"' >> ~/.bashrc
  echo 'export OLLAMA_ORIGINS="https://azgaar.github.io,https://*afmg.netlify.app,http://127.0.0.1:5501"' >> ~/.bashrc
  source ~/.bashrc
  ```

- Restart your computer
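
Once the server from Step 4 is running, you can sanity-check the origins setting by sending a request with an `Origin` header and looking at the response headers. This is only a rough check and assumes `curl` is available; `/api/tags` is Ollama's endpoint for listing local models:

```
# Request the model list while pretending to be the FMG web app.
# If OLLAMA_ORIGINS is set correctly, the response headers should include an
# Access-Control-Allow-Origin entry for https://azgaar.github.io.
curl -i -H "Origin: https://azgaar.github.io" http://localhost:11434/api/tags
```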
## Step 4: Start the server
- Open Command Prompt/Terminal and type `ollama serve`. Leave this window open - Ollama is now running!
- Open Fantasy Map Generator. Open the AI notes generator and select "ollama" from the AI model list
- In the key field, type `llama3.2` (or whatever model you downloaded)
- Update the prompt and click on "generate"

Important: Fantasy Map Generator connects to Ollama at `http://localhost:11434/api/generate`. This should work automatically by default. If you need to change the connection address, you can modify the endpoint in the FMG `ai-generator.js` file.
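
If you want to confirm the endpoint responds before trying it from FMG, a quick manual test from the terminal (assuming `curl` is installed and `llama3.2` is the model you downloaded) looks like this:

```
# Send a minimal, non-streaming generation request to the local Ollama server.
# A JSON reply containing a "response" field means text generation works.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Name a fantasy kingdom.",
  "stream": false
}'
```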
That's It! You can now generate text using your local AI model.
## Troubleshooting
If it doesn't work:
- Check that `ollama serve` is still running in your command prompt/terminal
- Try typing `ollama list` to see if your model downloaded correctly
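
You can also check whether the server is reachable at all. This is a minimal check assuming `curl` is available; Ollama answers on a version endpoint even when no model is loaded:

```
# If the server is up, this returns a small JSON object with the Ollama version.
# "Connection refused" means ollama serve is not running or is listening on another address.
curl http://localhost:11434/api/version
```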