
# 🚀 Installation Guide

Complete installation instructions for the Universal Firecrawl + Ollama Integration System.

## 📋 Prerequisites

### System Requirements
- Python 3.8+ (Python 3.9+ recommended)
- 4GB+ RAM (for running AI models)
- Internet connection (for web scraping)
- 2GB+ free disk space (for models and reports)

### Required Services
- Ollama - Local AI runtime
- Firecrawl API Key - Web scraping service (free tier available)

## 🎯 Quick Start (Recommended)

### Option 1: Automated Setup

```bash
# Clone the repository
git clone https://github.com/rmc0315/firecrawl-ollama-system.git
cd firecrawl-ollama-system

# Run automated setup
python setup.py
```

### Option 2: Windows Users

```bash
# Clone the repository
git clone https://github.com/rmc0315/firecrawl-ollama-system.git
cd firecrawl-ollama-system

# Run Windows-optimized setup
python setup_windows.py
```

The setup script will:

- ✅ Check Python compatibility
- ✅ Install all dependencies
- ✅ Verify Ollama installation
- ✅ Test Firecrawl connection
- ✅ Create launcher scripts
- ✅ Set up configuration files
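If you are curious what those checks involve, here is a minimal, illustrative sketch of the kind of verification the setup scripts perform (Python version, Ollama reachability). This is an assumption-based sketch, not the actual contents of `setup.py` or `setup_windows.py`.

```python
# Illustrative only: a rough sketch of the kinds of checks the setup scripts
# run. The real setup.py / setup_windows.py may implement these differently.
import sys
import urllib.request


def check_python() -> bool:
    """The project requires Python 3.8+ (3.9+ recommended)."""
    ok = sys.version_info >= (3, 8)
    print(f"{'✅' if ok else '❌'} Python {sys.version.split()[0]}")
    return ok


def check_ollama(host: str = "http://localhost:11434") -> bool:
    """Ollama exposes a local HTTP API; /api/tags lists installed models."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=3) as resp:
            ok = resp.status == 200
    except OSError:
        ok = False
    print(f"{'✅' if ok else '❌'} Ollama reachable at {host}")
    return ok


if __name__ == "__main__":
    sys.exit(0 if check_python() and check_ollama() else 1)
```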

## 🔧 Manual Installation

### Step 1: Install Python Dependencies

```bash
pip install -r requirements.txt
```

### Step 2: Install Ollama

#### Windows
1. Download from ollama.com/download/windows
2. Run the installer
3. Open Command Prompt and run: `ollama serve`

#### macOS
```bash
# Option 1: Download installer
# Visit: https://ollama.com/download/mac

# Option 2: Homebrew
brew install ollama
```

#### Linux
```bash
curl -fsSL https://ollama.com/install.sh | sh
```

### Step 3: Install AI Models

```bash
# Install recommended models
ollama pull llama3.2        # Fast analysis (2GB)
ollama pull qwen3           # Deep reasoning (5GB)
ollama pull qwen2.5-coder   # Structured data (4.7GB)
ollama pull phi4            # Comprehensive analysis (9GB)

# Or start with just one model
ollama pull llama3.2
```

### Step 4: Get Firecrawl API Key
1. Visit firecrawl.dev
2. Sign up for a free account
3. Get your API key from the dashboard
4. The system will prompt you to enter it on first run

### Step 5: Run the System

```bash
python universal_firecrawl_ollama.py
```
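Under the hood, the system pairs Firecrawl scraping with a local Ollama model. If you want to sanity-check the two services by hand, here is a minimal sketch assuming the `firecrawl-py` and `ollama` packages from requirements.txt; class and method names can vary between library versions, so treat it as an illustration rather than the project's actual code.

```python
# Minimal scrape-then-analyze sketch; assumes firecrawl-py and ollama are
# installed and a Firecrawl API key is available. Signatures may differ
# between library versions - adjust to what requirements.txt pins.
import os

import ollama
from firecrawl import FirecrawlApp  # from the firecrawl-py package

app = FirecrawlApp(api_key=os.environ.get("FIRECRAWL_API_KEY", "fc-your-api-key-here"))

# Scrape a page; depending on the firecrawl-py version the result is an
# object with a .markdown attribute or a dict with a "markdown" key.
raw = app.scrape_url("https://example.com")
content = (raw.markdown if hasattr(raw, "markdown") else raw.get("markdown")) or ""

# Ask a local model about the scraped content.
reply = ollama.chat(
    model="llama3.2",
    messages=[{
        "role": "user",
        "content": "What is this website about?\n\n" + content[:4000],
    }],
)
print(reply["message"]["content"])
```

If this round trip works, the full system should run; if not, the Troubleshooting section below covers the most common failure points.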

## 🖥️ Platform-Specific Instructions

### Windows Setup

#### Using PowerShell
```powershell
# Enable script execution (if needed)
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

# Clone and setup
git clone https://github.com/rmc0315/firecrawl-ollama-system.git
cd firecrawl-ollama-system
python setup_windows.py

# Launch the system
.\launch.bat
# or
.\launch.ps1
```

#### Using Command Prompt
```cmd
git clone https://github.com/rmc0315/firecrawl-ollama-system.git
cd firecrawl-ollama-system
python setup_windows.py

REM Launch the system
launch.bat
```

### macOS/Linux Setup

```bash
# Clone repository
git clone https://github.com/rmc0315/firecrawl-ollama-system.git
cd firecrawl-ollama-system

# Run setup
python3 setup.py

# Make launcher executable and run
chmod +x launch.sh
./launch.sh
```

## 🐳 Docker Installation (Coming Soon)

```bash
# Pull the image
docker pull firecrawl-ollama-system:latest

# Run with GPU support
docker run --gpus all -it firecrawl-ollama-system

# Run CPU-only
docker run -it firecrawl-ollama-system
```

## ⚙️ Configuration

### Environment Variables (Optional)

```bash
# Set API key to skip prompts
export FIRECRAWL_API_KEY="fc-your-api-key-here"

# Custom Ollama host
export OLLAMA_HOST="http://localhost:11434"
```

### Windows Environment Variables

```cmd
REM Command Prompt
set FIRECRAWL_API_KEY=fc-your-api-key-here
```

```powershell
# PowerShell
$env:FIRECRAWL_API_KEY="fc-your-api-key-here"
```

### Configuration File
The system creates config.py automatically, or you can create it manually:

```python
# config.py
FIRECRAWL_API_KEY = "fc-your-api-key-here"
OLLAMA_HOST = "http://localhost:11434"
REPORTS_DIR = "firecrawl_reports"
DEFAULT_SAVE_FORMAT = "html"
```
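As an illustration of how these settings can fit together (not the exact logic in `universal_firecrawl_ollama.py`), configuration might be resolved with environment variables taking precedence over config.py:

```python
# Illustrative resolution order: environment variable -> config.py -> default.
# The actual logic in universal_firecrawl_ollama.py may differ.
import os

try:
    import config  # the optional config.py shown above
except ImportError:
    config = None


def setting(name: str, default: str) -> str:
    """Prefer an environment variable, then config.py, then the default."""
    return os.environ.get(name) or getattr(config, name, None) or default


FIRECRAWL_API_KEY = setting("FIRECRAWL_API_KEY", "")
OLLAMA_HOST = setting("OLLAMA_HOST", "http://localhost:11434")
REPORTS_DIR = setting("REPORTS_DIR", "firecrawl_reports")

print(f"Using Ollama at {OLLAMA_HOST}, reports in {REPORTS_DIR!r}")
```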

## 🧪 Testing Installation

### Quick Test
```bash
# Test Ollama
ollama list

# Test Python packages
python -c "import firecrawl, ollama; print('✅ Dependencies OK')"

# Run system test
python universal_firecrawl_ollama.py
```
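If you prefer testing from Python instead of the shell, a rough equivalent of the quick test above (assuming the `ollama` and `firecrawl-py` packages are installed) could look like this:

```python
# Quick connectivity check for both services - a sketch, not part of the project.
import os

import ollama
from firecrawl import FirecrawlApp

# Ollama: fails with a connection error if the daemon is not running.
info = ollama.list()
print(f"✅ Ollama responded with {len(info['models'])} installed model(s)")

# Firecrawl: constructing the client only checks that a key is present; a real
# scrape (see the Quick Start sketch above) confirms the key actually works.
app = FirecrawlApp(api_key=os.environ.get("FIRECRAWL_API_KEY", "fc-your-api-key-here"))
print("✅ Firecrawl client created")
```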

### Comprehensive Test
1. Run the system: `python universal_firecrawl_ollama.py`
2. Choose option 1 (Single Website Analysis)
3. Enter URL: `https://example.com`
4. Enter task: "What is this website about?"
5. Select any available model
6. Verify the analysis completes and offers to save a report

## 🚨 Troubleshooting

### Common Issues

#### "No module named 'firecrawl'"

```bash
# Ensure you're in the right virtual environment
source venv/bin/activate       # Linux/Mac
# or
venv\Scripts\activate.bat      # Windows

# Reinstall dependencies
pip install -r requirements.txt
```

#### "Ollama connection failed"

```bash
# Start Ollama service
ollama serve

# Check if running
curl http://localhost:11434/api/tags
```

#### "No models detected"

```bash
# Install at least one model
ollama pull llama3.2

# Check installed models
ollama list
```

#### "Permission denied" (Linux/Mac)

```bash
# Make scripts executable
chmod +x launch.sh
chmod +x setup.py
```

### Getting Help
- 📖 Check TROUBLESHOOTING.md
- 🐛 Report issues on GitHub
- 💬 Join our Discord community

## 🎯 What's Next?

After installation:

1. Run your first analysis - Try option 1 with a simple website
2. Explore model comparison - See how different AI models analyze the same content
3. Generate reports - Try different export formats
4. Configure settings - Use option 6 to customize your setup
5. Join the community - Share your results and get help

Ready to analyze the web with local AI? Let's go! 🚀