# Installation Guide

This guide walks you through installing VariancePro and its dependencies.
## 📋 Requirements

- Python 3.8 or higher
- 8GB+ RAM (recommended for AI model performance)
- Ollama installed locally with the Gemma3 model
- Modern web browser (Chrome, Firefox, Safari, Edge)
- Internet connection (for news intelligence features only)
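The Python version requirement can also be checked programmatically. A minimal sketch using only the standard library (the function name is illustrative, not part of VariancePro):

```python
import sys

def meets_requirement(minimum=(3, 8)):
    """Return True if the running interpreter satisfies the minimum version."""
    return sys.version_info[:2] >= minimum

if __name__ == "__main__":
    if not meets_requirement():
        raise SystemExit("VariancePro requires Python 3.8 or higher")
    print(f"Python {sys.version_info.major}.{sys.version_info.minor} OK")
```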
## 🚀 Installation Steps
### 1. Clone the Repository

```bash
git clone https://github.com/sharkoil/variancepro.git
cd variancepro
```
### 2. Install Python Dependencies

You have three options for installing dependencies:

**Core Dependencies (Recommended)**

```bash
pip install -r requirements.txt
```

**Minimal Dependencies**

```bash
pip install -r requirements-minimal.txt
```

**Full Feature Set**

```bash
pip install -r requirements-full.txt
```
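Whichever option you choose, you can check which core packages are importable before launching. A small sketch using the standard library (the package list mirrors the verification step later on this page; `missing_packages` is illustrative):

```python
import importlib.util

# Packages the verification step below imports; adjust to match
# your chosen requirements file.
CORE_PACKAGES = ["gradio", "pandas", "requests"]

def missing_packages(names):
    """Return the subset of package names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_packages(CORE_PACKAGES)
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("Dependencies OK")
```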
### 3. Install and Set Up Ollama

**Install Ollama**

Visit https://ollama.ai for platform-specific installation instructions.

**Pull Recommended Models**

```bash
# Primary model for analysis
ollama pull gemma3:latest

# Advanced reasoning model
ollama pull deepseek-r1:14b

# Fast response model
ollama pull qwen3:8b

# Multi-modal capabilities
ollama pull llava:latest
```

**Verify Installation**

```bash
ollama list
```
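The installed models can also be listed from Python via Ollama's HTTP API. A hedged sketch, assuming the default `localhost:11434` endpoint and the usual `/api/tags` response shape (a JSON object with a `models` list); both function names are illustrative:

```python
import json
import urllib.request

def model_names(tags_response):
    """Extract model names from an Ollama /api/tags JSON payload."""
    return [m.get("name", "") for m in tags_response.get("models", [])]

def list_installed_models(host="http://localhost:11434"):
    """Fetch the tag list from a running Ollama instance."""
    with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
        return model_names(json.load(resp))

if __name__ == "__main__":
    print(list_installed_models())
```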
### 4. Verify Installation

```bash
# Check Python version
python --version   # Should be 3.8+

# Validate Python environment
python -c "import gradio, pandas, requests; print('Dependencies OK')"

# Check Ollama connectivity
curl http://localhost:11434/api/tags
```
## ⚙️ Environment Setup

**Environment Variables (Linux/macOS)**

```bash
# Set custom ports
export VARIANCEPRO_PORT=7871
export TESTING_FRAMEWORK_PORT=7862
export OLLAMA_HOST=localhost:11434

# Set default model
export DEFAULT_MODEL=gemma3:latest
```

**Environment Variables (Windows)**

```cmd
set VARIANCEPRO_PORT=7871
set TESTING_FRAMEWORK_PORT=7862
set OLLAMA_HOST=localhost:11434
set DEFAULT_MODEL=gemma3:latest
```
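Inside an application, these variables would typically be read with fallbacks to the defaults above. A minimal sketch (the variable names and defaults come from this page; the `load_config` helper is illustrative, not VariancePro's actual code):

```python
import os

def load_config():
    """Read VariancePro settings from the environment, with defaults."""
    return {
        "port": int(os.environ.get("VARIANCEPRO_PORT", "7871")),
        "testing_port": int(os.environ.get("TESTING_FRAMEWORK_PORT", "7862")),
        "ollama_host": os.environ.get("OLLAMA_HOST", "localhost:11434"),
        "default_model": os.environ.get("DEFAULT_MODEL", "gemma3:latest"),
    }
```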
## 🐍 Virtual Environment (Recommended)

**Create and Activate a Virtual Environment**

```bash
# Create virtual environment
python -m venv venv

# Activate it (Linux/macOS)
source venv/bin/activate

# Activate it (Windows)
venv\Scripts\activate
```

**Install Dependencies in the Virtual Environment**

```bash
pip install -r requirements.txt
```
## 🔧 Alternative Installation Methods

**Using Conda**

```bash
# Create conda environment
conda create -n variancepro python=3.9

# Activate environment
conda activate variancepro

# Install dependencies
pip install -r requirements.txt
```

**Using Docker (Coming Soon)**

Docker support is planned for future releases.
## ✅ Post-Installation Verification

**Test the Main Application**

```bash
python app_new.py
# Should start on http://localhost:7871
```

**Test the Enhanced Testing Framework**

```bash
python test_enhanced_nl_to_sql_ui.py
# Should start on http://localhost:7862
```

**Validate Framework Components**

```bash
# Validate syntax
python validate_nl_to_sql_syntax.py

# Test basic functionality
python test_framework_basic.py
```
## 🛠️ Troubleshooting Installation

### Common Issues

**Python Version Issues**

```bash
# Check Python version
python --version

# If you have multiple Python versions installed
python3 --version
python3 -m pip install -r requirements.txt
```

**Dependency Conflicts**

```bash
# Clean install
pip uninstall -r requirements.txt -y
pip install -r requirements.txt
```

**Ollama Connection Issues**

```bash
# Start the Ollama service
ollama serve

# Check that it is running
curl http://localhost:11434/api/tags
```

**Port Conflicts**

```bash
# Check port usage (Windows)
netstat -an | findstr :7871

# Check port usage (Linux/macOS)
lsof -i :7871

# Use an alternative port
python app_new.py --port 7872
```
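Before launching, you can test whether a port is already taken directly from Python. A small standard-library sketch (the `port_in_use` helper is illustrative):

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1)
        # connect_ex returns 0 on a successful connection
        return sock.connect_ex((host, port)) == 0

if __name__ == "__main__":
    for port in (7871, 7862):
        state = "in use" if port_in_use(port) else "free"
        print(f"Port {port}: {state}")
```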
### Performance Optimization

**Memory Usage**

```bash
# Monitor memory usage (requires the psutil package)
python -c "import psutil; print(f'RAM: {psutil.virtual_memory().percent}%')"

# Use lightweight models for faster responses
ollama pull qwen3:8b   # Faster alternative to gemma3
```

**Disk Space**

Ensure you have sufficient disk space for:

- Python dependencies: ~500MB
- Ollama models: 2-8GB per model
- Sample data: ~100MB
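Free disk space can be checked from Python before pulling large models. A short standard-library sketch (`free_gb` is an illustrative helper; the 8 GB figure comes from the per-model estimate above):

```python
import shutil

def free_gb(path="."):
    """Return free disk space at path, in gigabytes."""
    return shutil.disk_usage(path).free / (1024 ** 3)

if __name__ == "__main__":
    needed = 8  # upper end of the per-model estimate above
    available = free_gb()
    print(f"{available:.1f} GB free")
    if available < needed:
        print(f"Warning: less than {needed} GB free; large models may not fit")
```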
## 🎯 Next Steps

After successful installation:

- Read the Quick Start Guide to begin using VariancePro
- Review Configuration for customization options
- Explore the Usage Guide for detailed features
- Check Analysis Types to understand capabilities
## 📞 Installation Support

If you encounter issues during installation:

- Check the Troubleshooting Guide
- Search GitHub Issues
- Create a new issue with:
  - Your operating system and version
  - Python version
  - Complete error messages
  - Steps you followed