# AI Service Setup
To enable AI-powered recommendations, you need to configure an AI service.
- Navigate to the Settings page in Recommendarr.
- Select the AI Service tab.
- Enter your AI service details:
  - API URL:
    - For OpenAI: `https://api.openai.com/v1`
    - For local models (such as Ollama or LM Studio): use your local server URL (e.g., `http://localhost:11434/v1` or `http://localhost:1234/v1`).
    - For other OpenAI-compatible APIs: enter the base URL provided by the service.
  - API Key:
    - Enter your OpenAI API key if using OpenAI.
    - Enter the appropriate API key if required by your chosen service.
    - Some local AI servers do not require a real key; a placeholder value such as `ollama` or `lm-studio` is often accepted.
  - Model: Select a model from the dropdown that is compatible with your API endpoint, or leave the default if your endpoint serves a standard model. See Compatible AI Services for recommendations.
  - Parameters: Adjust advanced settings such as Max Tokens (controls response length) and Temperature (controls creativity/randomness) as needed; the defaults are usually sufficient. The sketches after this list show one way to check which models your endpoint exposes and to try these parameters with a single request.
- Click Save Settings.
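
If you want to double-check the API URL and key before saving them, a quick way is to ask the endpoint for its model list. The sketch below is not part of Recommendarr; it is a minimal stand-alone Python check against the standard OpenAI-compatible `GET /models` route. The Ollama URL and the placeholder key are example values only, substitute whatever you plan to enter in Settings.

```python
# Sketch: list the models an OpenAI-compatible endpoint exposes, so you know
# what to pick in Recommendarr's Model dropdown. URL and key are placeholders.
import json
import urllib.request

API_URL = "http://localhost:11434/v1"   # e.g. Ollama; or https://api.openai.com/v1
API_KEY = "ollama"                      # many local servers accept any placeholder

req = urllib.request.Request(
    f"{API_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)

# OpenAI-compatible servers return {"data": [{"id": "<model name>"}, ...]}
for model in data.get("data", []):
    print(model["id"])
```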
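
Similarly, a one-off chat completion confirms that the model name, Max Tokens, and Temperature values work together before you save them. Again, this is only an illustrative sketch using the standard OpenAI-compatible `POST /chat/completions` route; the model name `llama3` and the parameter values are placeholders, not Recommendarr defaults.

```python
# Sketch: send a single chat completion to verify URL, key, model, and the
# Max Tokens / Temperature values. All values shown are example placeholders.
import json
import urllib.request

API_URL = "http://localhost:11434/v1"
API_KEY = "ollama"
MODEL = "llama3"             # whatever /models reported, or e.g. an OpenAI model ID

payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Recommend one sci-fi TV series."}],
    "max_tokens": 200,       # corresponds to the Max Tokens setting
    "temperature": 0.7,      # corresponds to the Temperature setting
}
req = urllib.request.Request(
    f"{API_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(reply["choices"][0]["message"]["content"])
```

If both requests succeed, the same URL, key, model, and parameters should work once entered in Recommendarr's AI Service settings.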