# Choosing an LLM Provider



## Provider/Model Breakdown

| Model | Strengths | Best For |
|---|---|---|
| Anthropic Claude 3.5 Sonnet | Exceptional use of UE C++ patterns<br>Macro usage<br>Great documentation | Complex systems, performant code |
| Anthropic Claude 3.5 Haiku | Use of UE C++ patterns<br>Macro usage<br>Minimal or no documentation<br>Fast | Less complex systems, prototypes, affordability |
| OpenAI GPT o3-mini | Great use of UE C++ patterns<br>Macro usage<br>Great documentation | Complex systems requiring thoughtful logic reasoning, performance-minded code, great pricing |
| OpenAI GPT o1 | Exceptional use of UE C++ patterns<br>Macro usage<br>Exceptional documentation | Complex systems requiring advanced logic reasoning, performance-minded code |
| OpenAI GPT-4o | Decent use of UE C++ patterns<br>Moderate code structure | Medium complexity translations |
| OpenAI GPT-4o-mini | Inconsistent use of UE C++ patterns<br>General code structure<br>Fast | Less complex translations, prototypes, affordability |
| Gemini 2.0 Flash Thinking | Great use of UE C++ patterns<br>Macro usage<br>Decent documentation | Complex systems requiring advanced logic reasoning, performance-minded code, massive 2M token context window |
| Gemini 2.0 Pro | Good use of UE C++ patterns<br>Macro usage<br>Good documentation | Complex systems, performance-minded code, massive 2M token context window |
| Gemini 2.0 Flash | Decent use of UE C++ patterns<br>Macro usage<br>Good documentation | Fast, complex systems, performant code, huge 1M token context window |
| Gemini 1.5 Pro | Adequate use of UE C++ patterns<br>Okay documentation | Less complex systems, massive 2M token context window |
| Gemini 1.5 Flash | Use of UE C++ patterns<br>Okay documentation | Very fast, quick conversions, huge 1M token context window |
| DeepSeek R1 | Adequate use of UE C++ patterns<br>Macro usage<br>Acceptable documentation | Complex systems requiring thoughtful logic reasoning, performance-minded code, cost-effective |
| DeepSeek V3 | Adequate use of UE C++ patterns<br>Moderate code structure<br>Moderate macro usage<br>Documentation | Quick conversions, prototypes, affordability |
| Ollama | Local processing<br>Wide range of model capabilities<br>Ability to use custom models | Full control of the model, 100% privacy, offline work, cost-effectiveness |

> [!NOTE]
> Claude 3.5 Sonnet and o3-mini typically produce the most idiomatic Unreal Engine C++ code, with proper macro usage and memory-management patterns, at reasonable pricing.
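
For context, the "UE C++ patterns", "macro usage", and "memory management" called out in the table look roughly like the sketch below. This is purely illustrative, with hypothetical class and property names, not actual NodeToCode output:

```cpp
// Illustrative sketch only (hypothetical names): the kind of idiomatic UE C++
// the stronger models tend to produce when translating a Blueprint graph.
// MyTranslatedActor.h
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyTranslatedActor.generated.h"

UCLASS(Blueprintable)
class AMyTranslatedActor : public AActor
{
    GENERATED_BODY()

public:
    // UPROPERTY exposes the variable to the editor and keeps it tracked by the
    // reflection system and garbage collector, matching the Blueprint variable.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Movement")
    float MoveSpeed = 300.0f;

    // TObjectPtr (UE5) is the preferred member type for UObject references.
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Components")
    TObjectPtr<USceneComponent> RootSceneComponent;

    // BlueprintCallable keeps the function usable from other Blueprints.
    UFUNCTION(BlueprintCallable, Category = "Movement")
    void MoveForward(float DeltaTime);
};
```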

> [!IMPORTANT]
> DeepSeek very likely uses API data to train its models, so use it at your own discretion.

## Model Selection Guidelines

1. Consider Your Needs
   - Translation complexity
   - Budget constraints
   - Privacy requirements
   - Quality expectations
2. Balancing Factors
   - Cost per token (a rough cost-estimate sketch follows this list)
   - Translation quality
   - Processing speed
   - Context window size
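
To get a feel for how cost per token plays out in practice, here is a minimal back-of-the-envelope sketch. The token counts and per-million-token prices below are hypothetical placeholders, not actual NodeToCode usage figures or current provider rates; check your provider's pricing page and your own graph sizes:

```cpp
// Rough cost estimate for a single Blueprint-to-C++ translation request.
// All numbers are hypothetical placeholders for illustration only.
#include <cstdio>

int main()
{
    const double InputTokens  = 6000.0;   // Serialized Blueprint graph + prompt (assumed)
    const double OutputTokens = 2500.0;   // Generated header/source text (assumed)

    // Hypothetical pricing in USD per 1M tokens (input / output).
    const double InputPricePerMillion  = 3.0;
    const double OutputPricePerMillion = 15.0;

    const double Cost = (InputTokens  / 1'000'000.0) * InputPricePerMillion
                      + (OutputTokens / 1'000'000.0) * OutputPricePerMillion;

    std::printf("Estimated cost per translation: $%.4f\n", Cost);
    return 0;
}
```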

> [!NOTE]
> More capable models typically cost more but require less manual refinement of the generated code.