Choose between local offline AI and cloud-based NVIDIA NIMs
Offline (Local)
Uses locally running models via Ollama or llama.cpp. No internet required.
Online (NVIDIA NIMs)
Uses the NVIDIA NIMs cloud API to access more powerful models. Requires an internet connection.
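The two modes above differ only in where the request goes. A minimal sketch of how a client might target each backend, assuming the documented defaults (Ollama's local REST API on port 11434, NVIDIA's hosted OpenAI-compatible endpoint); the `build_request` helper, model names, and `NVIDIA_API_KEY` variable are illustrative assumptions, not part of this project:

```python
import json
import os

def build_request(mode: str, prompt: str):
    """Build (url, headers, body) for the chosen backend.

    Hypothetical helper for illustration; model names are placeholders.
    """
    if mode == "offline":
        # Ollama serves a local REST API on port 11434 by default;
        # no internet connection or API key is needed.
        url = "http://localhost:11434/api/generate"
        headers = {"Content-Type": "application/json"}
        payload = {"model": "llama3", "prompt": prompt, "stream": False}
    elif mode == "online":
        # NVIDIA's hosted NIM endpoints expose an OpenAI-compatible
        # chat API; an API key is required.
        url = "https://integrate.api.nvidia.com/v1/chat/completions"
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('NVIDIA_API_KEY', '')}",
        }
        payload = {
            "model": "meta/llama3-8b-instruct",  # placeholder model id
            "messages": [{"role": "user", "content": prompt}],
        }
    else:
        raise ValueError(f"unknown mode: {mode}")
    return url, headers, json.dumps(payload)
```

Sending the request (e.g. with `requests.post(url, headers=headers, data=body)`) is the same in both modes, which is what makes the offline/online switch a single configuration choice rather than two code paths.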