Settings

Configure your AI preferences

AI Mode

Choose between local offline AI and cloud-based NVIDIA NIMs


Offline (Local)

Runs models locally via Ollama or llama.cpp. No internet connection required.

Online (NVIDIA NIMs)

Uses the NVIDIA NIMs cloud API for access to more powerful models. Requires an internet connection.
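
The two modes above essentially select which API endpoint the app talks to. A minimal sketch of that selection, assuming the function name and mode strings are hypothetical (not the app's actual code); the URLs are the default local Ollama port and NVIDIA's hosted NIM base endpoint:

```python
# Hypothetical mapping from the AI Mode setting to a generation endpoint.
# Names and mode strings here are assumptions for illustration only.

OLLAMA_LOCAL_URL = "http://localhost:11434/api/generate"                # default Ollama port
NIM_CLOUD_URL = "https://integrate.api.nvidia.com/v1/chat/completions"  # NVIDIA hosted NIM API

def endpoint_for_mode(mode: str) -> str:
    """Return the generation endpoint for the selected AI mode."""
    if mode == "offline":
        return OLLAMA_LOCAL_URL  # local model server, works with no internet
    if mode == "online":
        return NIM_CLOUD_URL     # cloud API, needs internet and an API key
    raise ValueError(f"unknown AI mode: {mode!r}")

print(endpoint_for_mode("offline"))
```

The online endpoint is OpenAI-compatible, so the same request-building code can often serve both modes with only the base URL and API key swapped.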