# Tutorials
Learn how to deploy models and run inference.
## Prerequisites
- `NMP_BASE_URL` environment variable set to your platform URL
- Appropriate API credentials (NGC API key, HuggingFace token, or external provider keys)
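A minimal sketch of setting these up in a shell session. The URL and credential values are placeholders, and the credential variable names (`NGC_API_KEY`, `HF_TOKEN`) are assumptions for illustration; check your deployment's configuration for the exact names it expects.

```shell
# Point tooling at your platform deployment (placeholder URL).
export NMP_BASE_URL="https://nmp.example.com"

# Credentials for the model sources you plan to use; set only the ones
# your deployment source requires. Variable names are illustrative.
export NGC_API_KEY="<your-ngc-api-key>"
export HF_TOKEN="<your-huggingface-token>"
```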
## Guides
- Deploy Models — Deploy from NGC, HuggingFace, Customizer, or external providers
- Run Inference — Route requests via model entity, provider, or OpenAI routing
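As a sketch of the OpenAI routing option mentioned above: the request body follows the OpenAI chat completions convention, but the endpoint path and model name here are assumptions for illustration, not confirmed platform routes — see the Run Inference guide for the exact paths.

```shell
# Example OpenAI-style request body (model name is illustrative).
PAYLOAD='{"model":"meta/llama-3.1-8b-instruct","messages":[{"role":"user","content":"Hello"}]}'

# Send it to the platform's OpenAI-compatible endpoint (path assumed from
# the OpenAI API convention; requires NMP_BASE_URL and credentials set):
#   curl -s "${NMP_BASE_URL}/v1/chat/completions" \
#     -H "Authorization: Bearer ${NGC_API_KEY}" \
#     -H "Content-Type: application/json" \
#     -d "${PAYLOAD}"

echo "${PAYLOAD}"
```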