Examples

We provide examples of how to use Model Navigator to optimize models in frameworks (PyTorch, TensorFlow2, JAX, ONNX), how to optimize models from existing .nav packages, and how to deploy optimized models on the NVIDIA Triton Inference Server.

Optimize models in frameworks

You can find examples for each supported framework.

PyTorch:

TensorFlow:

JAX:

ONNX:
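The framework examples all follow the same pattern: pass a model and a small profiling dataloader to the framework-specific `optimize` function. A minimal sketch for PyTorch is shown below; it assumes `model_navigator` and `torch` are installed, and the exact signature of `nav.torch.optimize` should be verified against your installed version. The imports are deferred into the function so the module loads without the dependencies present.

```python
def optimize_linear_model():
    """Sketch: optimize a small PyTorch model with Model Navigator.

    Assumes `model_navigator` and `torch` are installed; the API name
    `nav.torch.optimize` follows the public docs but may differ between
    versions.
    """
    import torch
    import model_navigator as nav

    model = torch.nn.Linear(10, 10).eval()
    # The dataloader is a simple iterable of sample inputs used for
    # conversion and profiling.
    dataloader = [torch.randn(2, 10) for _ in range(8)]

    # Runs format conversions and correctness/performance tests, returning
    # a Navigator Package describing the best runtime found.
    return nav.torch.optimize(model=model, dataloader=dataloader)
```

The TensorFlow2, JAX, and ONNX entry points follow the same shape with their respective `nav.tensorflow`, `nav.jax`, and `nav.onnx` namespaces.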

Optimize Navigator Package

The Navigator Package can be reused to re-run optimization, e.g., on new hardware or with newer library versions. The example code can be found in examples/package.
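Re-optimization loads a previously produced .nav package and repeats the conversion and profiling steps in the current environment. A hedged sketch, assuming `model_navigator` is installed and that `nav.package.load` / `nav.package.optimize` match your installed version; the package path is illustrative:

```python
def reoptimize_package(package_path="model.nav"):
    """Sketch: re-run optimization from an existing Navigator Package,
    e.g. after moving to new hardware or newer library versions.

    Assumes `model_navigator` is installed; `package_path` is a
    hypothetical example path.
    """
    import model_navigator as nav

    # Load the serialized package produced by an earlier optimize run.
    package = nav.package.load(package_path)
    # Re-run conversions and profiling against the current environment.
    return nav.package.optimize(package)
```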

Using model on Triton Inference Server

A model optimized by Triton Model Navigator can be used to serve inference through Triton Inference Server. The example code can be found in examples/triton.
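Deployment means turning the optimized package into an entry in a Triton model repository, which the server then loads. A minimal sketch, assuming `model_navigator` is installed; the `nav.triton.model_repository.add_model_from_package` call follows the public API naming but should be checked against your version, and the repository path and model name are illustrative:

```python
def add_to_model_repository(package, repo_path="model_repository"):
    """Sketch: generate a Triton model repository entry from an
    optimized Navigator Package.

    Assumes `model_navigator` is installed; `repo_path` and the model
    name are hypothetical examples.
    """
    import model_navigator as nav

    # Writes the model files and a config.pbtxt for the best runtime
    # into <repo_path>/<model_name>/.
    nav.triton.model_repository.add_model_from_package(
        model_repository_path=repo_path,
        model_name="my_model",
        package=package,
    )
```

The server is then pointed at the generated repository, e.g. `tritonserver --model-repository=model_repository`.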