Model Inputs and Outputs
model_navigator.api.triton.InputTensorSpec
dataclass
Bases: BaseTensorSpec
Stores the specification of a single input tensor.
This includes the name, shape, dtype, and additional parameters available for an input tensor in Triton Inference Server.
Read more in the Triton Inference Server model configuration.
Parameters:

- optional (bool, default: False) – Flag marking that the input is optional for model execution.
- format (Optional[InputTensorFormat], default: None) – The format of the input.
- allow_ragged_batch (bool, default: False) – Flag marking that the input is allowed to be "ragged" in a dynamically created batch.
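A minimal sketch of building an input specification. The base fields name, shape, and dtype are inherited from BaseTensorSpec and are not listed in this excerpt, so passing them to the constructor as shown is an assumption; only optional, format, and allow_ragged_batch are documented above.

```python
import numpy as np

from model_navigator.api.triton import InputTensorSpec

# Variable-length token input; name/shape/dtype are assumed BaseTensorSpec fields.
input_spec = InputTensorSpec(
    name="input_ids",
    shape=(-1,),                 # -1 marks a dynamic dimension
    dtype=np.dtype("int64"),
    optional=False,              # the input must always be provided
    allow_ragged_batch=True,     # allow ragged inputs in a dynamically created batch
)
```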
model_navigator.api.triton.InputTensorFormat
Bases: Enum
Format for an input tensor.
Read more in the Triton Inference Server model configuration.
Parameters:

- FORMAT_NONE – 0
- FORMAT_NHWC – 1
- FORMAT_NCHW – 2
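A short sketch of selecting a layout for an image input, under the same assumption as above that name, shape, and dtype are accepted by the constructor. The enum value maps to the corresponding format field in the Triton model configuration.

```python
import numpy as np

from model_navigator.api.triton import InputTensorFormat, InputTensorSpec

# Channels-last image input declared with an explicit NHWC format.
image_spec = InputTensorSpec(
    name="image",
    shape=(-1, 224, 224, 3),
    dtype=np.dtype("float32"),
    format=InputTensorFormat.FORMAT_NHWC,
)
```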
model_navigator.api.triton.OutputTensorSpec
dataclass
Bases: BaseTensorSpec
Stores the specification of a single output tensor.
This includes the name, shape, dtype, and additional parameters available for an output tensor in Triton Inference Server.
Read more in the Triton Inference Server model configuration.
Parameters: