kedro_onnx.inference.inference

Contains functions for scoring ONNX models.

Module Contents

Classes

InferenceOptions

Options for the inference session.

Functions

run(...) → Any

Runs an ONNX model.

class kedro_onnx.inference.inference.InferenceOptions

Bases: TypedDict

Options for the inference session.

See more info at: https://onnxruntime.ai/docs/api/python/api_summary.html#inferencesession

sess_options: Any
providers: Any
provider_options: Any
kwargs: Dict[str, Any]
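
As a minimal sketch (assuming these keys map directly onto the onnxruntime.InferenceSession constructor arguments, per the link above), such an options dictionary could be built as follows; the provider name and thread count are illustrative only:

import onnxruntime as ort

# Illustrative session tuning; any onnxruntime.SessionOptions setting applies.
sess_options = ort.SessionOptions()
sess_options.intra_op_num_threads = 2

# Keys mirror the attributes documented above; unused keys are simply omitted.
inference_options = {
    "sess_options": sess_options,
    "providers": ["CPUExecutionProvider"],
}

This dictionary can then be passed to run() through its inference_options argument.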
kedro_onnx.inference.inference.run(model: kedro_onnx.typing.ModelProto, inputs: Union[Dict[str, Any], Any], output_names: Union[Any, None] = None, inference_options: Union[InferenceOptions, None] = None, run_options: Union[Dict[str, Any], None] = None) → Any

Runs an ONNX model.

Parameters:
  • model (ModelProto) – ONNX model.

  • inputs (Union[Dict[str, Any], Any]) – Inputs to the model. Keys are the input names defined in the initial_types or in the model itself; values are the input data. If inputs is not a dictionary, it is wrapped as {"input": inputs} (see the usage sketch below).

  • output_names (Union[Any, None], optional) – Names of the outputs to compute. Defaults to None, which returns all outputs.

  • inference_options (InferenceOptions, optional) – Options for the inference session: sess_options (session options), providers (list of providers), provider_options (list of provider options), and kwargs (Dict[str, Any], other options).

  • run_options (Dict[str, Any], optional) – Options for the inference run.

Returns:

Output of the model.

Return type:

Any
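
A usage sketch, assuming the ModelProto is loaded with onnx.load; the file name "model.onnx", the input name "input", and the input shape are placeholders:

import numpy as np
import onnx

from kedro_onnx.inference.inference import run

# Any serialized ONNX model yields a ModelProto; the path is a placeholder.
model = onnx.load("model.onnx")
features = np.random.rand(1, 4).astype(np.float32)

# Dictionary inputs: keys must match the model's input names.
outputs = run(model, {"input": features})

# Non-dictionary inputs are wrapped as {"input": inputs} automatically.
outputs = run(model, features)

# Session construction can be tuned through inference_options.
outputs = run(
    model,
    features,
    inference_options={"providers": ["CPUExecutionProvider"]},
)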