```csharp
public class OnnxRuntimeInferEngine : IModelInferEngine, IDisposable
```
This class provides the ONNX Runtime-specific implementation of the IModelInferEngine interface, supporting multiple execution providers (CPU, CUDA, TensorRT, OpenVINO, etc.). The engine handles model loading, session management, and tensor data conversion.
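A typical lifecycle, inferred only from the member names listed below, looks roughly like the following sketch. The constructor signature, the ModelConfig properties, and the BuildInputTensor helper are illustrative assumptions, not the library's documented API.

```csharp
// Sketch only: member names come from the tables below; the ModelConfig
// properties and the BuildInputTensor helper are hypothetical placeholders.
var config = new ModelConfig
{
    ModelPath = "model.onnx",              // assumed property name
    ExecutionProvider = "CUDA"             // or "CPU", "TensorRT", "OpenVINO", ...
};

using var engine = new OnnxRuntimeInferEngine();
engine.LoadModel(config);                  // builds the session and session options

DataTensor input = BuildInputTensor();     // hypothetical helper producing the model input
DataTensor output = engine.Predict(input); // input conversion -> inference -> output conversion
```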
| Name | Description |
| --- | --- |
| OnnxRuntimeInferEngine | Initializes a new instance of the ONNX Runtime inference engine |
| CreateInvalidProviderException | Creates a standardized exception for invalid execution provider combinations |
| Dispose | Releases all resources used by the inference engine |
| Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.) |
| ExecuteModelInference | Executes the model inference session |
| Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.) |
| GetHashCode | Serves as the default hash function. (Inherited from Object.) |
| GetType | Gets the Type of the current instance. (Inherited from Object.) |
| LoadModel | Loads and configures the ONNX model based on the provided configuration |
| MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.) |
| Predict | Executes model inference/prediction using the provided input tensor |
| PrepareInputTensors | Converts an input DataTensor to ONNX Runtime format (see the tensor-conversion sketch after this table) |
| ProcessBoolOutput | Processes a bool output tensor (converted to byte) and adds it to the result collection |
| ProcessFloatOutput | Processes a float32 output tensor and adds it to the result collection |
| ProcessIntOutput | Processes an int32 output tensor and adds it to the result collection |
| ProcessOutputTensors | Converts ONNX Runtime outputs to DataTensor format (see the tensor-conversion sketch after this table) |
| ToString | Returns a string that represents the current object. (Inherited from Object.) |
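PrepareInputTensors, ExecuteModelInference, and the ProcessXxxOutput helpers describe a conversion pipeline around the standard ONNX Runtime C# types. The sketch below shows those underlying primitives (DenseTensor, NamedOnnxValue, InferenceSession.Run); the tensor names, shape, and float-only output handling are assumptions, not the engine's actual code.

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Illustrative only: how a DataTensor-like payload is usually bridged to
// ONNX Runtime. The engine's real PrepareInputTensors/ProcessOutputTensors
// may differ; the "input" name and shape are placeholders.
float[] data = new float[1 * 3 * 224 * 224];
var dense = new DenseTensor<float>(data, new[] { 1, 3, 224, 224 });

var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("input", dense)   // PrepareInputTensors equivalent
};

using var session = new InferenceSession("model.onnx");
using var results = session.Run(inputs);               // ExecuteModelInference equivalent

// ProcessFloatOutput equivalent: copy each float32 output back out.
foreach (var value in results)
{
    float[] output = value.AsEnumerable<float>().ToArray();
}
```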
| Name | Description |
| --- | --- |
| inferenceSession | ONNX Runtime inference session |
| inputNodeSize | Count of input nodes |
| inputNodeTypes | List of input node data types |
| modelConfig | Model configuration reference |
| outputNodeSize | Count of output nodes |
| outputNodeTypes | List of output node data types |
| sessionOptions | ONNX Runtime session options (see the execution provider sketch after this table) |
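The sessionOptions and inferenceSession fields map to ONNX Runtime's SessionOptions and InferenceSession types. The sketch below shows the kind of provider selection LoadModel presumably performs, and that CreateInvalidProviderException presumably guards; the switch logic and the supported provider strings are assumptions.

```csharp
using System;
using Microsoft.ML.OnnxRuntime;

// Illustrative provider selection; the engine's actual LoadModel /
// CreateInvalidProviderException logic may differ.
SessionOptions BuildOptions(string provider, int deviceId = 0)
{
    var options = new SessionOptions
    {
        GraphOptimizationLevel = GraphOptimizationLevel.ORT_ENABLE_ALL
    };

    switch (provider)
    {
        case "CUDA":
            options.AppendExecutionProvider_CUDA(deviceId);      // requires the GPU package
            break;
        case "TensorRT":
            options.AppendExecutionProvider_Tensorrt(deviceId);  // requires a TensorRT-enabled build
            break;
        case "OpenVINO":
            options.AppendExecutionProvider_OpenVINO();          // requires an OpenVINO-enabled build
            break;
        case "CPU":
            break; // the CPU provider is always available as the fallback
        default:
            throw new ArgumentException($"Unsupported execution provider: {provider}");
    }

    return options;
}

using var session = new InferenceSession("model.onnx", BuildOptions("CPU"));
```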