Applies to .NET 5, .NET 6, .NET 7, .NET 8, .NET Framework 4.6, 4.6.1, 4.7, 4.7.2, 4.8, 4.8.1, and .NET Core 3.1

DeploySharp.Engine Namespace

Contains runtime execution engines for model deployment.

Remarks

Provides execution environments and runtime implementations for different deployment targets (CPU, GPU, edge devices, etc.).

Classes

DisplayNameAttribute Specifies a display name for enumeration fields.
EnumExtensions Provides extension methods for working with enums.
InferEngineFactory Factory class for creating inference engine instances based on backend type.
OnnxRuntimeInferEngine ONNX Runtime inference engine implementation.
OpenVinoInferEngine OpenVINO inference engine implementation conforming to the IModelInferEngine interface.

Interfaces

IModelInferEngine Defines the core interface for model inference engines.

Enumerations

DeviceType Represents hardware device types available for inference operations.
InferenceBackend Represents software inference backends supported by the system.
OnnxRuntimeDeviceType Defines hardware acceleration device types supported by ONNX Runtime.
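
示例

The types above are typically combined as follows: pick a software backend (InferenceBackend) and a hardware target (DeviceType), then ask InferEngineFactory for a matching IModelInferEngine. The sketch below illustrates that flow only; the `Create` method name, its parameter order, and the enum member names are assumptions inferred from the type descriptions, not a confirmed DeploySharp API. Consult the InferEngineFactory reference page for the actual signature.

```csharp
using DeploySharp.Engine;

// Illustrative sketch -- method and member names are assumed, not verified.
// Select the ONNX Runtime backend targeting the CPU; the factory returns
// an engine through the common IModelInferEngine interface, so callers
// stay independent of the concrete backend (OnnxRuntimeInferEngine,
// OpenVinoInferEngine, ...).
IModelInferEngine engine = InferEngineFactory.Create(
    InferenceBackend.OnnxRuntime,  // software inference backend
    DeviceType.CPU);               // hardware device for inference
```

Because every engine implements IModelInferEngine, switching from ONNX Runtime to OpenVINO should require changing only the enum values passed to the factory, not the inference code itself.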