Supported target frameworks: .NET 5, .NET 6, .NET 7, .NET 8, .NET Framework 4.6, .NET Framework 4.6.1, .NET Framework 4.7, .NET Framework 4.7.2, .NET Framework 4.8, .NET Framework 4.8.1, and .NET Core 3.1

InferenceBackend Enumeration

Represents the software inference backends supported by the system.

Definition

Namespace: DeploySharp.Engine
Assembly: DeploySharp (in DeploySharp.dll) Version: 0.0.4+6e8a2e904469617cd59619d666c0e272985c0e33
C#
public enum InferenceBackend

Remarks

Each backend provides optimized execution for specific hardware configurations.

Members

OpenVINO (0) — Intel's OpenVINO toolkit. Optimized for Intel CPUs, integrated GPUs, and VPUs.
OnnxRuntime (1) — Microsoft's ONNX Runtime. Cross-platform inference accelerator with multiple execution providers.
TensorRT (2) — NVIDIA's TensorRT. High-performance deep learning inference optimizer and runtime.
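The members above carry the underlying integer values 0 through 2. A minimal sketch of selecting and describing a backend is shown below; the enum is redeclared here so the example compiles without a reference to DeploySharp.dll, and the `Describe` helper is hypothetical, not part of the DeploySharp API.

```csharp
using System;

// Mirrors the DeploySharp.Engine definition documented above;
// redeclared so this sketch is self-contained.
public enum InferenceBackend
{
    OpenVINO = 0,    // Intel OpenVINO toolkit
    OnnxRuntime = 1, // Microsoft ONNX Runtime
    TensorRT = 2     // NVIDIA TensorRT
}

public static class BackendDemo
{
    // Hypothetical helper: maps each backend to a short description.
    public static string Describe(InferenceBackend backend) => backend switch
    {
        InferenceBackend.OpenVINO    => "OpenVINO: Intel CPUs, integrated GPUs, VPUs",
        InferenceBackend.OnnxRuntime => "ONNX Runtime: cross-platform, multiple execution providers",
        InferenceBackend.TensorRT    => "TensorRT: NVIDIA high-performance inference optimizer and runtime",
        _ => throw new ArgumentOutOfRangeException(nameof(backend))
    };

    public static void Main()
    {
        // Enum members expose their underlying integer values via a cast.
        Console.WriteLine((int)InferenceBackend.TensorRT); // prints 2
        Console.WriteLine(Describe(InferenceBackend.OpenVINO));
    }
}
```

The switch expression requires C# 8.0 or later, which is available on all of the target frameworks from .NET Core 3.1 upward.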

See Also