Supported on .NET 5, .NET 6, .NET 7, .NET 8, .NET Framework 4.6, 4.6.1, 4.7, 4.7.2, 4.8, 4.8.1, and .NET Core 3.1

InferEngineFactory.Create Method

Creates an inference engine instance for the specified backend type.

Definition

Namespace: DeploySharp.Engine
Assembly: DeploySharp (in DeploySharp.dll) Version: 0.0.4+6e8a2e904469617cd59619d666c0e272985c0e33
C#
public static IModelInferEngine Create(
	InferenceBackend backend
)

Parameters

backend  InferenceBackend
The inference backend type to create (OpenVINO, ONNX Runtime, etc.)

Return Value

IModelInferEngine
An initialized inference engine implementing IModelInferEngine. The returned engine requires a call to LoadModel(IConfig) before it can be used.

Remarks

The created engine is in an unloaded state; LoadModel must be called before inference.

Implementations should optimize backend-specific initialization but defer heavy operations until LoadModel is called.

Exceptions

NotSupportedException Thrown when an unsupported backend type is requested.
InvalidOperationException Thrown when engine initialization fails.
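Example

The creation and loading flow described above can be sketched as follows. This is a minimal, illustrative sketch: the enum member name `InferenceBackend.OpenVINO` and the existence of an `IConfig` instance named `config` are assumptions based on this page, not verified against the DeploySharp source.

```csharp
using System;
using DeploySharp.Engine;

class Example
{
    static void Main()
    {
        try
        {
            // Create an engine for a specific backend. The engine is
            // returned in an unloaded state.
            // NOTE: "OpenVINO" as an enum member name is an assumption.
            IModelInferEngine engine = InferEngineFactory.Create(InferenceBackend.OpenVINO);

            // LoadModel(IConfig) must be called before running inference.
            // "config" is a placeholder for an IConfig implementation
            // appropriate to the chosen backend.
            // engine.LoadModel(config);
        }
        catch (NotSupportedException ex)
        {
            // Thrown when the requested backend type is not supported.
            Console.WriteLine($"Backend not supported: {ex.Message}");
        }
        catch (InvalidOperationException ex)
        {
            // Thrown when engine initialization fails.
            Console.WriteLine($"Engine initialization failed: {ex.Message}");
        }
    }
}
```

Catching both documented exception types at the creation site lets callers fall back to an alternative backend when the preferred one is unavailable.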

See Also