Description
Describe the bug
When using WinML with ONNX Runtime (ORT), both model compilation and execution fail if the model path includes non-ASCII Unicode characters. Specifically, when a model file path containing such characters is provided, either while compiling the model for an Execution Provider or when loading and running a precompiled model, WinML throws the following exception:
```
[E:onnxruntime:, inference_session.cc:2545 onnxruntime::InferenceSession::Initialize::<lambda_b590a375cc4159bef6c92b76b4894c14>::operator ()] Exception during initialization: No mapping for the Unicode character exists in the target multi-byte code page.
```
The issue does not occur with the CPU Execution Provider, where inference completes successfully. With the OpenVINO Execution Provider, however, the exception above is thrown. I have not tested other Execution Providers, but they may also be affected.
Steps to reproduce the bug
- Select OpenVINO Execution Provider.
- Follow the C# compilation and inference instructions in the documentation: Run ONNX models using the ONNX Runtime included in Windows ML
```csharp
// Prepare compilation options
OrtModelCompilationOptions compileOptions = new(sessionOptions);
compileOptions.SetInputModelPath(modelPath);
compileOptions.SetOutputModelPath(compiledModelPath);

// Compile the model
compileOptions.CompileModel();

// Create inference session using compiled model
using InferenceSession session = new(compiledModelPath, sessionOptions);
```
- Provide a model path that contains non-ASCII characters (in my case, Japanese characters) when compiling the model.
- Use the same model path when running inference with the compiled model.
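For clarity, the steps above can be condensed into one repro sketch. The paths below (including the Japanese folder name) and the execution-provider setup comment are illustrative placeholders, not part of the original report; the API calls are the ones shown in the documentation snippet above:

```csharp
using Microsoft.ML.OnnxRuntime;

// Illustrative paths: any path containing non-ASCII characters reproduces the failure.
string modelPath = @"C:\モデル\model.onnx";
string compiledModelPath = @"C:\モデル\model.compiled.onnx";

using var sessionOptions = new SessionOptions();
// (Register the OpenVINO Execution Provider on sessionOptions here,
//  following the Windows ML documentation; with the default CPU EP the
//  same paths work fine.)

// Compilation fails with:
//   "No mapping for the Unicode character exists in the target multi-byte code page."
OrtModelCompilationOptions compileOptions = new(sessionOptions);
compileOptions.SetInputModelPath(modelPath);
compileOptions.SetOutputModelPath(compiledModelPath);
compileOptions.CompileModel();

// Loading a precompiled model from the same non-ASCII path fails with the same exception.
using InferenceSession session = new(compiledModelPath, sessionOptions);
```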
Expected behavior
The model should compile successfully, and inference should run without errors.
Screenshots
No response
NuGet package version
None
Packaging type
No response
Windows version
No response
IDE
No response
Additional context
Windows App SDK 1.8.251106002
Windows ML Runtime Intel OpenVINO Execution Provider 1.8.26.0