[Linux] [Arch] [AMD GPU] onnxruntime unable to load ROCm drivers #13

@torkd

I have been trying to run FoxyFace on my machine with moderate success. I can run Babble fine on the CPU, but when I try to run it on the GPU, onnxruntime seems unable to find the ROCm drivers, even though they are installed.

For clarity, my setup is as follows:

  • OS: Arch Linux x86_64
  • Kernel: 6.18.2-arch2-1
  • CPU: AMD Ryzen 7 7800X3D
  • GPU: AMD ATI Radeon RX 7900 XT

pacman -Qs rocm:

local/hipblas 7.1.1-1
  ROCm BLAS marshalling library
local/hsa-rocr 7.1.1-2
  HSA Runtime API and runtime for ROCm
local/rocblas 7.1.1-1
  Next generation BLAS implementation for ROCm platform
local/rocm-core 7.1.1-1
  AMD ROCm core package (version files)
local/rocm-device-libs 2:7.1.1-2
  AMD specific device-side language runtime libraries
local/rocm-llvm 2:7.1.1-2
  Radeon Open Compute - LLVM toolchain (llvm, clang, lld)
local/rocm-opencl-runtime 7.1.1-1
  OpenCL implementation for AMD
local/rocm-smi-lib 7.1.1-1
  ROCm System Management Interface Library
local/rocminfo 7.1.1-1
  ROCm Application for Reporting System Info
local/rocrand 7.1.1-1
  Pseudo-random and quasi-random number generator on ROCm
local/rocsolver 7.1.1-1
  Subset of LAPACK functionality on the ROCm platform
local/rocsparse 7.1.1-1
  BLAS for sparse computation on top of ROCm
local/roctracer 7.1.1-1
  ROCm tracer library for performance tracing

These are the steps I followed to install FoxyFace (as per the Install Guide):

pyenv install 3.12.9 # Using pyenv to manage different Python versions
eval "$(pyenv init - bash)"
pyenv shell 3.12.9
git clone --recurse-submodules https://github.com/Jeka8833/FoxyFace.git
cd FoxyFace/FoxyFace
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
pip install onnxruntime-rocm -f https://repo.radeon.com/rocm/manylinux/rocm-rel-7.1.1/ # Installing version 7.1.1 since that's what I have on my system
python Main.py

At this point the program would fail with this error:
/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_pybind11_state.so: cannot enable executable stack as shared object requires: Invalid argument

To fix that I ran:

patchelf --clear-execstack /home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_pybind11_state.so
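
With that, the execstack error goes away and the module imports. For reference, a quick check like this, run from inside the activated venv, shows what the installed wheel reports (a minimal sketch):

# Sanity check of the onnxruntime-rocm wheel inside the venv (sketch).
import onnxruntime as ort

print(ort.__version__)                 # version of the installed wheel
print(ort.get_available_providers())   # should include ROCMExecutionProvider
print(ort.get_device())                # which device this build targets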

Now, when I run the application, I get these logs:
latest.log:

2025-12-30 14:42:00,185 [INFO] src.ui.UiImageUtil - (UiImageUtil.py).allow_change_windows_icon(18) [MainThread]: Failed to set app id, app is not Windows
2025-12-30 14:42:00,190 [INFO] __main__ - (Main.py).__init__(46) [MainThread]: Hello, I'm FoxyFace 1.0.4.3
2025-12-30 14:42:00,199 [INFO] src.config.ConfigManager - (ConfigManager.py).__write_task(120) [Config Manager_0]: Config saved, 0 listeners called
2025-12-30 14:42:00,380 [INFO] src.stream.camera.CameraStream - (CameraStream.py).start_new_camera(53) [MainThread]: Camera started
2025-12-30 14:42:00,405 [ERROR] std - (LoggerManager.py).write(11) [MainThread]: /home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:121: UserWarning: Specified provider 'DmlExecutionProvider' is not in available provider names.Available providers: 'MIGraphXExecutionProvider, ROCMExecutionProvider, CPUExecutionProvider'
warnings.warn(

2025-12-30 14:42:00,405 [ERROR] std - (LoggerManager.py).write(11) [MainThread]: /home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:121: UserWarning: Specified provider 'CUDAExecutionProvider' is not in available provider names.Available providers: 'MIGraphXExecutionProvider, ROCMExecutionProvider, CPUExecutionProvider'
warnings.warn(

2025-12-30 14:42:00,405 [ERROR] std - (LoggerManager.py).write(11) [MainThread]: /home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:121: UserWarning: Specified provider 'CoreMLExecutionProvider' is not in available provider names.Available providers: 'MIGraphXExecutionProvider, ROCMExecutionProvider, CPUExecutionProvider'
warnings.warn(

2025-12-30 14:42:00,421 [INFO] std - (LoggerManager.py).write(11) [MainThread]: *************** EP Error ***************
2025-12-30 14:42:00,421 [INFO] std - (LoggerManager.py).write(11) [MainThread]: EP Error /root/onnx_runtime/onnxruntime/onnxruntime-1.22.2/onnxruntime/python/onnxruntime_pybind_state.cc:1087 std::unique_ptr<onnxruntime::IExecutionProvider> onnxruntime::python::CreateExecutionProviderInstance(const onnxruntime::SessionOptions&, const string&, const ProviderOptionsMap&) ROCM_PATH is set but ROCM wasn't able to be loaded. Please install the correct version of ROCM and MIOpen as mentioned in the GPU requirements page, make sure they're in the PATH, and that your GPU is supported.
when using [('DmlExecutionProvider', {'device_id': '0'}), ('CUDAExecutionProvider', {'device_id': '0'}), ('ROCMExecutionProvider', {'device_id': '0'}), 'CoreMLExecutionProvider', 'CPUExecutionProvider']
2025-12-30 14:42:00,421 [INFO] std - (LoggerManager.py).write(11) [MainThread]: Falling back to ['ROCMExecutionProvider', 'CPUExecutionProvider'] and retrying.
2025-12-30 14:42:00,421 [INFO] std - (LoggerManager.py).write(11) [MainThread]: ****************************************
2025-12-30 14:42:00,435 [WARNING] src.config.ConfigUpdateListener - (ConfigUpdateListener.py).__init__(27) [MainThread]: Failed to call update
Traceback (most recent call last):
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in __init__
  self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 561, in _create_inference_session
  sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: /root/onnx_runtime/onnxruntime/onnxruntime-1.22.2/onnxruntime/python/onnxruntime_pybind_state.cc:1087 std::unique_ptr<onnxruntime::IExecutionProvider> onnxruntime::python::CreateExecutionProviderInstance(const onnxruntime::SessionOptions&, const string&, const ProviderOptionsMap&) ROCM_PATH is set but ROCM wasn't able to be loaded. Please install the correct version of ROCM and MIOpen as mentioned in the GPU requirements page, make sure they're in the PATH, and that your GPU is supported.


The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/src/config/ConfigUpdateListener.py", line 25, in __init__
  self.__update_callback(self.__config_manager)
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/src/pipline/BabblePipeline.py", line 143, in __update_babble_loader_options
  self.__babble_loader.start_new_session(config_manager.config.babble.model_path,
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/src/stream/babble/BabbleModelLoader.py", line 53, in start_new_session
  session = InferenceSession(path, opts, providers=provider)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 485, in __init__
  raise fallback_error from e
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 480, in __init__
  self._create_inference_session(self._fallback_providers, None)
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/venv/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 561, in _create_inference_session
  sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: /root/onnx_runtime/onnxruntime/onnxruntime-1.22.2/onnxruntime/python/onnxruntime_pybind_state.cc:1087 std::unique_ptr<onnxruntime::IExecutionProvider> onnxruntime::python::CreateExecutionProviderInstance(const onnxruntime::SessionOptions&, const string&, const ProviderOptionsMap&) ROCM_PATH is set but ROCM wasn't able to be loaded. Please install the correct version of ROCM and MIOpen as mentioned in the GPU requirements page, make sure they're in the PATH, and that your GPU is supported.
Stack (most recent call last):
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/Main.py", line 96, in <module>
  with RunMainStream(__splash):
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/Main.py", line 53, in __init__
  self.__babble_pipeline: BabblePipeline = BabblePipeline(self.__config_manager, self.__media_pipe_pipeline)
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/src/pipline/BabblePipeline.py", line 37, in __init__
  self.__babble_loader_options_listener: ConfigUpdateListener = self.__register_change_babble_loader_options()
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/src/pipline/BabblePipeline.py", line 140, in __register_change_babble_loader_options
  return self.__config_manager.create_update_listener(self.__update_babble_loader_options, watch_array, True)
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/src/config/ConfigManager.py", line 60, in create_update_listener
  listener = ConfigUpdateListener(self, update_callback, call_on_create, watched_elements)
File "/home/dav/VR/FaceTracking/FoxyFace/FoxyFace/src/config/ConfigUpdateListener.py", line 27, in __init__
  _logger.warning("Failed to call update", exc_info=True, stack_info=True)
2025-12-30 14:42:00,438 [INFO] src.stream.vrcft.VrcftAutoConnect - (VrcftAutoConnect.py).__start_loop(66) [VRCFT Auto Connect]: Starting VRCFT Auto Connect
2025-12-30 14:42:02,864 [INFO] src.config.ConfigManager - (ConfigManager.py).__write_task(108) [MainThread]: Config did not change
2025-12-30 14:42:04,443 [INFO] src.stream.vrcft.VrcftAutoConnect - (VrcftAutoConnect.py).__start_loop(86) [VRCFT Auto Connect]: Stopped VRCFT Auto Connect
2025-12-30 14:42:04,444 [ERROR] std - (LoggerManager.py).flush(14) [MainThread]: <src.LoggerManager.LoggerWriter object at 0x7f1d5495a930>
2025-12-30 14:42:04,444 [INFO] std - (LoggerManager.py).flush(14) [MainThread]: <src.LoggerManager.LoggerWriter object at 0x7f1d5495a930>
2025-12-30 14:42:04,447 [ERROR] std - (LoggerManager.py).flush(14) [MainThread]: <src.LoggerManager.LoggerWriter object at 0x7f1d5495a930>

The line that catches my eye is this one:
RuntimeError: /root/onnx_runtime/onnxruntime/onnxruntime-1.22.2/onnxruntime/python/onnxruntime_pybind_state.cc:1087 std::unique_ptr<onnxruntime::IExecutionProvider> onnxruntime::python::CreateExecutionProviderInstance(const onnxruntime::SessionOptions&, const string&, const ProviderOptionsMap&) ROCM_PATH is set but ROCM wasn't able to be loaded. Please install the correct version of ROCM and MIOpen as mentioned in the GPU requirements page, make sure they're in the PATH, and that your GPU is supported.

Is it supposed to look into the /root folder?
Also, since the error mentions MIOpen, I went ahead and installed that as well (sudo pacman -S miopen-hip), but nothing changed.
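One thing I can still check from inside the venv is whether the dynamic loader resolves the libraries I assume the ROCm execution provider needs (the HIP runtime, MIOpen, rocBLAS). A rough sketch; the sonames below are guesses and may differ between ROCm versions:

# Check whether the ROCm libraries resolve from the venv's Python process.
# Sketch: the library names/sonames are assumptions, not confirmed by onnxruntime.
import ctypes

candidates = [
    "libamdhip64.so", "libamdhip64.so.7",   # HIP runtime
    "libMIOpen.so", "libMIOpen.so.1",       # MIOpen
    "librocblas.so",                        # rocBLAS
]
for name in candidates:
    try:
        ctypes.CDLL(name)
        print(f"OK      {name}")
    except OSError as error:
        print(f"FAILED  {name}: {error}")
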
ROCm itself is installed correctly; I've used it successfully with other applications, and rocminfo reports the correct information.
I'm at a bit of a loss; I'm pretty sure I've tried everything I could think of or found online.
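
If a reproduction outside FoxyFace helps, I'd expect a minimal session like this to hit the same error. A sketch, assuming the onnx package is also installed in the venv (pip install onnx) so a tiny model can be built in memory:

# Minimal ROCMExecutionProvider reproduction, independent of FoxyFace (sketch).
import numpy as np
import onnx
import onnx.helper as oh
import onnxruntime as ort

# Build a one-op Identity model entirely in memory.
x = oh.make_tensor_value_info("x", onnx.TensorProto.FLOAT, [1, 4])
y = oh.make_tensor_value_info("y", onnx.TensorProto.FLOAT, [1, 4])
graph = oh.make_graph([oh.make_node("Identity", ["x"], ["y"])], "rocm_check", [x], [y])
model = oh.make_model(graph, opset_imports=[oh.make_opsetid("", 13)])
model.ir_version = 10  # stay compatible with older onnxruntime builds

# Request the ROCm EP with a CPU fallback, like FoxyFace ends up doing.
sess = ort.InferenceSession(
    model.SerializeToString(),
    providers=["ROCMExecutionProvider", "CPUExecutionProvider"],
)
print("active providers:", sess.get_providers())
print(sess.run(None, {"x": np.zeros((1, 4), dtype=np.float32)}))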
