onnxruntime with CUDA enabled is packaged with both `onnxruntime_USE_CUDA` and `onnxruntime_DISABLE_CONTRIB_OPS`, which effectively disables CUDA and leads to the following errors at runtime:
```
> onnxruntime_test Phi-3-mini-128k-instruct-onnx/cuda/cuda-int4-rtn-block-32/phi3-mini-128k-instruct-cuda-int4-rtn-block-32.onnx
2024-05-11 10:27:29.056205252 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:2103 CreateInferencePybindStateModule] Init provider bridge failed.
Traceback (most recent call last):
  File "/nix/store/3wc2a16gdvms53vgr2jp9f8z2mv55dkw-python3.11-onnxruntime-1.17.3/bin/.onnxruntime_test-wrapped", line 9, in <module>
    sys.exit(main())
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/tools/onnxruntime_test.py", line 159, in main
    exit_code, _, _ = run_model(args.model_path, args.num_iters, args.debug, args.profile, args.symbolic_dims)
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/tools/onnxruntime_test.py", line 88, in run_model
    sess = onnxrt.InferenceSession(
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from Phi-3-mini-128k-instruct-onnx/cuda/cuda-int4-rtn-block-32/phi3-mini-128k-instruct-cuda-int4-rtn-block-32.onnx failed: This is an invalid model.
In Node, ("/model/layers.0/input_layernorm/LayerNorm", SimplifiedLayerNormalization, "", -1) : ("/model/embed_tokens/Gather/output_0": tensor(float16),"model.layers.0.input_layernorm.weight": tensor(float16),) -> ("/model/layers.0/input_layernorm/output_0": tensor(float16),) , Error No Op registered for SimplifiedLayerNormalization with domain_version of 14
```
```
> onnxruntime_test Phi-3-mini-128k-instruct-onnx/cuda/cuda-fp16/phi3-mini-128k-instruct-cuda-fp16.onnx
2024-05-11 10:27:46.458111088 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:2103 CreateInferencePybindStateModule] Init provider bridge failed.
Traceback (most recent call last):
  File "/nix/store/3wc2a16gdvms53vgr2jp9f8z2mv55dkw-python3.11-onnxruntime-1.17.3/bin/.onnxruntime_test-wrapped", line 9, in <module>
    sys.exit(main())
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/tools/onnxruntime_test.py", line 159, in main
    exit_code, _, _ = run_model(args.model_path, args.num_iters, args.debug, args.profile, args.symbolic_dims)
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/tools/onnxruntime_test.py", line 88, in run_model
    sess = onnxrt.InferenceSession(
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/igdsm7xzsfsbyjfhrvgw23xsxj21fgln-python3-3.11.9-env/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from Phi-3-mini-128k-instruct-onnx/cuda/cuda-fp16/phi3-mini-128k-instruct-cuda-fp16.onnx failed: This is an invalid model.
In Node, ("/model/layers.0/input_layernorm/LayerNorm", SimplifiedLayerNormalization, "", -1) : ("/model/embed_tokens/Gather/output_0": tensor(float16),"model.layers.0.input_layernorm.weight": tensor(float16),) -> ("/model/layers.0/input_layernorm/output_0": tensor(float16),) , Error No Op registered for SimplifiedLayerNormalization with domain_version of 14
```
Describe the bug
`onnxruntime` built with CUDA is packaged with both `onnxruntime_USE_CUDA` and `onnxruntime_DISABLE_CONTRIB_OPS`, which effectively disables CUDA and leads to the runtime errors shown above.

Related upstream issue: microsoft/onnxruntime#20658
Steps To Reproduce
Steps to reproduce the behavior:

1. Enter a shell with onnxruntime and CUDA enabled (sorry, but I'm not sure how to run a one-liner `nix-shell` command with CUDA enabled)
2. In your config or overlay `shell.nix`
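For reference, a minimal `shell.nix` along these lines should reproduce the setup. This is an untested sketch: it assumes a nixpkgs checkout that honors `config.cudaSupport`, and that the Python `onnxruntime` package picks up that setting.

```nix
# Hypothetical shell.nix sketch: enable CUDA globally via the nixpkgs
# config so python3Packages.onnxruntime is built with onnxruntime_USE_CUDA.
{ pkgs ? import <nixpkgs> {
    config.allowUnfree = true;  # the CUDA toolkit is unfree
    config.cudaSupport = true;  # build packages with CUDA where supported
  }
}:
pkgs.mkShell {
  packages = [
    (pkgs.python3.withPackages (ps: [ ps.onnxruntime ]))
  ];
}
```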
Expected behavior
onnxruntime starts executing on GPU
Additional context
Removing `onnxruntime_DISABLE_CONTRIB_OPS` allows the model to start.

Notify maintainers
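As a workaround until this is fixed in nixpkgs, an overlay along the following lines could drop the flag; this is an untested sketch that filters `onnxruntime_DISABLE_CONTRIB_OPS` out of the derivation's `cmakeFlags` (the exact spelling of the flag in the nixpkgs expression may differ).

```nix
# Hypothetical overlay sketch: rebuild onnxruntime without
# onnxruntime_DISABLE_CONTRIB_OPS so contrib ops such as
# SimplifiedLayerNormalization are registered at runtime.
final: prev: {
  onnxruntime = prev.onnxruntime.overrideAttrs (old: {
    cmakeFlags = builtins.filter
      (flag: !(prev.lib.hasInfix "onnxruntime_DISABLE_CONTRIB_OPS" flag))
      (old.cmakeFlags or [ ]);
  });
}
```

Note this triggers a local rebuild of onnxruntime, which is a long compile.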
@jonringer @puffnfresh @ck3d @cbourjau @wexder