This is an invalid model #935

Open
KokinSok opened this issue Feb 9, 2024 · 4 comments

Comments


KokinSok commented Feb 9, 2024

Describe the bug
Microsoft.ML.OnnxRuntime.OnnxRuntimeException: '[ErrorCode:InvalidGraph] Load model from C:\Users\User\Desktop\C# Code\OnnxConsoleApp\MLWebApi\Models\WhisperSmall.en\model.onnx failed:This is an invalid model. In Node, ("BeamSearch_node", BeamSearch, "com.microsoft", -1) : ("log_mel": tensor(float),"max_length": tensor(int32),"min_length": tensor(int32),"num_beams": tensor(int32),"num_return_sequences": tensor(int32),"length_penalty": tensor(float),"repetition_penalty": tensor(float),"","","","","",) -> ("sequences",) , Error Unrecognized attribute: unidirectional for operator MultiHeadAttention

==> Context: Bad node spec for node. Name: Attention_0 OpType: MultiHeadAttention'

To Reproduce
I have converted the Hugging Face model openai/whisper-small.en to ONNX Runtime using Olive in a dedicated environment:

python prepare_whisper_configs.py --model_name openai/whisper-small.en
python -m olive.workflows.run --config whisper_cpu_int8.json --setup
python -m olive.workflows.run --config whisper_cpu_int8.json
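Before loading the converted model from the C# service, it may help to sanity-check it with the Python onnxruntime from the same environment that ran Olive, since that is the ORT build whose contrib-op specs the model was exported against. A minimal sketch, assuming the Olive output path below (replace it with the actual BestCandidateModel location):

import onnxruntime as ort

# Placeholder path to the Olive output model; substitute the real one.
model_path = "C:/OpenAI/OutputModels/CandidateModels/cpu-cpu/BestCandidateModel_1/model.onnx"

print("onnxruntime version:", ort.__version__)

# Creating the session runs the same graph validation as the C# InferenceSession,
# so a contrib-op version mismatch (BeamSearch, MultiHeadAttention) would surface here too.
session = ort.InferenceSession(model_path, providers=["CPUExecutionProvider"])
print("model inputs:", [i.name for i in session.get_inputs()])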

Expected behavior
Olive has tons of potential, but it needs much better instructions, especially on package support; I had a lot of trouble getting packages to work properly, even in a clean, fresh environment.

Olive config
C:>pip list
Package Version


accelerate 0.26.1
aiohttp 3.9.1
aiosignal 1.3.1
alembic 1.13.1
async-timeout 4.0.3
attrs 23.2.0
bitstring 4.0.2
certifi 2022.9.24
cffi 1.15.1
charset-normalizer 2.1.1
colorama 0.4.5
coloredlogs 15.0.1
colorlog 6.8.2
contextlib2 21.6.0
contourpy 1.2.0
cryptography 40.0.2
cycler 0.12.1
datasets 2.16.1
Deprecated 1.2.14
dill 0.3.7
ecdsa 0.18.0
esptool 4.5.1
ffmpeg-python 0.2.0
filelock 3.8.0
flatbuffers 23.5.26
fonttools 4.48.1
frozenlist 1.4.1
fsspec 2023.10.0
future 0.18.2
greenlet 3.0.3
huggingface-hub 0.20.2
humanfriendly 10.0
idna 3.4
importlib-resources 6.1.1
Jinja2 3.1.3
joblib 1.3.2
kiwisolver 1.4.5
lightning-utilities 0.10.1
Mako 1.3.2
MarkupSafe 2.1.3
matplotlib 3.8.2
more-itertools 9.0.0
mpmath 1.3.0
multidict 6.0.4
multiprocess 0.70.15
networkx 3.2.1
neural-compressor 2.4.1
numpy 1.23.4
olive-ai 0.4.0
onnx 1.15.0
onnxruntime 1.17.0
opencv-python-headless 4.9.0.80
optimum 1.16.1
optuna 3.5.0
packaging 21.3
pandas 2.1.4
pillow 10.2.0
pip 24.0
prettytable 3.9.0
protobuf 3.20.3
psutil 5.9.7
py-cpuinfo 9.0.0
pyarrow 14.0.2
pyarrow-hotfix 0.6
pycocotools 2.0.7
pycparser 2.21
pydantic 1.10.14
pyparsing 3.0.9
pyreadline3 3.4.1
pyserial 3.5
python-dateutil 2.8.2
python-dotenv 1.0.1
pytz 2023.3.post1
PyYAML 6.0
reedsolo 1.6.0
regex 2022.9.13
requests 2.28.1
safetensors 0.4.1
schema 0.7.5
scikit-learn 1.4.0
scipy 1.12.0
semantic-version 2.10.0
sentencepiece 0.1.99
setuptools 65.5.0
setuptools-rust 1.5.2
six 1.16.0
SQLAlchemy 2.0.25
sympy 1.12
tabulate 0.9.0
threadpoolctl 3.2.0
timm 0.9.12
tokenizers 0.15.0
torch 2.1.2
torchaudio 2.1.2
torchmetrics 0.10.3
torchvision 0.16.2
tqdm 4.64.1
transformers 4.36.2
typing_extensions 4.9.0
tzdata 2023.4
urllib3 1.26.12
wcwidth 0.2.13
whisper 1.1.10
wrapt 1.16.0
xxhash 3.4.1
yarl 1.9.4
zipp 3.17.0

Olive logs
Add logs here.

Other information

  • OS: Win 10 22H2 build 19045.3930
  • Olive version: olive-ai 0.4.0
  • ONNXRuntime package and version: onnxruntime 1.17.0

Additional context
This application has tons of potential; it just needs some debugging to make it more stable and easier to install and get working!


KokinSok commented Feb 9, 2024

import onnx, onnxruntime

model_name = "C:/OpenAI/OutputModels/CandidateModels/cpu-cpu/BestCandidateModel_1/model.onnx"
onnx_model = onnx.load(model_name)
onnx.checker.check_model(onnx_model)
Traceback (most recent call last):
File "", line 1, in
File "C:\Users\User\anaconda3\envs\Onnx\lib\site-packages\onnx\checker.py", line 148, in check_model
C.check_model(protobuf_string, full_check, skip_opset_compatibility_check)
onnx.onnx_cpp2py_export.checker.ValidationError: No opset import for domain ''

==> Context: Bad node spec for node. Name: /Constant OpType: Constant
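For what it's worth, "No opset import for domain ''" usually means the model is missing an opset_import entry for the default (empty-string) ONNX domain. A quick sketch to see which domains the exported graph actually declares, reusing the same model path as above:

import onnx

model = onnx.load("C:/OpenAI/OutputModels/CandidateModels/cpu-cpu/BestCandidateModel_1/model.onnx")

# Each entry pairs an operator domain with the opset version the graph was exported against.
# The default ONNX domain appears as an empty string; ORT contrib ops use "com.microsoft".
for opset in model.opset_import:
    print(repr(opset.domain), opset.version)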


jambayk commented Feb 13, 2024

Hi, thanks for sharing this issue.

What version of ORT are you using for your C# application? Since the model has onnxruntime contrib operators, the version of ORT used to create the model and the version used to run it must be the same. Otherwise, there might be a mismatch between the operator specs, which appears to be the case here judging from the error Unrecognized attribute: unidirectional for operator MultiHeadAttention.

The attribute unidirectional was only added in the most recent release, 1.17.0 (microsoft/onnxruntime#19112).
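In practice that means pinning the same ONNX Runtime release on both sides: the Python environment that runs Olive and the Microsoft.ML.OnnxRuntime package referenced by the C# project. A hedged check for the Python side, using the 1.17.0 release cited above:

import onnxruntime

# The exported model carries contrib-op specs from the ORT build that created it,
# so the runtime loading it should not be older than that build.
expected = "1.17.0"
if not onnxruntime.__version__.startswith(expected):
    raise RuntimeError(
        f"onnxruntime {onnxruntime.__version__} installed, expected {expected}"
    )

On the C# side, the Microsoft.ML.OnnxRuntime NuGet package would then need to be at the matching version as well.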

@KokinSok
Author

Thank you @jambayk! I will try the whole process again and ensure the same version of the ONNX Runtime environment is used.

ONNX Runtime 1.15.0 is where the models run; after that, we get errors in the Verify method. However, Whisper won't run in these later versions. I will do a complete conversion again using the same version, as you suggested, and come back to you.
Thank you!


KokinSok commented Feb 21, 2024

Ok, no, I still have the same problem:

dotnet MLWebApi.dll
Unhandled exception. Microsoft.ML.OnnxRuntime.OnnxRuntimeException: [ErrorCode:Fail] subgraph_whisper_encoder.cc:43 onnxruntime::contrib::transformers::WhisperEncoderSubgraph::Validate expect 2 inputs, got:3
at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer)
at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options)
at MLWebApi.MLServices.AutomaticSpeechRecognition..ctor(String modelPath) in ...\AutomaticSpeechRecognition.cs:line 87
at Program.<Main>$(String[] args) in ...Program.cs:line 45

Line 87:

Session = new InferenceSession(WhisperConfig.ModelPath, sessionOptions);

Pretty darn painful!

This code is running in a Web API, so IIS is running the code in an App Pool. I have updated all the runtimes and so on, and other AI APIs run on the same server. Odd...
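For what it's worth, the "expect 2 inputs, got:3" check comes from validating the encoder subgraph embedded in the BeamSearch node, which again suggests the exported subgraph's signature differs from what the installed ORT expects. A hedged sketch to see how many inputs the exported encoder subgraph actually declares (the attribute name "encoder" is an assumption based on the com.microsoft BeamSearch contrib op):

import onnx

model = onnx.load("C:/OpenAI/OutputModels/CandidateModels/cpu-cpu/BestCandidateModel_1/model.onnx")

# BeamSearch is a com.microsoft contrib op whose encoder/decoder are whole subgraphs
# stored as graph attributes; print the encoder subgraph's declared inputs.
for node in model.graph.node:
    if node.op_type == "BeamSearch":
        for attr in node.attribute:
            if attr.name == "encoder":
                print([i.name for i in attr.g.input])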
