
Conversion of some models is buggy #939

Open
KokinSok opened this issue Feb 13, 2024 · 0 comments

Comments


KokinSok commented Feb 13, 2024

Describe the bug
When converting the Hugging Face OpenAI Whisper models to ONNX with the Olive configuration, I run into a few different problems.

Partly this depends on which version of ONNX Runtime I use: with current versions (1.15.0 onwards), the models give different errors depending on how I load them.

To Reproduce
Convert the Whisper Model using Olive
set model="openai/whisper-large.en"
set config="whisper_cpu_int8.json"
python prepare_whisper_configs.py --model_name %model%
python -m olive.workflows.run --config %config% --setup
python -m olive.workflows.run --config %config%

Then try to load the model:
// Verbatim (@) strings are needed here: "C:\Your\Path\Goes\Here\" is not a
// valid C# literal because the trailing backslash escapes the closing quote.
string path = @"C:\Your\Path\Goes\Here\";
string ModelPath = Path.Combine(path, @"Models\Whisper.Small.en\model.onnx");
Function Model = Function.Load(ModelPath, DeviceDescriptor.CPUDevice, ModelFormat.ONNX);
WhisperConfig Config = new WhisperConfig(ModelPath);
var Inputs = WhisperConfig.BuildWhisperInput(Path.Combine(path, @"Audio\sampleaudio.wav"));

Gives the Error:

System.ApplicationException: Failed to load model: 'At top level graph without matching NodeArg that subgraph consumes. Name=s_d_decoder.model.decoder.embed_tokens.weight_quantized Graph may not conform to the ONNX spec and contain initializers that are not graph inputs.'

[CALL STACK]
> CNTK::TrainingParameterSchedule::GetMinibatchSize
- CNTK::XavierInitializer
- CNTK::Function::Load
- CSharp_CNTK_Function__Load__SWIG_0
- 00007FF7AAEF2975 (SymFromAddr() error: The specified module could not be found.)

Or, using the example code found here and loading the model with ONNX Runtime, I get this error:

at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus) in Microsoft.ML.OnnxRuntime\NativeApiStatus.cs:line 23
at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer) in Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 595
at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options) in Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 124
at MLWebApi.MLServices.AutomaticSpeechRecognition..ctor(String modelPath) in C:\Users\User\Desktop\C# Code\OnnxConsoleApp\MLWebApi\MLServices\AutomaticSpeechRecognition.cs:line 88
at Program.<Main>$(String[] args) in C:...

Sometimes, using different versions of the Runtime lib, I get this error:

[ErrorCode:Fail] subgraph_whisper_encoder.cc:43 onnxruntime::contrib::transformers::WhisperEncoderSubgraph::Validate expect 2 inputs, got:3

at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus)
at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer) in Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 595
at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options) in Microsoft.ML.OnnxRuntime\InferenceSession.cs:line 124
at MLWebApi.MLServices.AutomaticSpeechRecognition..ctor(String modelPath) in C:\Users\User\Desktop\C# Code\OnnxConsoleApp\MLWebApi\MLServices\AutomaticSpeechRecognition.cs:line 87
at Program.<Main>$(String[] args) in C:...

Expected behavior
Would be fantastic for this to work!

Olive config
Standard, out of the box config with new env and all requirements.txt installed.

Olive logs
No errors were reported.

Other information

  • OS: Windows 10
  • Olive version: 0.5.0 or main
  • ONNXRuntime package and version: CPU package; several versions from 1.15.0 onwards tried

This is a fantastic tool, it is very useful! Once a few bugs are ironed out, it will be even better! Thank You All!
