
Does Microsoft.ML.OnnxRuntimeGenAI.Cuda (version 0.4.0) support Phi-3.5 Vision Onnx format? #943

Open
MaxAkbar opened this issue Sep 28, 2024 · 4 comments

Comments


MaxAkbar commented Sep 28, 2024

Describe the bug
After converting Phi-3.5-vision-instruct to ONNX format, I am unable to load the model with the NuGet package Microsoft.ML.OnnxRuntimeGenAI.Cuda version 0.4.0. When I point it at the folder containing the ONNX model, I get a file-not-found error.

Microsoft.ML.OnnxRuntimeGenAI.OnnxRuntimeGenAIException: 
'Load model from C:\Users\***\source\repos\models\microsoft\Phi-3.5-vision-instruct-to-onnx\phi-3.5-v-128k-instruct-vision-onnx\ 
failed:Load model C:\Users\***\source\repos\models\microsoft\Phi-3.5-vision-instruct-to-onnx\phi-3.5-v-128k-instruct-vision-onnx\ failed. 
File doesn't exist'

To Reproduce
Steps to reproduce the behavior:

  1. Follow the instructions for converting Phi-3.5-vision-instruct to ONNX format.
  2. Create a simple C# console application and load the model:

using Microsoft.ML.OnnxRuntimeGenAI;

string modelPath = @"C:\models\microsoft\Phi-3.5-vision-instruct-onnx";
using Model model = new Model(modelPath);

  3. Run the application; it throws a Microsoft.ML.OnnxRuntimeGenAI.OnnxRuntimeGenAIException.
  4. See the error in the description above.
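Before loading, it can help to verify that the model folder actually contains what ONNX Runtime GenAI expects: one or more .onnx files plus the genai_config.json that the GenAI API reads from the model folder. A minimal Python sketch (the helper name and example path are illustrative, not part of the original report):

```python
import os

def check_model_folder(model_path):
    """List the .onnx files in a model folder and whether the
    genai_config.json that ONNX Runtime GenAI loads is present."""
    entries = sorted(os.listdir(model_path))
    onnx_files = [e for e in entries if e.endswith(".onnx")]
    has_config = "genai_config.json" in entries
    return onnx_files, has_config

# Example (path is illustrative):
# onnx_files, has_config = check_model_folder(
#     r"C:\models\microsoft\Phi-3.5-vision-instruct-onnx")
```

If `has_config` is false or the list of .onnx files is empty, the load error is expected regardless of the package version.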

Expected behavior
The expected behavior is to have the model loaded and be able to run inference.

Desktop (please complete the following information):

  • OS: Windows 11 Pro
  • Build: Version 10.0.26120 (Build 26120)
  • Browser: Edge

Additional context
I have converted Phi-3.5-mini-instruct to Phi-3.5-mini-instruct-cuda-fp32-onnx and am able to run it without any issues.

@MaxAkbar MaxAkbar changed the title Does Microsoft.ML.OnnxRuntimeGenAI.Cuda support Phi-3.5 Onnx format? Does Microsoft.ML.OnnxRuntimeGenAI.Cuda (version 0.4.0) support Phi-3.5 Onnx format? Sep 28, 2024
@MaxAkbar MaxAkbar changed the title Does Microsoft.ML.OnnxRuntimeGenAI.Cuda (version 0.4.0) support Phi-3.5 Onnx format? Does Microsoft.ML.OnnxRuntimeGenAI.Cuda (version 0.4.0) support Phi-3.5 Vision Onnx format? Sep 28, 2024
kunal-vaishnavi (Contributor) commented

The Phi-3 vision and Phi-3.5 vision models are split into three separate ONNX models: a vision component, an embedding component, and a text component. The build.py file in the instructions you linked should create all three components for you.

According to your error, the vision component cannot be found. Can you check your modelPath folder to see if you have any subfolders named vision_init_export, vision_after_export, or vision_after_opt? It's possible that something failed during the export --> optimize --> quantize process for creating the vision component. If the process failed at any point, then the latest vision component is temporarily saved in one of those subfolders before it is finally saved in modelPath. You may need to delete the modelPath folder and then re-run the build.py file with the latest ONNX Runtime version installed so that the process does not fail.
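The check described above can be scripted; a sketch using the subfolder names from this comment (the function name is mine, and the model path is whatever you passed to the Model constructor):

```python
import os

# Intermediate subfolders created during the export --> optimize --> quantize
# process for the vision component, per the comment above.
LEFTOVER_DIRS = ["vision_init_export", "vision_after_export", "vision_after_opt"]

def find_leftover_stages(model_path):
    """Return any intermediate build subfolders left behind, which
    indicate the vision-component build did not run to completion."""
    return [d for d in LEFTOVER_DIRS
            if os.path.isdir(os.path.join(model_path, d))]
```

A non-empty result means the build.py run failed partway through and the folder should be deleted and rebuilt.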

Please note that re-designed ONNX models for Phi-3 vision and Phi-3.5 vision will be published to enable multi-image support.

natke (Contributor) commented Oct 3, 2024

Hi @MaxAkbar, did you check your model directory for the files that Kunal described above?

MaxAkbar (Author) commented Oct 3, 2024

I just noticed that the file sizes are way too small, so something failed during the conversion :(. Has anyone been able to convert the vision model to ONNX format? I looked at the output, but nothing jumped out at me as an error.
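Undersized files are a quick way to confirm a failed export: successfully exported Phi-3.5 vision components should each be far larger than a few kilobytes. A sketch that flags suspiciously small .onnx files (the function name and the 1 MB threshold are assumptions, not values from the build tooling):

```python
import os

def undersized_onnx_files(model_path, min_bytes=1_000_000):
    """Report .onnx files smaller than min_bytes; genuinely exported
    model components should be well above this threshold."""
    small = []
    for name in os.listdir(model_path):
        if name.endswith(".onnx"):
            size = os.path.getsize(os.path.join(model_path, name))
            if size < min_bytes:
                small.append((name, size))
    return small
```

Any file this returns is almost certainly a truncated artifact of a failed export rather than a usable model component.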

I had a thread here about how to convert to onnx: microsoft/Phi-3CookBook#187

natke (Contributor) commented Oct 7, 2024

Thank you @MaxAkbar, would you be able to attach the output from the build.py script here, so that we can parse for errors?
