Tensorflow compatibility with pyinstaller #66421
Hi @Bhavi-cd,
Thank you!

Venkat6871 added the stat:awaiting response (Status - Awaiting response from author) label on Apr 26, 2024
Hi @Venkat6871

google-ml-butler bot removed the stat:awaiting response (Status - Awaiting response from author) label on Apr 26, 2024
Issue type: Support
Have you reproduced the bug with TensorFlow Nightly? No
Source: source
TensorFlow version: 2.16.1
Custom code: Yes
OS platform and distribution: Windows
Mobile device: No response
Python version: 3.11.9
Bazel version: No response
GCC/compiler version: No response
CUDA/cuDNN version: No response
GPU model and memory: No response
Current behavior?
I have run into a problem running my custom-trained YOLOv8-obb object-detection model from an executable (.exe) built with PyInstaller. The model loads correctly, but when inference is attempted the program halts without producing any output or error message.
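When a script works in the interpreter but stalls silently as a frozen .exe, a common first step is to check whether it is actually running from the PyInstaller bundle and where bundled data files (such as model weights) are being resolved. The sketch below is a generic, hedged diagnostic, not code from this issue; the resource name "model.pt" is a hypothetical placeholder.

```python
# Minimal diagnostic sketch (assumption: generic PyInstaller debugging,
# not taken from the reporter's code). It reports whether the process is
# frozen and resolves a bundled resource path in both dev and .exe modes.
import os
import sys


def resource_path(relative: str) -> str:
    """Resolve a data file both in development and inside a frozen .exe."""
    # PyInstaller one-file builds unpack into a temp dir exposed as _MEIPASS;
    # outside a bundle we fall back to the current working directory.
    base = getattr(sys, "_MEIPASS", os.path.abspath("."))
    return os.path.join(base, relative)


if __name__ == "__main__":
    print("frozen:", getattr(sys, "frozen", False))
    # "model.pt" is a hypothetical weights file name for illustration.
    print("model path:", resource_path("model.pt"))
```

Running this inside the .exe (with `console=True` in the spec so prints are visible) quickly shows whether the model file is being looked up in the wrong location.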
Steps to Reproduce:
Standalone code to reproduce the issue
Relevant log output
124851 WARNING: Failed to collect submodules for 'keras.src.backend.torch' because importing 'keras.src.backend.torch' raised: AttributeError: module 'torch' has no attribute 'float8_e4m3fn'
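The warning above indicates that PyInstaller's hook for Keras tried to collect the optional torch backend (keras.src.backend.torch) and failed because the installed torch build lacks the float8_e4m3fn attribute. Since TensorFlow, not torch, is the backend in use here, one common workaround is to exclude the torch backend from the analysis. The spec excerpt below is a hedged sketch under that assumption; "detect.py" and the hiddenimports entry are placeholders, not values from the issue.

```python
# Hypothetical PyInstaller .spec excerpt (assumption: TF is the only Keras
# backend needed, so the torch backend can be excluded from collection).
a = Analysis(
    ["detect.py"],  # placeholder entry-point script name
    excludes=[
        "torch",                     # skip the torch package entirely
        "keras.src.backend.torch",   # the submodule the failing hook imports
    ],
    hiddenimports=[
        "keras.src.backend.tensorflow",  # assumption: keep the TF backend
    ],
)
```

Rebuilding with such a spec (or passing the equivalent `--exclude-module` flags on the command line) should suppress the failed submodule collection; whether it also fixes the silent hang depends on whether the hang stems from the missing Keras backend modules.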