loss rise after 6k #12
In the training config, `phoneme_level_encoder_step=60000`. This means that before 60000 steps, no gradient is applied to the `phoneme_level_predictor` and `phone_level_loss` is not added to `total_loss`; after 60000 steps, gradients are applied to the `phoneme_level_predictor` and `phone_level_loss` is added to `total_loss`. Hence the total loss rises from step 60000.
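The gating described above can be sketched as follows. This is a minimal illustration, not the repository's actual training loop; the helper name `compute_total_loss` and the loss arguments are hypothetical, with only `phoneme_level_encoder_step`, `phone_level_loss`, and `total_loss` taken from the discussion.

```python
def compute_total_loss(step, base_loss, phone_level_loss,
                       phoneme_level_encoder_step=60000):
    """Hypothetical sketch: gate the phoneme-level loss by global step.

    Before `phoneme_level_encoder_step`, the phoneme-level branch
    contributes nothing, so it receives no gradient and its loss is
    excluded from the objective.
    """
    if step < phoneme_level_encoder_step:
        # Phoneme-level predictor is effectively frozen: its loss is
        # not counted, so no gradient flows into it from total_loss.
        return base_loss
    # From this step on, phone_level_loss joins the objective, which is
    # why the reported total loss jumps upward at the threshold.
    return base_loss + phone_level_loss
```

Because the extra term is switched on abruptly, a step increase in `total_loss` at the threshold is expected and is not by itself a sign of divergence.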
Hello, when I use the AISHELL dataset, synthesis is very poor: the model fails to read the whole sentence and utters only a few syllables. (However, I previously trained on the Vivos Vietnamese dataset and got a fairly good result. Although I don't understand Vietnamese well, the output was at least fluent, so I think the code should be okay.) I noticed that your `total_loss` began to rise after 8k steps, and my model shows a similar problem at 6k steps. Besides, my `phone_level_loss` keeps oscillating. Do you know the probable cause?
![image](https://user-images.githubusercontent.com/118953161/226503370-2d83cc40-474a-42d9-81e9-1f6ba48ca4df.png)
![image](https://user-images.githubusercontent.com/118953161/226503398-4adeb494-5e95-4032-b4a7-4db32273189e.png)