
loss rise after 6k #12

Open
tuntun990606 opened this issue Mar 21, 2023 · 3 comments

Comments

@tuntun990606

tuntun990606 commented Mar 21, 2023

Hello, when I train on the AIShell dataset the synthesis is very poor: the model cannot read through the whole sentence and only utters a few syllables. (However, I previously trained on the Vivos Vietnamese dataset and got a fairly good result. Although I don't understand Vietnamese well, the output was at least fluent, so I think the code should be fine.) I noticed that your total_loss began to rise after 8k steps, and my model has a similar problem at 6k steps. Besides, my phone_level_loss keeps oscillating. Do you know the probable cause?

[screenshots: total_loss and phone_level_loss training curves]

@tuanh123789
Owner

In the train config, "phoneme_level_encoder_step=60000". This means that before 60000 steps no gradient is applied to "phoneme_level_predictor" and "phone_level_loss" is not added to "total_loss"; after 60000 steps the gradient is applied to "phoneme_level_predictor" and "phone_level_loss" is added to "total_loss" (hence the total loss rises from 60000).
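A minimal sketch of that step-gated loss, assuming a PyTorch-style training loop; the names `compute_total_loss`, `mel_loss`, and `duration_loss` are hypothetical, only `phoneme_level_encoder_step`, `phone_level_loss`, and the gating behaviour come from the comment above (this is not the repo's exact code):

```python
import torch

def compute_total_loss(mel_loss: torch.Tensor,
                       duration_loss: torch.Tensor,
                       phone_level_loss: torch.Tensor,
                       step: int,
                       phoneme_level_encoder_step: int = 60000) -> torch.Tensor:
    # Before the gating step, phone_level_loss is left out of total_loss,
    # so no gradient reaches phoneme_level_predictor through it.
    total_loss = mel_loss + duration_loss
    if step >= phoneme_level_encoder_step:
        # After the gating step, the phone-level term joins total_loss,
        # which is why the reported total loss jumps at that point.
        total_loss = total_loss + phone_level_loss
    return total_loss
```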

@tuanh123789
Owner

You can check the loss function here
[screenshot of the loss function code]

@tuntun990606
Author

> You can check the loss function here

Thank you, I will check it.
