Tuning number of topics in LDA K #171
Hello @Enixam! 🙌 I have been moving away from using the topicmodels package in favor of the stm package for topic modeling, for a variety of reasons (speed, ease of use, document-level covariates, etc.), so I'd be more interested in pursuing options in that direction. In 2018 I published this blog post showing how to set up training many models at different values of K, similar to stm's own. You can also see how I covered this material at rstudio::conf in January. So this is possible already, but it does require folks to directly use
So to sum up,
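A minimal sketch of the "train many models at different values of K" approach with stm, in the spirit of the blog post mentioned above (not its exact code). It assumes `documents` and `vocab` have already been produced by `stm::prepDocuments()`, and the candidate K values here are arbitrary examples:

```r
library(stm)
library(purrr)
library(dplyr)

# Fit one stm model per candidate K; `documents` and `vocab` are assumed
# to come from stm::prepDocuments() on your corpus.
many_models <- tibble(K = c(5, 10, 15, 20)) %>%
  mutate(model = map(K, ~ stm(documents, vocab, K = .x, verbose = FALSE)))

# Compare models on semantic coherence and exclusivity, two diagnostics
# stm provides for judging topic quality at each K.
diagnostics <- many_models %>%
  mutate(
    coherence   = map(model, semanticCoherence, documents),
    exclusivity = map(model, exclusivity)
  )
```

Plotting mean coherence against mean exclusivity per K then gives a visual basis for choosing a value of K that balances the two.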
Thanks Julia, as always your posts have helped me a lot!
Hi Julia! I'm a big fan of the tidy text mining book, but it does not seem to put much emphasis on how to tune the number of topics (K) in an LDA model, or on comparing LDA models with different K. I find the ldatuning package quite helpful. Would you be interested in implementing a wrapper or a similar function in the tidytext package?
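For context, a short sketch of how ldatuning is typically used to compare candidate values of K. It assumes `dtm` is a document-term matrix (e.g. built with `tidytext::cast_dtm()`); the K range, seed, and core count are illustrative choices, not recommendations:

```r
library(ldatuning)

# Evaluate several K values against the metrics ldatuning implements.
result <- FindTopicsNumber(
  dtm,                                     # a DocumentTermMatrix
  topics  = seq(2, 20, by = 2),            # candidate numbers of topics
  metrics = c("Griffiths2004", "CaoJuan2009",
              "Arun2010", "Deveaud2014"),
  method  = "Gibbs",
  control = list(seed = 77),               # for reproducibility
  mc.cores = 2L                            # fit models in parallel
)

# Plot the metrics; look for where they level off or reach an extremum.
FindTopicsNumber_plot(result)
```

Griffiths2004 and Deveaud2014 are maximized at a good K, while CaoJuan2009 and Arun2010 are minimized, so the plot is read for agreement among the four curves.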