
talking-regressor

This is a repo for figuring out ways to generate automatic text summaries of Auto-ML models that perform regression (and yield a compressed, factorized latent representation). Ideally the tool should even be able to answer questions about the properties of such a model.

I think it would be nice to have a tool that takes some statistical data from you and then fits a regression that yields a probabilistic model with a disentangled latent representation (via Beta-TCVAE, plain Factor Analysis, or even a GAN, though I'm not aware of the state of Auto-ML for GANs). Then comes the desired "magic" part: first the tool asks you some questions about the obtained disentangled representation, and then it can write an automatic report on the nature of the probabilistic model and even answer questions about it.
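
A minimal sketch of the non-"magic" first half of that pipeline might look as follows. The choice of scikit-learn Factor Analysis, the variable names, and the synthetic data are my assumptions for illustration, not anything prescribed by this repo:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                            # stand-in for the user's statistical data
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=500)  # stand-in regression target

# 1. Compress / factorize: learn a latent representation of the data.
fa = FactorAnalysis(n_components=3, random_state=0)
Z = fa.fit_transform(X)             # latent codes, shape (500, 3)

# 2. Regress the target on the latent factors.
reg = LinearRegression().fit(Z, y)

# 3. Raw material for an automatic text summary:
#    which features load on which factor, and how each factor affects the target.
print("factor loadings (features x factors):\n", np.round(fa.components_.T, 2))
print("effect of each latent factor on the target:", np.round(reg.coef_, 2))
```

The interesting open part is everything after step 3: turning loadings and coefficients into readable text and answers.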

At the moment my expertise in this area is limited to the models implemented in the jats-semi-supervised-pytorch project. That's certainly not enough. So I will return to this problem from time to time and write down whatever I find relevant.

May be useful:

  • Study Shows Transformers Possess the Compositionality Power for Mathematical Reasoning
  • Compositional Processing Emerges in Neural Networks Solving Math Problems
  • This may be useful because we can automatically generate math formulas from the learned model: after all, we know the mathematical definition of the model from Auto-ML. Then we can try to convert the math sequence into a text sequence using a transformer.
  • But we still don't have a way to get math answers to math-formulated questions about the model (assuming we translated a natural-language question into a math question via a transformer)...
  • That is only needed if we want to support general questions. Instead, we can create a big list of allowed hardcoded questions and provide a math formulation for each of them. Then I guess we can extract the answers to these questions from the model. This way we don't really need a transformer model at all (or we only use it for something simple like style, extracting names and terms, or reducing general questions to the hardcoded ones); see the sketch after this list.
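
The hardcoded-questions idea could look roughly like the sketch below. It continues the Factor Analysis example above; the question list, the `answer` helper, and the `fa`/`reg` objects are hypothetical illustrations, not part of this repo:

```python
import numpy as np

# Each allowed question is paired with its "math formulation": a function that
# computes the answer directly from the fitted model (fa, reg from the sketch above).
QUESTIONS = {
    "Which latent factor influences the target the most?":
        lambda fa, reg: f"factor {int(np.argmax(np.abs(reg.coef_)))}",
    "Which observed feature loads most strongly on factor 0?":
        lambda fa, reg: f"feature {int(np.argmax(np.abs(fa.components_[0])))}",
    "How many latent factors does the model use?":
        lambda fa, reg: str(fa.components_.shape[0]),
}

def answer(question: str, fa, reg) -> str:
    # Anything outside the allowed list is rejected; later, a transformer could be
    # used only to map a free-form question onto one of the hardcoded ones.
    if question not in QUESTIONS:
        return "Sorry, this question is not in the list of supported questions."
    return QUESTIONS[question](fa, reg)

for q in QUESTIONS:
    print(q, "->", answer(q, fa, reg))
```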
