Training custom classifier #65

Open
FOX111 opened this issue Apr 16, 2020 · 1 comment

Comments

FOX111 commented Apr 16, 2020

Hello, I've seen the code on the front page for training a language model:

```python
from fastai.text import *
import multifit

exp = multifit.from_pretrained("name of the model")
fa_config =  exp.pretrain_lm.tokenizer.get_fastai_config(add_open_file_processor=True)
data_lm = (TextList.from_folder(imdb_path, **fa_config)
            .filter_by_folder(include=['train', 'test', 'unsup']) 
            .split_by_rand_pct(0.1)
            .label_for_lm()           
            .databunch(bs=bs))
learn = exp.finetune_lm.get_learner(data_lm)  
# learn is a preconfigured fastai learner with a pretrained model loaded
learn.fit_one_cycle(10)
learn.save_encoder("enc")
...
```

I would like to ask how I can then train my own classifier on top of this model. All the guidelines described at https://docs.fast.ai/text.html assume the AWD-LSTM architecture, so they will not work with a MultiFiT language model as the encoder. Something along the lines of the sketch below is what I have in mind.
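Here is a minimal, untested sketch of the classifier stage I am picturing. The `data_clas` construction follows the standard fastai v1 IMDB pattern with the MultiFiT tokenizer config passed through (whether `vocab=data_lm.vocab` is still needed on top of `fa_config` is my guess); `exp.classifier.get_learner` is only an assumption by analogy with `exp.finetune_lm.get_learner`, and I am assuming `load_encoder` can pick up the `"enc"` file saved above.

```python
# Untested sketch -- exp.classifier.get_learner is assumed by analogy with
# exp.finetune_lm.get_learner; it is not something I found documented.
data_clas = (TextList.from_folder(imdb_path, vocab=data_lm.vocab, **fa_config)
             .split_by_folder(valid='test')
             .label_from_folder(classes=['neg', 'pos'])
             .databunch(bs=bs))

learn_clas = exp.classifier.get_learner(data_clas)  # assumed helper
learn_clas.load_encoder("enc")                      # encoder saved from the LM learner
learn_clas.fit_one_cycle(5)
```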

Thanks

ghost commented Jun 8, 2021

Hi @FOX111, did you manage to solve this or find a workaround?
