- [bengio-NPLM] A Neural Probabilistic Language Model
- [Mikolov-word2vec-models] Efficient Estimation of Word Representations in Vector Space
- [Mikolov-word2vec-training for skip-gram] Distributed Representations of Words and Phrases and their Compositionality
- [doc2vec] Distributed Representations of Sentences and Documents
- [ELMo] Deep Contextualized Word Representations
- [GPT] Improving Language Understanding by Generative Pre-Training
- [BERT] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- [GPT2.0] Language Models are Unsupervised Multitask Learners
- [XLNet] XLNet: Generalized Autoregressive Pretraining for Language Understanding
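As a rough illustration of the skip-gram setup described in the word2vec papers listed above, here is a minimal sketch of generating (center, context) training pairs from a token sequence. The toy corpus and window size are illustrative assumptions, not from any of the papers.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs within a symmetric context window,
    as used to build skip-gram training examples."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

# Toy corpus (illustrative only)
corpus = "the quick brown fox".split()
print(skipgram_pairs(corpus, window=1))
```

In the full skip-gram model these pairs feed a classifier that predicts the context word from the center word's embedding; the second word2vec paper above adds negative sampling to make that training efficient.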
-
carlos9310/pre-trained-language-models
A summary of several common pre-trained language models in NLP.