
pre-trained-language-models

  • [bengio-NPLM] A Neural Probabilistic Language Model

  • [Mikolov-word2vec-models] Efficient Estimation of Word Representations in Vector Space

  • [Mikolov-word2vec-training for skip-gram] Distributed Representations of Words and Phrases and their Compositionality

  • [doc2vec] Distributed representations of sentences and documents

  • [ELMO] Deep contextualized word representations

  • [GPT] Improving Language Understanding by Generative Pre-Training

  • [BERT] Pre-training of Deep Bidirectional Transformers for Language Understanding

  • [GPT2.0] Language Models are Unsupervised Multitask Learners

  • [XLNet] Generalized Autoregressive Pretraining for Language Understanding
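The skip-gram training paper above (Distributed Representations of Words and Phrases and their Compositionality) introduces negative sampling for learning word vectors. As a minimal illustrative sketch, not taken from this repository's code, a pure-Python skip-gram with negative sampling on a toy corpus might look like this (corpus, dimension `D`, window size, and learning rate are all arbitrary choices for the example):

```python
import math, random

random.seed(0)

# Toy corpus; a real run would use a large tokenized text stream.
corpus = "the quick brown fox jumps over the lazy dog the fox".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2

# Input (center-word) and output (context-word) embedding tables.
W_in = [[random.uniform(-0.5, 0.5) for _ in range(D)] for _ in range(V)]
W_out = [[0.0] * D for _ in range(V)]

def sigmoid(x):
    # Clamp to avoid math.exp overflow on extreme scores.
    return 1.0 / (1.0 + math.exp(-max(min(x, 20.0), -20.0)))

def train_pair(center, context, lr=0.05, negatives=3):
    # One positive (center, context) pair plus a few random negatives.
    targets = [(context, 1.0)] + [(random.randrange(V), 0.0) for _ in range(negatives)]
    v = W_in[center]
    grad_v = [0.0] * D
    for t, label in targets:
        u = W_out[t]
        score = sigmoid(sum(a * b for a, b in zip(v, u)))
        g = lr * (label - score)  # gradient of the log-sigmoid loss
        for d in range(D):
            grad_v[d] += g * u[d]
            u[d] += g * v[d]
    for d in range(D):
        v[d] += grad_v[d]

# Slide a symmetric context window over the corpus for a few epochs.
for epoch in range(200):
    for i, w in enumerate(corpus):
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if j != i:
                train_pair(idx[w], idx[corpus[j]])

print(len(W_in), len(W_in[0]))  # V embedding vectors of dimension D
```

In practice one would use a library implementation (e.g. gensim's `Word2Vec` with `sg=1`) rather than this hand-rolled loop; the sketch only shows the core update of the paper.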

About

A summary of several common pre-trained language models in NLP.
