Usage example for the AllenNLP BiDAF pre-trained model (Jupyter Notebook, updated Oct 12, 2018)
Keyphrase or keyword extraction: a Chinese keyphrase-extraction method based on pre-trained models (a Chinese-language implementation of the paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model")
We tackle a company-name recognition task with small-scale, low-quality training data, then apply techniques to improve training speed and prediction performance with minimal manual effort. The methods include lightweight pre-trained models such as ALBERT-small or ELECTRA-small trained on a financial corpus, knowledge distillation an…
The 2020 Beike Zhaofang (贝壳找房) question-answer matching competition
IndoELECTRA: Pre-Trained Language Model for Indonesian Language Understanding
The code of our paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model"
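The SIFRank entries above rank candidate keyphrases by the similarity of their embeddings to the whole document's embedding. A minimal, self-contained sketch of that ranking idea, not SIFRank itself: toy 2-d vectors stand in for real SIF/ELMo embeddings, and `rank_phrases` plus the sample data are hypothetical names for illustration.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def mean_vec(vectors):
    """Element-wise average of a list of vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def rank_phrases(doc_tokens, candidates, embed):
    """Score each candidate phrase by cosine similarity between its
    averaged token embedding and the averaged document embedding,
    then sort best-first."""
    doc_vec = mean_vec([embed[t] for t in doc_tokens])
    scored = [(phrase, cosine(mean_vec([embed[t] for t in phrase]), doc_vec))
              for phrase in candidates]
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Toy 2-d "embeddings" (hypothetical stand-ins for real sentence embeddings).
embed = {"deep": [1.0, 0.1], "learning": [0.9, 0.2],
         "pizza": [0.0, 1.0], "model": [0.8, 0.3]}
doc = ["deep", "learning", "model"]
ranking = rank_phrases(doc, [("deep", "learning"), ("pizza",)], embed)
# The on-topic phrase ("deep", "learning") outranks the unrelated one.
```

Real systems replace the toy lookup table with contextual embeddings and weight tokens by inverse frequency, but the ranking step is the same cosine comparison.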
Calculating FLOPs of Pre-trained Models in NLP
Question Answering Chatbot with DistilRoBERTa Sentence Embeddings, Dialogflow and Ngrok
Super Tickets in Pre-Trained Language Models: From Model Compression to Improving Generalization (ACL 2021)
The source code for the project FormalWriter.com
Code for CascadeBERT, Findings of EMNLP 2021
PLM-based Korean named entity recognition (NER)
Towards Comprehensive Understanding of Bias in Pre-trained Neural Language Models: A Survey with Special Emphasis on Affective Bias
Zero-shot Transfer Learning from English to Arabic
GigaBERT (EMNLP 2020): Arabic relation extraction, named entity recognition, and IE
Code for EMNLP 2021 main conference paper "Dynamic Knowledge Distillation for Pre-trained Language Models"
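Knowledge distillation, which several entries above build on, trains a student to match a teacher's temperature-softened output distribution. A minimal pure-Python sketch of the classic soft-target loss (the function names and example logits are illustrative, not taken from any of the repositories listed):

```python
import math

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over temperature-scaled logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) between temperature-softened distributions,
    scaled by T^2 to keep gradient magnitudes comparable across T."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

teacher = [3.0, 1.0, 0.2]
loss_far = distillation_loss(teacher, [0.1, 2.0, 1.0])   # student disagrees
loss_near = distillation_loss(teacher, [2.9, 1.1, 0.2])  # student nearly matches
# loss_near < loss_far: the loss shrinks as the student approaches the teacher.
```

Dynamic variants like the EMNLP 2021 paper above adjust the temperature or the teacher supervision per example, but the core objective is this softened KL term.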
Code and materials for evaluating PLMs on dialogue response dynamics
A novel method for tuning language models; code and datasets for the paper "GPT Understands, Too"