# pre-trained-language-models

Here are 68 public repositories matching this topic...

We tackle a company-name recognition task with small-scale, low-quality training data, then apply techniques to speed up model training and improve prediction performance with minimal manual effort. The methods we use involve lightweight pre-trained models such as Albert-small or Electra-small with a financial corpus, knowledge distillation an…

  • Updated Aug 10, 2020
  • Python
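The description above mentions knowledge distillation as one of the training techniques. As a minimal sketch of the core idea (soft-target distillation with a temperature parameter, following the standard Hinton-style formulation; the function names and the temperature value are illustrative assumptions, not taken from this repository), the distillation loss can be written with NumPy as:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence from the teacher's soft targets to the student's,
    # scaled by T^2 so gradient magnitudes stay comparable across T.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)
```

In practice this loss is combined with the ordinary cross-entropy on the hard labels, so the small student model (e.g. an Albert-small or Electra-small variant) learns from both the gold data and the teacher's output distribution.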
