This repository uses pretrained BERT embeddings for transfer learning in the QA domain (Jupyter Notebook; updated Dec 18, 2018).
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
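The pre-training objective referenced above centers on masked language modeling: roughly 15% of input positions are selected, and of those 80% are replaced with `[MASK]`, 10% with a random vocabulary token, and 10% left unchanged. A minimal, self-contained sketch of that corruption step (function and token names here are illustrative, not from any listed repository):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style masked-LM corruption.

    Selects ~mask_prob of positions; of those, 80% become "[MASK]",
    10% a random vocab token, 10% stay unchanged. Returns the corrupted
    sequence and a {position: original_token} dict used as labels.
    """
    rng = random.Random(seed)
    out, labels = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok            # the model must predict the original
            r = rng.random()
            if r < 0.8:
                out[i] = "[MASK]"      # 80%: replace with the mask token
            elif r < 0.9:
                out[i] = rng.choice(vocab)  # 10%: replace with a random token
            # else 10%: keep the original token (position is still predicted)
    return out, labels

corrupted, labels = mask_tokens(["the", "cat", "sat", "on", "the", "mat"],
                                vocab=["dog", "ran", "hat"])
```

Only positions recorded in `labels` contribute to the pre-training loss; unselected positions are ignored by the objective.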
Temporary repository used for collaboration on applying BERT to multiple languages.
PyTorch port of BERT ML model
OpenCV and NLTK used together with TensorFlow.
A short overview of the most popular models for Named Entity Recognition.
Generating English Rock lyrics using BERT
Code related to jigsaw-unintended-bias-in-toxicity-classification based on https://github.com/huggingface/pytorch-pretrained-BERT/
Implementation of BERT that could load official pre-trained models for feature extraction and prediction
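Loading pre-trained BERT models for feature extraction starts with WordPiece tokenization: each word is split by greedy longest-match-first lookup against the model's vocabulary, with continuation pieces prefixed by `##`. A minimal sketch of that algorithm using a toy vocabulary (the vocabulary and function name are illustrative, not tied to any listed implementation):

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]", max_chars=100):
    """Greedy longest-match-first WordPiece split of a single word."""
    if len(word) > max_chars:
        return [unk]
    tokens, start = [], 0
    while start < len(word):
        end, cur = len(word), None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation pieces carry the ## prefix
            if piece in vocab:
                cur = piece           # longest matching piece from this start
                break
            end -= 1
        if cur is None:
            return [unk]              # no piece matched: whole word is unknown
        tokens.append(cur)
        start = end
    return tokens

toy_vocab = {"un", "##aff", "##able"}
print(wordpiece_tokenize("unaffable", toy_vocab))  # ['un', '##aff', '##able']
```

Real tokenizers also perform basic whitespace/punctuation splitting and lowercasing (for uncased models) before this per-word step.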
Token and Sentence Level Classification with Google's BERT (TensorFlow)
Multiple-Relations-Extraction-Only-Look-Once: look at the sentence once and extract all entity pairs and their corresponding relations. An end-to-end joint multi-relation extraction model, applicable to the http://lic2019.ccf.org.cn/kg information extraction task.
Bidirectional Encoder Representations from Transformers (BERT) transfer learning for named entity recognition and de-identification of sensitive data
Text classification with BERT, geared toward industrial use.
BERT Base Uncased used for multi-class sentiment analysis, via Hugging Face's PyTorch implementation of BERT.
Google BERT implementation on pytorch-template
TensorFlow code for a lite BERT reimplementation.
Trying to adapt BERT for images