I'd like to run a comparison experiment with the legal-domain ELECTRA pre-trained model released by the HIT–iFLYTEK Joint Laboratory (HFL): legal-ELECTRA-small, Chinese (12-layer, 256-hidden, 4-heads, 12M parameters). How should I modify these two lines of code?

1. `plm, tokenizer, _model_config, WrapperClass = load_plm("bert", "./lawformer")`
2. `model = AutoModelForMaskedLM.from_pretrained('./lawformer')`

I couldn't find an example of this in the documentation. Any help would be appreciated, thanks!
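A possible adaptation is sketched below, not a verified answer. It assumes the checkpoint lives on the Hugging Face Hub under an id like `hfl/chinese-legal-electra-small-generator` (an assumption; substitute your actual local path or model id), and that your OpenPrompt version registers an `"electra"` key in `load_plm` — if it does not, you would need to check OpenPrompt's supported model list and possibly register ELECTRA classes yourself.

```python
# Hedged sketch, assuming OpenPrompt's load_plm and a HFL legal-ELECTRA
# checkpoint; the model id below is an assumption, not from the question.
from openprompt.plms import load_plm
from transformers import AutoModelForMaskedLM

MODEL_PATH = "hfl/chinese-legal-electra-small-generator"  # hypothetical id/path

# 1. Swap the model-type key and the path. Whether "electra" is a supported
#    key depends on the OpenPrompt version; check its load_plm registry.
plm, tokenizer, _model_config, WrapperClass = load_plm("electra", MODEL_PATH)

# 2. AutoModelForMaskedLM resolves the class from the checkpoint's config.
#    Note: for masked-LM use, only the ELECTRA *generator* checkpoint has an
#    MLM head; the discriminator checkpoint does not.
model = AutoModelForMaskedLM.from_pretrained(MODEL_PATH)
```

One caveat on the design: passing `"bert"` as the first argument with an ELECTRA checkpoint would make OpenPrompt try to load the weights into BERT model classes, which generally does not map correctly, so changing only the path is unlikely to be enough.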