AMED

Data and code for the paper "Inferring multilingual domain-specific word embeddings from large document corpora"

Pretraining Multilingual models on Wikipedia

The initial training of a general-purpose Word2Vec model on Wikipedia can be performed with the following high-level Python library: wiki-word2vec
