PLM-ICD: Automatic ICD Coding with Pretrained Language Models

Reference

Please cite the following paper:

    @inproceedings{huang-etal-2022-plm,
        title = "{PLM}-{ICD}: Automatic {ICD} Coding with Pretrained Language Models",
        author = "Huang, Chao-Wei and Tsai, Shang-Chi and Chen, Yun-Nung",
        booktitle = "Proceedings of the 4th Clinical Natural Language Processing Workshop",
        month = jul,
        year = "2022",
        address = "Seattle, WA",
        publisher = "Association for Computational Linguistics",
        url = "https://aclanthology.org/2022.clinicalnlp-1.2",
        pages = "10--20",
    }

Requirements

  • Python >= 3.6
  • Install the required Python packages with pip3 install -r requirements.txt
  • If the specific versions cannot be found in your distribution, you can simply remove the version constraints; our code should work with most versions. A minimal install example is shown below.
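
For example, a minimal install (the sed call is only an illustration and assumes each line of requirements.txt has the form package==x.y.z):

pip3 install -r requirements.txt
# If some pinned versions are unavailable in your environment, drop the pins and retry:
sed -i 's/[<>=].*$//' requirements.txt
pip3 install -r requirements.txt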

Dataset

Unfortunately, we are not allowed to redistribute the MIMIC datasets. Please follow the instructions from caml-mimic to preprocess the MIMIC-2 and MIMIC-3 datasets and place the resulting files under data/mimic2 and data/mimic3, respectively. The expected layout is sketched below.
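
Assuming the caml-mimic preprocessing completed, the files referenced by the commands below should end up roughly like this (only the files mentioned in this README are shown; the preprocessing produces additional splits):

data/
    mimic2/
        ALL_CODES.txt
        ...
    mimic3/
        ALL_CODES_50.txt
        train_full.csv
        dev_full.csv
        test_full.csv
        ...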

How to run

Pretrained LMs

Please download the pretrained LMs you want to use from the following links and place them under models/ (a placement sketch follows this list):

  • BioLM: RoBERTa-PM models
  • BioBERT
  • PubMedBERT: you can also set --model_name_or_path microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract when training the model; the script will download the checkpoint automatically.
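
If you downloaded a checkpoint manually (e.g. the BioLM RoBERTa-PM model), place it under models/ at the repository root so the training commands below can find it. A sketch, with a hypothetical archive name:

mkdir -p models
tar -xzf RoBERTa-base-PM-M3-Voc-distill-align-hf.tar.gz -C models/
# The commands below reference this checkpoint as ../models/RoBERTa-base-PM-M3-Voc-distill-align-hf (relative to src/).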

Trained Models

You can also download our trained models to skip the training part. We provide 3 trained models:

Training

  1. cd src
  2. Run the following command to train a model on MIMIC-3 full.
python3 run_icd.py \
    --train_file ../data/mimic3/train_full.csv \
    --validation_file ../data/mimic3/dev_full.csv \
    --max_length 3072 \
    --chunk_size 128 \
    --model_name_or_path ../models/RoBERTa-base-PM-M3-Voc-distill-align-hf \
    --per_device_train_batch_size 1 \
    --gradient_accumulation_steps 8 \
    --per_device_eval_batch_size 1 \
    --num_train_epochs 20 \
    --num_warmup_steps 2000 \
    --output_dir ../models/roberta-mimic3-full \
    --model_type roberta \
    --model_mode laat
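
A quick sanity check on these settings, assuming --max_length and --chunk_size interact as described in the paper (each note is split into fixed-size segments before encoding):

    3072 tokens / 128 tokens per chunk = 24 chunks per document
    1 example per device × 8 gradient accumulation steps = effective batch size of 8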

Notes

  • If you would like to train BERT-based or Longformer-based models, please set --model_type [bert|longformer].
  • If you would like to train models on MIMIC-3 top-50, please set --code_50 --code_file ../data/mimic3/ALL_CODES_50.txt (see the example after this list).
  • If you would like to train models on MIMIC-2, please set --code_file ../data/mimic2/ALL_CODES.txt
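
For example, a MIMIC-3 top-50 training run could look like the following (train_50.csv and dev_50.csv are the split names produced by the caml-mimic preprocessing, and the output directory name is arbitrary):

python3 run_icd.py \
    --train_file ../data/mimic3/train_50.csv \
    --validation_file ../data/mimic3/dev_50.csv \
    --code_50 \
    --code_file ../data/mimic3/ALL_CODES_50.txt \
    --max_length 3072 \
    --chunk_size 128 \
    --model_name_or_path ../models/RoBERTa-base-PM-M3-Voc-distill-align-hf \
    --per_device_train_batch_size 1 \
    --gradient_accumulation_steps 8 \
    --per_device_eval_batch_size 1 \
    --num_train_epochs 20 \
    --num_warmup_steps 2000 \
    --output_dir ../models/roberta-mimic3-50 \
    --model_type roberta \
    --model_mode laat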

Inference

  1. cd src
  2. Run the following command to evaluate a model on the test set of MIMIC-3 full.
python3 run_icd.py \
    --train_file ../data/mimic3/train_full.csv \
    --validation_file ../data/mimic3/test_full.csv \
    --max_length 3072 \
    --chunk_size 128 \
    --model_name_or_path ../models/roberta-mimic3-full \
    --per_device_eval_batch_size 1 \
    --num_train_epochs 0 \
    --output_dir ../models/roberta-mimic3-full \
    --model_type roberta \
    --model_mode laat
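
Similarly, a top-50 model could be evaluated by pointing --validation_file at the top-50 test split and keeping the top-50 code flags (test_50.csv is the caml-mimic split name, and the model path assumes the output directory from the training example above):

python3 run_icd.py \
    --train_file ../data/mimic3/train_50.csv \
    --validation_file ../data/mimic3/test_50.csv \
    --code_50 \
    --code_file ../data/mimic3/ALL_CODES_50.txt \
    --max_length 3072 \
    --chunk_size 128 \
    --model_name_or_path ../models/roberta-mimic3-50 \
    --per_device_eval_batch_size 1 \
    --num_train_epochs 0 \
    --output_dir ../models/roberta-mimic3-50 \
    --model_type roberta \
    --model_mode laat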
