
PROGrasp: Pragmatic Human-Robot Communication for Object Grasping

Gi-Cheon Kang, Junghyun Kim, Jaein Kim, Byoung-Tak Zhang

ICRA 2024 (Paper)

This repository contains the PyTorch implementation of the ICRA'24 paper, "PROGrasp: Pragmatic Human-Robot Communication for Object Grasping".

Demo Video

demo.mp4

Setup and Dependencies

The source code is based on PyTorch v1.9.1+, CUDA 11+, and cuDNN 7+. Anaconda or Miniconda is recommended for setting up this codebase:

  1. Install the Anaconda or Miniconda distribution (Python 3.7+) from the official downloads page.
  2. Clone this repository and create an environment:
git clone https://github.com/gicheonkang/prograsp
conda create -n prograsp python=3.7.16 -y

# activate the environment and install all dependencies
conda activate prograsp
pip install torch==1.9.1+cu111 torchvision==0.10.1+cu111 torchaudio==0.9.1 -f https://download.pytorch.org/whl/torch_stable.html
pip install -r requirements.txt

If you have trouble installing the dependencies above, please consult the OFA repository, which contains detailed installation guidance.
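Once installed, a quick sanity check can confirm that the expected PyTorch build and GPU are visible. This is a minimal sketch, not part of the repository:

# verify_env.py -- minimal environment sanity check (illustrative, not part of the repo)
import torch

print(torch.__version__)          # expected: 1.9.1+cu111
print(torch.cuda.is_available())  # True if the CUDA 11.1 build can see a GPU
print(torch.version.cuda)         # expected: 11.1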

Download Data

Download the preprocessed and raw data by running the following script:

chmod +x scripts/download_data.sh
./scripts/download_data.sh
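After the script finishes, you can verify the download with a short check. The sketch below assumes the script unpacks into a data/ directory at the repository root; adjust the path if your layout differs:

# check_data.py -- hypothetical sanity check; the data/ path is an assumption
from pathlib import Path

data_dir = Path("data")  # adjust if download_data.sh unpacks elsewhere
assert data_dir.exists(), "run scripts/download_data.sh first"
for p in sorted(data_dir.iterdir()):
    print(p, f"{p.stat().st_size / 1e6:.1f} MB" if p.is_file() else "(dir)")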

Train

Run the following script to train the visual grounding module:

chmod +x OFA/run_scripts/prograsp/train_progrounding.sh
./OFA/run_scripts/prograsp/train_progrounding.sh

The data loaders for each module live in OFA/data/mm_data/, and OFA/utils/eval_utils.py contains the evaluation code. For reference, the standard visual-grounding metric is sketched below.
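Visual grounding is typically scored by whether the predicted box overlaps the ground-truth box with IoU of at least 0.5. The sketch below illustrates that metric in isolation; it is not the repository's exact evaluation code:

# iou_metric.py -- illustrative IoU@0.5 check, not the repo's exact evaluation code
import torch

def box_iou(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """IoU between two boxes in (x1, y1, x2, y2) format."""
    x1, y1 = torch.max(a[0], b[0]), torch.max(a[1], b[1])
    x2, y2 = torch.min(a[2], b[2]), torch.min(a[3], b[3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred = torch.tensor([10., 10., 60., 60.])
gt = torch.tensor([15., 12., 65., 58.])
print(box_iou(pred, gt) >= 0.5)  # counts as a correct grounding at IoU@0.5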

Pre-trained Checkpoints

Please download the checkpoints below.

Model                   Link
Visual Grounding        Download
Question Generation     Download
Answer Interpretation   Download
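After downloading, you can inspect a checkpoint before wiring it into the pipeline. The sketch below assumes a local file name such as visual_grounding.pt; the actual file names depend on the download links above:

# inspect_ckpt.py -- illustrative; the file name is an assumption
import torch

ckpt = torch.load("visual_grounding.pt", map_location="cpu")  # load on CPU, no GPU required
print(type(ckpt))
if isinstance(ckpt, dict):
    print(list(ckpt.keys()))  # fairseq-style checkpoints usually expose 'model', 'args', etc.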

Inference & Evaluation

We provide evaluation and inference code for interactive object discovery. Please see the following Jupyter notebook:

OFA/prograsp_eval.ipynb
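Conceptually, PROGrasp's interactive discovery alternates between grounding candidate objects, asking a clarifying question when the instruction is ambiguous, and interpreting the user's answer. The sketch below only illustrates that control flow; the function names (ground, generate_question, interpret_answer, ask_user) are hypothetical placeholders, not the notebook's actual API:

# dialog_loop.py -- conceptual control flow only; all function names are hypothetical
def interactive_object_discovery(image, instruction, ground, generate_question,
                                 interpret_answer, ask_user, max_turns=5):
    """Iteratively narrow down candidate objects until one target remains."""
    candidates = ground(image, instruction)               # visual grounding: candidate boxes
    for _ in range(max_turns):
        if len(candidates) <= 1:
            break                                         # unambiguous: done
        question = generate_question(image, candidates)   # question generation module
        answer = ask_user(question)                       # human-in-the-loop reply
        candidates = interpret_answer(candidates, answer) # answer interpretation module
    return candidates[0] if candidates else None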

Citation

If you use this code or preprocessed data in your research, please consider citing:

@article{kang2023prograsp,
  title={PROGrasp: Pragmatic Human-Robot Communication for Object Grasping},
  author={Kang, Gi-Cheon and Kim, Junghyun and Kim, Jaein and Zhang, Byoung-Tak},
  journal={arXiv preprint arXiv:2309.07759},
  year={2023}
}

Acknowledgements

Our code builds on OFA as reference code. Thanks!

License

MIT License
