Abstractive-Text-Summarization-Using-RNN-and-Transformers

This repository contains implementations of abstractive text summarization using Recurrent Neural Networks (RNNs) with Gated Recurrent Units (GRUs), RNNs with Reinforcement Learning, and Transformer architectures. The project explores and compares how different neural network models perform at generating concise, coherent summaries from long text.

Repository Structure

  • Transformer/Custom_Transformer_Model.ipynb: Jupyter notebook with a custom Transformer model tailored for text summarization.
  • Transformer/T5_implementation.ipynb: Application of the pre-trained T5 model to the text summarization task.
  • Transformer/Bart_Implementation.ipynb: Application of the pre-trained BART model to the text summarization task.
  • Transformer/BART_T5_GRAPH.ipynb: Plots graphs from the output logs generated by the BART and T5 implementations.
  • RNN/RNN_Model.ipynb: Demonstrates abstractive text summarization using a GRU-based RNN.
  • RNN/RL_RNN_Implementation.ipynb: Implementation of a hybrid RNN model trained with reinforcement learning.
  • README.md: Provides an overview and instructions for setting up and running the models.

Models Overview

  1. Custom Transformer Model

    • Implements self-attention and multi-head attention mechanisms to assess the importance of each word in the text, aiming to capture complex word relationships more effectively than RNNs (see the attention sketch after this list).
  2. T5 Model

    • Utilizes a pre-trained Transformer model fine-tuned for summarization, benefiting from transfer learning to enhance summary quality efficiently.
  3. BART Model

    • Implements a denoising autoencoder using a bidirectional encoder to capture rich text features and a left-to-right autoregressive decoder for generating coherent text summaries.
  4. RNN Model

    • Uses GRUs to sequentially process text data and generate summaries, focusing on capturing temporal text dependencies.
  5. RNN with RL

    • Integrates reinforcement learning with a recurrent neural network, optimizing the summarization process through reward-based learning to improve the relevance of generated summaries.
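
For intuition, here is a minimal, illustrative PyTorch sketch of the scaled dot-product attention underlying the custom Transformer. This is a simplified stand-in, not the code from Custom_Transformer_Model.ipynb:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attend over values v using queries q and keys k."""
    d_k = q.size(-1)
    # Similarity between every query and every key, scaled by sqrt(d_k)
    scores = torch.matmul(q, k.transpose(-2, -1)) / d_k ** 0.5
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)  # one distribution per query position
    return torch.matmul(weights, v)      # weighted sum of values
```

Multi-head attention runs several such attention functions in parallel over learned projections of q, k, and v, then concatenates the results.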

Setup and Usage

To set up this project locally, run the following commands:

git clone [email protected]:Navya0203/Abstractive-Text-Summarization-Using-RNN-and-Transformers.git
cd Abstractive-Text-Summarization-Using-RNN-and-Transformers

Setting Up the Data

Prepare the dataset:
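
The notebooks expect a summarization corpus. As an illustrative assumption only (not necessarily the dataset used here), a typical loading step with the Hugging Face datasets library looks like:

```python
from datasets import load_dataset

# Hypothetical choice of corpus: swap in whatever dataset the notebooks expect.
dataset = load_dataset("cnn_dailymail", "3.0.0")
print(dataset["train"][0]["article"][:200])  # source document
print(dataset["train"][0]["highlights"])     # reference summary
```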

Running the Notebooks

Navigate to the RNN directory for its implementations:

cd RNN

You will find two notebooks with the RNN implementations: RNN_Model.ipynb and RL_RNN_Implementation.ipynb.
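
As a rough illustration of the GRU-based approach in RNN_Model.ipynb, here is a minimal PyTorch sketch with assumed vocabulary and hidden sizes; it is not the notebook's actual code:

```python
import torch
import torch.nn as nn

VOCAB, EMB, HID = 10_000, 128, 256  # assumed sizes for illustration

class GRUSummarizer(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src, tgt):
        # Encode the source article; the final hidden state seeds the decoder.
        _, h = self.encoder(self.embed(src))
        # Teacher-forced decoding over the (shifted) target summary.
        dec_out, _ = self.decoder(self.embed(tgt), h)
        return self.out(dec_out)  # per-step vocabulary logits

model = GRUSummarizer()
logits = model(torch.randint(0, VOCAB, (2, 50)), torch.randint(0, VOCAB, (2, 12)))
print(logits.shape)  # torch.Size([2, 12, 10000])
```

The RL variant typically augments a decoder like this with a sequence-level reward (for example, a ROUGE score on the sampled summary) optimized via policy gradients.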

Navigate to the Transformer directory for its implementations:

cd Transformer

You will find four notebooks: three Transformer implementations (Custom_Transformer_Model.ipynb, T5_implementation.ipynb, and Bart_Implementation.ipynb), plus BART_T5_GRAPH.ipynb, which plots graphs from the output logs produced by running the BART and T5 notebooks.
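
For a quick sense of what the pre-trained models do, here is a minimal Hugging Face transformers sketch. It is illustrative only: the notebooks fine-tune rather than just run inference, and the checkpoint names below are common defaults, not necessarily the ones used here:

```python
from transformers import pipeline

# Common pre-trained checkpoints; the notebooks may use different ones.
for name in ("t5-small", "facebook/bart-large-cnn"):
    summarizer = pipeline("summarization", model=name)
    result = summarizer("Long input article text goes here ...",
                        max_length=60, min_length=10)
    print(name, "->", result[0]["summary_text"])
```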

Checkpoints

As the model checkpoints are about 3 GB, we could not add them to the repo. They can be accessed at the links below:
https://drive.google.com/drive/folders/18zteggWePbrnLoqO1J_YTrO7ABYEycwy?usp=sharing
https://drive.google.com/drive/u/0/folders/1V8XbXvDASKm8ewP4pdcSTeHBas43HU3p
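
If the checkpoints are PyTorch state dicts (an assumption; adjust to however the notebooks save them), loading one after downloading looks like:

```python
import torch

# Hypothetical filename: use the actual file downloaded from the Drive links.
state = torch.load("checkpoints/rnn_model.pt", map_location="cpu")
model = GRUSummarizer()        # from the sketch above; must match the saved architecture
model.load_state_dict(state)
model.eval()
```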
