Use GPT-2 for Text-generation
Updated
Apr 22, 2023 - Jupyter Notebook
Character Embeddings Recurrent Neural Network Text Generation Models
Generates text in the style of Arthur Conan Doyle's Sherlock Canon using deep learning.
Build a Text Generator using LSTM and Keras in Python
Probabilistic Noising of Natural Language
A collection of TensorFlow implementations of different attention mechanisms for text-related tasks.
Mimicking literary styles using text generation with neural networks.
Learning repo for PyTorch and basic neural networks.
A chatbot builder that uses GPT-J to create chatbots that will hold a conversation with you if you feel lonely or ever need a friend to talk to.
Source code and datasets for the CIKM 2020 paper "Knowledge-Enhanced Personalized Review Generation with Capsule Graph Neural Network".
A small weekend project exploring recurrent neural networks for text generation
Converts images to a textual representation.
🎨 🡲 📱 Draw To Text!
All the project works, presentations and course notes done by me during the Master of Science in Artificial Intelligence at the University of Bologna
My First Transformer Text Generation Model Deployment
Automatic Hashtag Generation for Social Media Posts Using Neural Text Generation Models
Codebase for Arpagen: A Corpus and Baseline for Phoneme-Level Text Generation.
Listen. Write. Speak. Read. Think.
Developed an encoder-decoder sequence-to-sequence (Seq2Seq) model that summarizes the full text of an Indian news article into a short paragraph with a limited number of words.