The official GitHub page for the survey paper "A Survey of Large Language Models".
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Oscar and VinVL
Pre-training of Deep Bidirectional Transformers for Language Understanding (BERT), applied to pre-training TextCNN
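As a quick illustration of the masked-language-modeling objective that BERT-style pre-training uses, here is a minimal sketch of the standard 80/10/10 token-corruption rule; `MASK_ID` and `VOCAB_SIZE` are illustrative constants, not values taken from any repository above.

```python
# Minimal sketch of BERT-style MLM input corruption on integer token ids.
import random

MASK_ID = 103        # assumption: [MASK] id in a BERT-like vocabulary
VOCAB_SIZE = 30522   # assumption: BERT-base vocabulary size

def mask_tokens(token_ids, mask_prob=0.15):
    """Return (corrupted_ids, labels); labels are -100 where no prediction is made."""
    corrupted, labels = [], []
    for tid in token_ids:
        if random.random() < mask_prob:
            labels.append(tid)                       # predict the original token here
            r = random.random()
            if r < 0.8:
                corrupted.append(MASK_ID)            # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(random.randrange(VOCAB_SIZE))  # 10%: random token
            else:
                corrupted.append(tid)                # 10%: keep unchanged
        else:
            corrupted.append(tid)
            labels.append(-100)                      # position ignored by the loss
    return corrupted, labels
```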
Papers about pretraining and self-supervised learning on Graph Neural Networks (GNNs).
Code for TKDE paper "Self-supervised learning on graphs: Contrastive, generative, or predictive"
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
An Open-source Knowledgeable Large Language Model Framework.
Code for ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations".
Research code for ECCV 2020 paper "UNITER: UNiversal Image-TExt Representation Learning"
[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
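For readers new to graph contrastive learning, here is a minimal sketch of two augmentations in the spirit of GraphCL (node dropping and edge perturbation) on a plain edge-list graph; the function names and drop rates are illustrative, not the paper's exact settings.

```python
# Minimal sketch: two augmented "views" of the same graph form a positive
# pair for a contrastive (NT-Xent-style) objective, as in GraphCL.
import random

def drop_nodes(num_nodes, edges, drop_rate=0.2):
    """Remove a random subset of nodes and all edges touching them."""
    keep = {n for n in range(num_nodes) if random.random() >= drop_rate}
    return [(u, v) for (u, v) in edges if u in keep and v in keep]

def drop_edges(edges, drop_rate=0.2):
    """Remove a random subset of edges."""
    return [e for e in edges if random.random() >= drop_rate]

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # toy 4-cycle graph
view1 = drop_edges(edges)                 # first augmented view
view2 = drop_nodes(4, edges)              # second augmented view
```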
A one-stop data processing system to make data higher-quality, juicier, and more digestible for LLMs! 🍎 🍋 🌽 ➡️ ➡️ 🍸 🍹 🍷
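As a rough illustration of what such a data pipeline does, here is a minimal sketch of a heuristic text-quality filter; the rules and thresholds are invented for illustration and are not Data-Juicer's actual operators.

```python
# Minimal sketch of a heuristic quality filter for LLM training text.
def keep_sample(text: str) -> bool:
    words = text.split()
    if not 10 <= len(words) <= 10_000:       # drop very short / very long docs
        return False
    if len(set(words)) / len(words) < 0.3:   # drop highly repetitive text
        return False
    alpha = sum(c.isalpha() for c in text) / max(len(text), 1)
    return alpha >= 0.6                      # drop symbol-heavy noise
```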
Code for KDD'20 "Generative Pre-Training of Graph Neural Networks"
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
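A minimal sketch of the next-token (causal language modeling) objective that GPT-2 pre-training optimizes, written in PyTorch for brevity even though the repository above uses TensorFlow 2.0; the random logits stand in for a model's output.

```python
# Minimal sketch: inputs are tokens[:-1], targets are tokens[1:].
import torch
import torch.nn.functional as F

tokens = torch.tensor([[5, 17, 42, 9, 2]])           # (batch=1, seq_len=5), toy ids
inputs, targets = tokens[:, :-1], tokens[:, 1:]      # shift by one position

vocab_size = 50257                                   # GPT-2 vocabulary size
logits = torch.randn(1, inputs.size(1), vocab_size)  # stand-in for model(inputs)

loss = F.cross_entropy(logits.reshape(-1, vocab_size), targets.reshape(-1))
```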
Awesome resources for in-context learning and prompt engineering: mastering LLMs such as ChatGPT, GPT-3, and FlanT5, with up-to-date, cutting-edge content.
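A minimal sketch of how an in-context-learning prompt is assembled: a few demonstrations are concatenated ahead of the query so the model can infer the task from examples alone. The review/sentiment format below is illustrative.

```python
# Minimal sketch of few-shot prompt construction for in-context learning.
demos = [
    ("The movie was wonderful.", "positive"),
    ("I want my money back.", "negative"),
]
query = "The plot dragged, but the acting was great."

prompt = "".join(f"Review: {x}\nSentiment: {y}\n\n" for x, y in demos)
prompt += f"Review: {query}\nSentiment:"
print(prompt)
```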
The repository of ET-BERT, a network traffic classification model for encrypted traffic. The work was accepted as a paper at The Web Conference (WWW) 2022.
Awesome list for research on CLIP (Contrastive Language-Image Pre-Training).
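A minimal sketch of CLIP's symmetric contrastive objective, where the i-th image and i-th caption in a batch are the only positive pair; the random embeddings stand in for encoder outputs, and 0.07 is the temperature reported in the CLIP paper.

```python
# Minimal sketch of the symmetric image-text contrastive loss used by CLIP.
import torch
import torch.nn.functional as F

batch = 8
img = F.normalize(torch.randn(batch, 512), dim=-1)   # stand-in image embeddings
txt = F.normalize(torch.randn(batch, 512), dim=-1)   # stand-in text embeddings

logits = img @ txt.t() / 0.07                         # temperature-scaled cosine sims
labels = torch.arange(batch)                          # i-th image matches i-th text
loss = (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels)) / 2
```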
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training @ KDD 2020
A professional list on Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.