Welcome to follow my CSDN: https://spike.blog.csdn/
This article: https://blog.csdn/caroline_wendy/article/details/128909400
GPT, GPT-2, GPT-3: Generative Pre-trained Transformer
- Wiki: https://en.wikipedia/wiki/GPT-3
- GPT-3 Demo: https://gpt3demo/
Timeline:
- Transformer, 2017.6, Attention Is All You Need
- GPT, 2018.6, Improving Language Understanding by Generative Pre-Training: pre-trains a Transformer decoder on unlabeled text
- BERT, 2018.10, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Bidirectional Encoder Representations from Transformers): uses the Transformer encoder
- GPT-2, 2019.2, Language Models are Unsupervised Multitask Learners
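The key architectural split in the timeline above is the attention mask: GPT uses the Transformer decoder, which masks future positions (each token attends only to the left), while BERT uses the encoder, which attends bidirectionally. A minimal numpy sketch of this difference (the function and variable names are illustrative, not from any of the papers):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(q, k, v, causal=False):
    """Scaled dot-product self-attention over a sequence.

    causal=True  -> future positions masked out (GPT-style decoder)
    causal=False -> every position attends to all others (BERT-style encoder)
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)  # (T, T) attention logits
    if causal:
        T = scores.shape[0]
        # strictly upper-triangular mask hides tokens to the right
        future = np.triu(np.ones((T, T), dtype=bool), k=1)
        scores = np.where(future, -1e9, scores)
    return softmax(scores, axis=-1) @ v

T, d = 4, 8
rng = np.random.default_rng(0)
x = rng.standard_normal((T, d))
out_causal = self_attention(x, x, x, causal=True)   # decoder-style
out_bidir = self_attention(x, x, x, causal=False)   # encoder-style
print(out_causal.shape)  # (4, 8)
```

With the causal mask, position 0 can only attend to itself, so its output equals its own value vector; without the mask it mixes information from the whole sequence. This left-to-right constraint is what lets GPT be pre-trained as a language model on unlabeled text.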