---
library_name: transformers
tags:
- dialogue
- gpt2
- open-domain
- chit-chat
license: mit
datasets:
- daily_dialog
- facebook/empathetic_dialogues
- AlekseyKorshuk/persona-chat
- blended_skill_talk
language:
- en
metrics:
- perplexity
---
Model Card for Model ID
This model is based on GPT2LMHeadModel and has been fine-tuned on a total of 4 open-domain dialogue datasets: DailyDialog, EmpatheticDialogues, Persona-Chat, and Blended Skill Talk.
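A minimal usage sketch with the 🤗 Transformers API is shown below. The repository ID is a placeholder, since this card does not state the final model ID on the Hub, and the generation settings are illustrative assumptions rather than the settings described in the blog posts.

```python
from transformers import AutoTokenizer, GPT2LMHeadModel

# Placeholder repository ID; replace with the actual model ID on the Hub.
model_id = "devjwsong/<model-id>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = GPT2LMHeadModel.from_pretrained(model_id)

# Encode a single user utterance and sample a reply.
# These decoding settings are illustrative, not the ones from the blog posts.
prompt = "Hi, how was your weekend?"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
reply = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
)
print(reply)
```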
Model Details
The details can be found in my blog posts listed in the Citation section below.
Model Description
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- Developed by: devjwsong
- Model type: GPT2LMHeadModel
- Language(s) (NLP): English
- License: MIT
- Finetuned from model: openai-community/gpt2
Citation
- Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., & Sutskever, I. (2019). Language models are unsupervised multitask learners. OpenAI blog, 1(8), 9.
- https://songstudio.info/tech/tech-33
- https://songstudio.info/tech/tech-35
BibTeX:
@misc{Song_2020,
  title={Multi-turn chatbot project (3): GPT-2 chatbot with multi-turn generation settings},
  url={https://songstudio.info/tech/tech-35},
  journal={SongStudio},
  author={Song, Jaewoo},
  year={2020},
  month={Nov}
}
Model Card Authors
- Jaewoo (Kyle) Song