Japanese DialoGPT trained with Aozora
Japanese DialoGPT Small trained on dialogue lines from Aozora Bunko.
Demo
The demo on this page does not work well; I recommend trying the Hugging Face Spaces version instead.
Reference
- Aozora Bunko
  - Japanese public-domain books.
  - I extracted the dialogue parts from the books and used them as training data.
- japanese-gpt2-small
  - A Japanese GPT-2 model. I used the small variant because of the limited GPU memory of my desktop PC (a single RTX 3060) 😢.
  - I used this model as the pre-trained base.
- DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation
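The dialogue-extraction step mentioned above can be sketched roughly as follows. This is only an illustration, not the author's actual pipeline: the regex and the function name `extract_dialogue` are my assumptions, based on the convention that spoken lines in Japanese prose are enclosed in corner brackets 「…」.

```python
import re

# Illustrative sketch: pull quoted speech (text inside Japanese corner
# brackets 「…」) out of book text, as training utterances.
DIALOGUE_RE = re.compile(r"「([^」]*)」")

def extract_dialogue(text: str) -> list[str]:
    """Return every bracketed utterance found in `text`, in order."""
    return DIALOGUE_RE.findall(text)

sample = "彼は「おはよう」と言った。彼女は「おはよう。いい天気ね」と答えた。"
print(extract_dialogue(sample))  # → ['おはよう', 'おはよう。いい天気ね']
```

A real pipeline would also need to handle nested quotes and utterances that span line breaks, but this captures the basic idea of turning narrative text into dialogue-only training data.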