cloudyu committed
Commit 437230c
1 Parent(s): 74d460b

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -17,6 +17,6 @@ tags:
  * [DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer)


- Metrics improved by DPO training after 100 steps
+ Metrics improved by Truthful DPO training after 100 steps
  ![Metrics improvement](mixtral-dpo.jpg)
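For context on the change above: the README links trl's DPO Trainer and reports metric gains after 100 training steps. The sketch below is an assumption-laden illustration of such a run, not the author's actual script: the base model path, the toy preference pairs, and the hyperparameters are placeholders, and the keyword for passing the tokenizer (`tokenizer` vs. `processing_class`) depends on the installed trl version.

```python
# Minimal sketch of a 100-step DPO run with trl, matching the README's
# "after 100 steps" note. All names below are illustrative assumptions.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base_model = "path/to/base-model"  # placeholder: substitute the model card's actual base
model = AutoModelForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)

# DPO trains on preference pairs: a prompt with a chosen and a rejected answer.
# A real run would use a full preference dataset (e.g. a "truthful" DPO pairs set).
train_dataset = Dataset.from_dict({
    "prompt": ["What is the capital of France?"],
    "chosen": ["The capital of France is Paris."],
    "rejected": ["France has no capital city."],
})

args = DPOConfig(
    output_dir="mixtral-dpo",
    max_steps=100,                # the README reports metric gains after 100 steps
    per_device_train_batch_size=1,
    beta=0.1,                     # DPO temperature; controls drift from the reference model
)

trainer = DPOTrainer(
    model=model,                  # ref_model omitted: trl keeps a frozen copy as reference
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,   # older trl versions use tokenizer= instead
)
trainer.train()
```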