While doing SFT fine-tuning, I hit a problem with tokenizer save; it needs a fix
#13
opened by shibing624
In tokenization_chatglm.py, the `self.vocab_file` referenced at line 137 cannot find the corresponding file. A line `self.vocab_file = vocab_file` needs to be added before line 73.
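A minimal sketch of the failure and the fix. The class below is a simplified stand-in, not the real `ChatGLMTokenizer`; the method bodies and file layout are assumptions for illustration only, and the only part taken from this report is the added `self.vocab_file = vocab_file` assignment.

```python
# Sketch only: a simplified stand-in for ChatGLMTokenizer, not the
# actual contents of tokenization_chatglm.py.
import os
import shutil


class SketchTokenizer:
    def __init__(self, vocab_file: str):
        # The fix reported here: keep the vocab path on the instance
        # (added near the top of __init__, before line 73 in the real
        # file) so later methods can still reach it.
        self.vocab_file = vocab_file
        # ... the real __init__ goes on to build the SentencePiece
        # tokenizer from vocab_file ...

    def save_vocabulary(self, save_directory: str) -> str:
        # Without the assignment above, this attribute access (around
        # line 137 in the real file) fails when the tokenizer is saved
        # after SFT fine-tuning.
        out_path = os.path.join(
            save_directory, os.path.basename(self.vocab_file)
        )
        shutil.copyfile(self.vocab_file, out_path)
        return out_path
```

In short: adding `self.vocab_file = vocab_file` in `__init__` lets saving the tokenizer after fine-tuning find and copy the vocab file again.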
Fixed.
zxdu20 changed discussion status to closed