Segmentation fault error

#2 opened by yangxia20000

I got a Segmentation fault when I ran vocab_transplant.py. Also, tokenizer_target.vocab_size is only 128000, not 128256, so why do we use tokenizer_target.vocab_size rather than target_vocab_size for model.lm_head.out_features? Thanks a lot!
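For reference, a minimal sketch of the size mismatch being asked about, assuming the standard Hugging Face transformers/tokenizers API (the model id below is a placeholder, not necessarily the one used by vocab_transplant.py): `tokenizer.vocab_size` counts only the base vocabulary, while `len(tokenizer)` also counts added special/reserved tokens, which is typically the number the lm_head has to match.

```python
# Sketch only; model id is a placeholder and the script's actual logic may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B"  # placeholder target model

tokenizer_target = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

print(tokenizer_target.vocab_size)                  # base vocab only, e.g. 128000
print(len(tokenizer_target))                        # base + special tokens, e.g. 128256
print(model.get_output_embeddings().out_features)   # lm_head rows, e.g. 128256

# If the output head must cover the full target vocabulary (including the
# reserved/special tokens), resizing with the full tokenizer length is the
# safer choice than using tokenizer_target.vocab_size:
target_vocab_size = len(tokenizer_target)
model.resize_token_embeddings(target_vocab_size)
```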

yangxia20000 changed discussion status to closed
