I believe that the readme says 175B model when it should be 175M. 10b1382 noobmaster29 committed on Dec 14, 2023
correct checkpoints see: https://github.com/facebookresearch/metaseq/pull/164 507a399 patrickvonplaten committed on Jun 21, 2022
Change BOS token from 0 to 2 as BOS token is equal to EOS for OPT. See: https://github.com/huggingface/transformers/issues/17431 (#1) da93426 patrickvonplaten committed on May 26, 2022
Merge branch 'main' of https://huggingface.co/facebook/opt-125m into main 141c875 patrickvonplaten committed on May 12, 2022