---
license: cc-by-nc-4.0
---
JAX weights converted from the PyTorch checkpoint at `facebook/galactica-6.7b`.
```python
(env) ubuntu@vm:~$ JAX_PLATFORM_NAME=cpu python3
>>> import jax
>>> print(jax.devices())
[CpuDevice(id=0)]  # Confirms the weights will load into CPU RAM, not accelerator memory.
>>> from transformers import FlaxOPTForCausalLM
>>> # from_pt=True loads the PyTorch checkpoint and converts it to Flax in memory.
>>> model = FlaxOPTForCausalLM.from_pretrained("facebook/galactica-6.7b", from_pt=True)
>>> # hf_model_repo is a placeholder for the target Hub repo id.
>>> model.push_to_hub(hf_model_repo)
```
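Once pushed, the converted weights can be loaded back as native Flax parameters, with no PyTorch round trip. A minimal usage sketch, assuming a placeholder repo id and reusing the tokenizer from the original checkpoint (the conversion does not change it):
```python
from transformers import AutoTokenizer, FlaxOPTForCausalLM

# Placeholder repo id; substitute the repo the weights were pushed to above.
hf_model_repo = "<your-username>/galactica-6.7b-jax"

# The tokenizer is unchanged by the conversion, so reuse the original one.
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-6.7b")

# No from_pt=True needed here: the repo now holds native Flax weights.
model = FlaxOPTForCausalLM.from_pretrained(hf_model_repo)

inputs = tokenizer("The Schwarzschild radius is defined as", return_tensors="np")
output = model.generate(inputs["input_ids"], max_length=32)
print(tokenizer.decode(output.sequences[0], skip_special_tokens=True))
```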
## Citation and Attribution
The citation from the original repo is reproduced below, as required by the CC BY-NC 4.0 license.
```bibtex
@inproceedings{GALACTICA,
    title={GALACTICA: A Large Language Model for Science},
    author={Ross Taylor and Marcin Kardas and Guillem Cucurull and Thomas Scialom and Anthony Hartshorn and Elvis Saravia and Andrew Poulton and Viktor Kerkez and Robert Stojnic},
    year={2022}
}
```
> Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC)