arnosimons committed
Commit e13c079 (parent: ec7dd7f)

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -39,9 +39,9 @@ tags:
 
 # Model Card for Astro-HEP-BERT
 
-**Astro-HEP-BERT** is a bidirectional transformer designed primarily to generate contextualized word embeddings for analyzing conceptual change in astrophysics and high-energy physics. Built upon Google's `bert-base-uncased`, the model underwent additional training for three epochs using approximately 21.5 million paragraphs extracted from around 600,000 scholarly articles sourced from arXiv, all pertaining to astrophysics and/or high-energy physics (HEP). The sole training objective was masked language modeling.
+**Astro-HEP-BERT** is a bidirectional transformer designed primarily to generate contextualized word embeddings for analyzing conceptual change in astrophysics and high-energy physics (HEP). Built upon Google's `bert-base-uncased`, the model underwent additional training for three epochs using approximately 21.5 million paragraphs extracted from around 600,000 scholarly articles sourced from arXiv, all pertaining to astrophysics and/or HEP. The sole training objective was masked language modeling.
 
-The Astro-HEP-BERT project embodies the spirit of a tabletop experiment or grassroots scientific effort. It exclusively utilized open-source inputs during training, and the entire training process was completed on a single MacBook Pro M2/96GB over a span of 6 weeks for 3 epochs. This project stands as a proof of concept, showcasing the viability of employing a bidirectional transformer for research ventures in the history, philosophy, and sociology of science (HPSS) even with limited financial resources.
+The Astro-HEP-BERT project embodies the spirit of a tabletop experiment or grassroots scientific effort. It exclusively utilized open-source inputs during training, and the entire training process was completed on a single MacBook Pro M2/96GB in 48 days for 3 epochs. This project stands as a proof of concept, showcasing the viability of employing a bidirectional transformer for research ventures in the history, philosophy, and sociology of science (HPSS) even with limited financial resources.
 
 For further insights into the model, the corpus, and the underlying research project (<a target="_blank" rel="noopener noreferrer" href="https://doi.org/10.3030/101044932">Network Epistemology in Practice</a>), please refer to the Astro-HEP-BERT paper [link coming soon].
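
Since the model card's stated primary use is generating contextualized word embeddings, a short usage sketch may help. The snippet below is a minimal example with the Hugging Face `transformers` library; the repo ID `arnosimons/astro-hep-bert`, the example sentence, and the target word are illustrative assumptions, not taken from the commit above.

```python
# Minimal sketch: extracting a contextualized word embedding with
# Astro-HEP-BERT. Assumes the checkpoint is published on the Hugging Face
# Hub as "arnosimons/astro-hep-bert" (repo ID is an assumption).
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "arnosimons/astro-hep-bert"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

sentence = "The dark matter halo dominates the rotation curve of the galaxy."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One hidden vector per (sub)word token, shape (seq_len, 768) for BERT-base.
hidden_states = outputs.last_hidden_state.squeeze(0)

# Map tokens back to words (fast tokenizers only) and average the subword
# vectors belonging to "halo" to get one contextualized embedding for it.
word_ids = inputs.word_ids(0)
target_word = 3  # "halo" is the 4th word in the sentence (index 3)
positions = [i for i, w in enumerate(word_ids) if w == target_word]
halo_vec = hidden_states[positions].mean(dim=0)
print(halo_vec.shape)  # torch.Size([768])
```

Embeddings for the same term drawn from different passages or publication years can then be compared, for example via cosine similarity, which matches the conceptual-change analysis the model card describes.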