
PP489 Optimization Variants Iter1 x TIGIT (YM_0988)

Overview

YM_0988 includes ABC001 against 2 TIGIT homologs. We explored two model hypotheses: (i) does pretraining aid predictivity, and (ii) does the featurization of the input sequences matter. To test pretraining, the metadata descriptions use the term "warm" for runs that include pretraining and "cold" for runs started from a randomly initialized seed. For featurization, we compared "label-encoded" sequences, using a one-hot encoding of amino acid identities, against "ESM"-featurized embeddings representing each sequence in the PPI. Optimization was performed on the human ortholog.
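The label-encoded featurization can be sketched as follows. This is a minimal illustration of one-hot encoding amino acid identities, not the actual featurization code; the alphabet ordering is an assumption.

```python
import numpy as np

# Sketch only: the 20 canonical amino acids in an assumed alphabetical order.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot_encode(seq: str) -> np.ndarray:
    """Return a (len(seq), 20) one-hot matrix of amino acid identities."""
    mat = np.zeros((len(seq), len(AMINO_ACIDS)), dtype=np.float32)
    for pos, aa in enumerate(seq):
        mat[pos, AA_INDEX[aa]] = 1.0
    return mat

encoded = one_hot_encode("ACDY")
print(encoded.shape)  # (4, 20)
```

The ESM variant instead replaces each sequence with a dense embedding from a pretrained protein language model, so the model input is continuous rather than a sparse identity matrix.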

Experimental details

We studied the efficacy of generating binders under different model hyperparameters. This dataset includes 26,726 unique scFvs and 2 unique target sequences.

A more extensive methods section can be found in our publication here.

Misc dataset details

We define the following binders:

A-library (scFvs)

There are several terms you can filter by:

  • ABC001_WT_<i>: Wild-type (WT) replicates
  • ABC001_label_encoded_cold: Label encoded sequences with no pretraining
  • ABC001_label_encoded_warm: Label encoded sequences with pretraining
  • ABC001_esm_cold: ESM featurized sequences with no pretraining
  • ABC001_esm_warm: ESM featurized sequences with pretraining
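The terms above can be used as string prefixes when filtering the dataset. A hedged sketch follows; the column name "description" and the use of a pandas DataFrame are assumptions, not the actual dataset schema.

```python
import pandas as pd

# Toy stand-in for the dataset; "description" is an assumed column name.
df = pd.DataFrame({
    "description": [
        "ABC001_WT_1",
        "ABC001_label_encoded_cold",
        "ABC001_esm_warm",
        "ABC001_esm_cold",
    ],
})

# Select the ESM-featurized variants (both warm and cold).
esm_rows = df[df["description"].str.startswith("ABC001_esm")]
print(len(esm_rows))  # 2
```

The same prefix-matching approach works for any of the listed terms, e.g. `ABC001_label_encoded_warm` for the pretrained label-encoded runs.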

Alpha-library

  • TIGIT_22-137_POI-AGA2: Human TIGIT
  • TIGIT_Mouse: Mouse TIGIT