These are the results for models other than `bert-base-uncased`. We ran experiments over the following grid:

- datasets: `agnews_business`, `amazon_agri`
- active learning strategies: `entropy`
- pool filtering strategies: `anchoral`, `randomsubset`, `seals`
- models: `albert-base-v2`, `deberta-v3-base`, `t5-base`, `gpt2-base`, `bert-tiny` (`google/bert_uncased_L-2_H-128_A-2`)
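The full set of runs is the Cartesian product of these options. A minimal sketch of how the grid can be enumerated (the `run_experiment` call is hypothetical, shown only to illustrate the loop structure):

```python
from itertools import product

# Configuration grid from the list above.
datasets = ["agnews_business", "amazon_agri"]
al_strategies = ["entropy"]
pool_filters = ["anchoral", "randomsubset", "seals"]
models = [
    "albert-base-v2",
    "deberta-v3-base",
    "t5-base",
    "gpt2-base",
    "google/bert_uncased_L-2_H-128_A-2",  # bert-tiny
]

# Every combination of dataset, strategy, filter, and model.
configs = list(product(datasets, al_strategies, pool_filters, models))
print(len(configs))  # 2 * 1 * 3 * 5 = 30 runs

for dataset, strategy, pool_filter, model in configs:
    # Hypothetical entry point; replace with the actual experiment runner.
    # run_experiment(dataset, strategy, pool_filter, model)
    pass
```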