### Model Name : DeadBeast/mbert-base-cased-finetuned-bengali-fakenews
### Model URL : https://huggingface.co/DeadBeast/mbert-base-cased-finetuned-bengali-fakenews
### Model Description : This model is a fine-tuned checkpoint of mBERT-base-cased trained on the Bengali-fake-news dataset for text classification. It reaches an accuracy of 96.3 with an F1-score of 79.1 on the dev set. Task: binary classification.
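A minimal usage sketch, assuming the standard Hugging Face transformers text-classification pipeline (the example sentence and the label names in the comment are illustrative, not from the card):

```python
from transformers import pipeline

# Load the fine-tuned mBERT checkpoint for Bengali fake-news detection.
classifier = pipeline(
    "text-classification",
    model="DeadBeast/mbert-base-cased-finetuned-bengali-fakenews",
)

# Binary classification over a Bengali news snippet (illustrative input).
result = classifier("এই খবরটি কি সত্য?")
print(result)  # e.g. [{'label': 'LABEL_1', 'score': 0.97}]; label ids depend on the model config
```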
### Model Name : DeadBeast/roberta-base-pretrained-mr-2
### Model URL : https://huggingface.co/DeadBeast/roberta-base-pretrained-mr-2
### Model Description : No model card.
### Model Name : DeadBeast/roberta-base-pretrained-mr
### Model URL : https://huggingface.co/DeadBeast/roberta-base-pretrained-mr
### Model Description : No model card.
### Model Name : Dean/summarsiation
### Model URL : https://huggingface.co/Dean/summarsiation
### Model Description :
### Model Name : DecafNosebleed/DialoGPT-small-ScaraBot
### Model URL : https://huggingface.co/DecafNosebleed/DialoGPT-small-ScaraBot
### Model Description : Scaramouche DialoGPT model.
### Model Name : DecafNosebleed/ScaraBot
### Model URL : https://huggingface.co/DecafNosebleed/ScaraBot
### Model Description : No model card.
### Model Name : DecafNosebleed/scarabot-model
### Model URL : https://huggingface.co/DecafNosebleed/scarabot-model
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v1
### Model URL : https://huggingface.co/Declan/Breitbart_model_v1
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v2
### Model URL : https://huggingface.co/Declan/Breitbart_model_v2
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v3
### Model URL : https://huggingface.co/Declan/Breitbart_model_v3
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v4
### Model URL : https://huggingface.co/Declan/Breitbart_model_v4
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v5
### Model URL : https://huggingface.co/Declan/Breitbart_model_v5
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v6
### Model URL : https://huggingface.co/Declan/Breitbart_model_v6
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v7
### Model URL : https://huggingface.co/Declan/Breitbart_model_v7
### Model Description : No model card.
### Model Name : Declan/Breitbart_model_v8
### Model URL : https://huggingface.co/Declan/Breitbart_model_v8
### Model Description : No model card.
### Model Name : Declan/Breitbart_modelv7
### Model URL : https://huggingface.co/Declan/Breitbart_modelv7
### Model Description : No model card.
### Model Name : Declan/CNN_model_v1
### Model URL : https://huggingface.co/Declan/CNN_model_v1
### Model Description : No model card.
### Model Name : Declan/CNN_model_v2
### Model URL : https://huggingface.co/Declan/CNN_model_v2
### Model Description : No model card.
### Model Name : Declan/CNN_model_v3
### Model URL : https://huggingface.co/Declan/CNN_model_v3
### Model Description : No model card.
### Model Name : Declan/CNN_model_v4
### Model URL : https://huggingface.co/Declan/CNN_model_v4
### Model Description : No model card.
### Model Name : Declan/CNN_model_v5
### Model URL : https://huggingface.co/Declan/CNN_model_v5
### Model Description : No model card.
### Model Name : Declan/CNN_model_v6
### Model URL : https://huggingface.co/Declan/CNN_model_v6
### Model Description : No model card.
### Model Name : Declan/CNN_model_v7
### Model URL : https://huggingface.co/Declan/CNN_model_v7
### Model Description : No model card.
### Model Name : Declan/CNN_model_v8
### Model URL : https://huggingface.co/Declan/CNN_model_v8
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v1
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v1
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v2
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v2
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v3
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v3
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v4
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v4
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v5
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v5
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v6
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v6
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v7
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v7
### Model Description : No model card.
### Model Name : Declan/ChicagoTribune_model_v8
### Model URL : https://huggingface.co/Declan/ChicagoTribune_model_v8
### Model Description : No model card.
### Model Name : Declan/FoxNews_model_v1
### Model URL : https://huggingface.co/Declan/FoxNews_model_v1
### Model Description : No model card.
### Model Name : Declan/FoxNews_model_v2
### Model URL : https://huggingface.co/Declan/FoxNews_model_v2
### Model Description : No model card.
### Model Name : Declan/FoxNews_model_v3
### Model URL : https://huggingface.co/Declan/FoxNews_model_v3
### Model Description : No model card.
### Model Name : Declan/FoxNews_model_v4
### Model URL : https://huggingface.co/Declan/FoxNews_model_v4
### Model Description : No model card.
### Model Name : Declan/FoxNews_model_v5
### Model URL : https://huggingface.co/Declan/FoxNews_model_v5
### Model Description : No model card.
### Model Name : Declan/FoxNews_model_v6
### Model URL : https://huggingface.co/Declan/FoxNews_model_v6
### Model Description : No model card.
### Model Name : Declan/FoxNews_model_v8
### Model URL : https://huggingface.co/Declan/FoxNews_model_v8
### Model Description : No model card.
### Model Name : Declan/HuffPost_model_v1
### Model URL : https://huggingface.co/Declan/HuffPost_model_v1
### Model Description : No model card.
### Model Name : Declan/HuffPost_model_v2
### Model URL : https://huggingface.co/Declan/HuffPost_model_v2
### Model Description : No model card.
### Model Name : Declan/HuffPost_model_v3
### Model URL : https://huggingface.co/Declan/HuffPost_model_v3
### Model Description : No model card.
### Model Name : Declan/HuffPost_model_v4
### Model URL : https://huggingface.co/Declan/HuffPost_model_v4
### Model Description : No model card.
### Model Name : Declan/HuffPost_model_v5
### Model URL : https://huggingface.co/Declan/HuffPost_model_v5
### Model Description : No model card.
### Model Name : Declan/HuffPost_model_v6
### Model URL : https://huggingface.co/Declan/HuffPost_model_v6
### Model Description : No model card.
### Model Name : Declan/HuffPost_model_v8
### Model URL : https://huggingface.co/Declan/HuffPost_model_v8
### Model Description : No model card.
### Model Name : Declan/Independent__model
### Model URL : https://huggingface.co/Declan/Independent__model
### Model Description : No model card.
### Model Name : Declan/NPR_model_v1
### Model URL : https://huggingface.co/Declan/NPR_model_v1
### Model Description : No model card.
### Model Name : Declan/NPR_model_v2
### Model URL : https://huggingface.co/Declan/NPR_model_v2
### Model Description : No model card.
### Model Name : Declan/NPR_model_v3
### Model URL : https://huggingface.co/Declan/NPR_model_v3
### Model Description : No model card.
### Model Name : Declan/NPR_model_v4
### Model URL : https://huggingface.co/Declan/NPR_model_v4
### Model Description : No model card.
### Model Name : Declan/NPR_model_v5
### Model URL : https://huggingface.co/Declan/NPR_model_v5
### Model Description : No model card.
### Model Name : Declan/NPR_model_v6
### Model URL : https://huggingface.co/Declan/NPR_model_v6
### Model Description : No model card.
### Model Name : Declan/NPR_model_v8
### Model URL : https://huggingface.co/Declan/NPR_model_v8
### Model Description : No model card.
### Model Name : Declan/NewYorkPost_model_v1
### Model URL : https://huggingface.co/Declan/NewYorkPost_model_v1
### Model Description : No model card.
### Model Name : Declan/NewYorkTimes_model_v1
### Model URL : https://huggingface.co/Declan/NewYorkTimes_model_v1
### Model Description : No model card.
### Model Name : Declan/NewYorkTimes_model_v2
### Model URL : https://huggingface.co/Declan/NewYorkTimes_model_v2
### Model Description : No model card.
### Model Name : Declan/NewYorkTimes_model_v3
### Model URL : https://huggingface.co/Declan/NewYorkTimes_model_v3
### Model Description : No model card.
### Model Name : Declan/NewYorkTimes_model_v4
### Model URL : https://huggingface.co/Declan/NewYorkTimes_model_v4
### Model Description : No model card.
### Model Name : Declan/NewYorkTimes_model_v6
### Model URL : https://huggingface.co/Declan/NewYorkTimes_model_v6
### Model Description : No model card.
### Model Name : Declan/NewYorkTimes_model_v8
### Model URL : https://huggingface.co/Declan/NewYorkTimes_model_v8
### Model Description : No model card.
### Model Name : Declan/Politico_model_v1
### Model URL : https://huggingface.co/Declan/Politico_model_v1
### Model Description : No model card.
### Model Name : Declan/Politico_model_v2
### Model URL : https://huggingface.co/Declan/Politico_model_v2
### Model Description : No model card.
### Model Name : Declan/Politico_model_v3
### Model URL : https://huggingface.co/Declan/Politico_model_v3
### Model Description : No model card.
### Model Name : Declan/Politico_model_v4
### Model URL : https://huggingface.co/Declan/Politico_model_v4
### Model Description : No model card.
### Model Name : Declan/Politico_model_v5
### Model URL : https://huggingface.co/Declan/Politico_model_v5
### Model Description : No model card.
### Model Name : Declan/Politico_model_v6
### Model URL : https://huggingface.co/Declan/Politico_model_v6
### Model Description : No model card.
### Model Name : Declan/Politico_model_v8
### Model URL : https://huggingface.co/Declan/Politico_model_v8
### Model Description : No model card.
### Model Name : Declan/Reuters_model_v1
### Model URL : https://huggingface.co/Declan/Reuters_model_v1
### Model Description : No model card.
### Model Name : Declan/Reuters_model_v2
### Model URL : https://huggingface.co/Declan/Reuters_model_v2
### Model Description : No model card.
### Model Name : Declan/Reuters_model_v3
### Model URL : https://huggingface.co/Declan/Reuters_model_v3
### Model Description : No model card.
### Model Name : Declan/Reuters_model_v4
### Model URL : https://huggingface.co/Declan/Reuters_model_v4
### Model Description : No model card.
### Model Name : Declan/Reuters_model_v5
### Model URL : https://huggingface.co/Declan/Reuters_model_v5
### Model Description : No model card.
### Model Name : Declan/Reuters_model_v6
### Model URL : https://huggingface.co/Declan/Reuters_model_v6
### Model Description : No model card.
### Model Name : Declan/Reuters_model_v8
### Model URL : https://huggingface.co/Declan/Reuters_model_v8
### Model Description : No model card.
### Model Name : Declan/WallStreetJournal_model_v1
### Model URL : https://huggingface.co/Declan/WallStreetJournal_model_v1
### Model Description : No model card.
### Model Name : Declan/WallStreetJournal_model_v2
### Model URL : https://huggingface.co/Declan/WallStreetJournal_model_v2
### Model Description : No model card.
### Model Name : Declan/WallStreetJournal_model_v3
### Model URL : https://huggingface.co/Declan/WallStreetJournal_model_v3
### Model Description : No model card.
### Model Name : Declan/WallStreetJournal_model_v4
### Model URL : https://huggingface.co/Declan/WallStreetJournal_model_v4
### Model Description : No model card.
### Model Name : Declan/WallStreetJournal_model_v5
### Model URL : https://huggingface.co/Declan/WallStreetJournal_model_v5
### Model Description : No model card.
### Model Name : Declan/WallStreetJournal_model_v6
### Model URL : https://huggingface.co/Declan/WallStreetJournal_model_v6
### Model Description : No model card.
### Model Name : Declan/WallStreetJournal_model_v8
### Model URL : https://huggingface.co/Declan/WallStreetJournal_model_v8
### Model Description : No model card.
### Model Name : Declan/test_model
### Model URL : https://huggingface.co/Declan/test_model
### Model Description : No model card.
### Model Name : Declan/test_push
### Model URL : https://huggingface.co/Declan/test_push
### Model Description : No model card.
### Model Name : DeepBasak/Slack
### Model URL : https://huggingface.co/DeepBasak/Slack
### Model Description : No model card.
### Model Name : DeepChem/ChemBERTa-10M-MLM
### Model URL : https://huggingface.co/DeepChem/ChemBERTa-10M-MLM
### Model Description : No model card.
### Model Name : DeepChem/ChemBERTa-10M-MTR
### Model URL : https://huggingface.co/DeepChem/ChemBERTa-10M-MTR
### Model Description : Developed and shared by DeepChem, in collaboration with Ezi Ozoani and the Hugging Face team. Model type: token classification; parent model: RoBERTa. Language, license, training data, evaluation results, and further resources are not documented (the card marks them "more information needed"). The model should not be used to intentionally create hostile or alienating environments for people. Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)); predictions generated by the model may include disturbing and harmful stereotypes across protected classes, identity characteristics, and sensitive, social, and occupational groups, so users (both direct and downstream) should be made aware of the model's risks, biases, and limitations. Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). Use the code below to get started with the model.
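The card's getting-started code did not survive the crawl; here is a minimal sketch under the assumption that the checkpoint loads through the standard transformers Auto classes (the SMILES string is an arbitrary example, and loading with AutoModel keeps only the base encoder, dropping any task-specific head):

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("DeepChem/ChemBERTa-10M-MTR")
model = AutoModel.from_pretrained("DeepChem/ChemBERTa-10M-MTR")

# Encode one SMILES string (caffeine) and inspect the encoder output.
inputs = tokenizer("CN1C=NC2=C1C(=O)N(C(=O)N2C)C", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```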
### Model Name : DeepChem/ChemBERTa-5M-MLM
### Model URL : https://huggingface.co/DeepChem/ChemBERTa-5M-MLM
### Model Description : No model card.
### Model Name : DeepChem/ChemBERTa-5M-MTR
### Model URL : https://huggingface.co/DeepChem/ChemBERTa-5M-MTR
### Model Description : No model card.
### Model Name : DeepChem/ChemBERTa-77M-MLM
### Model URL : https://huggingface.co/DeepChem/ChemBERTa-77M-MLM
### Model Description : No model card.
### Model Name : DeepChem/ChemBERTa-77M-MTR
### Model URL : https://huggingface.co/DeepChem/ChemBERTa-77M-MTR
### Model Description : No model card.
### Model Name : DeepChem/SmilesTokenizer_PubChem_1M
### Model URL : https://huggingface.co/DeepChem/SmilesTokenizer_PubChem_1M
### Model Description : A RoBERTa model trained on 1M SMILES strings from the PubChem 77M set in MoleculeNet. Uses the Smiles-Tokenizer.
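A minimal masked-token sketch for this checkpoint, assuming it exposes the standard RoBERTa masked-LM interface through transformers (the SMILES input is illustrative):

```python
from transformers import pipeline

# Fill-mask over a SMILES string; the tokenizer's own mask token is spliced in.
fill = pipeline("fill-mask", model="DeepChem/SmilesTokenizer_PubChem_1M")
smiles = f"CC(=O)OC1=CC=CC=C1C(=O){fill.tokenizer.mask_token}"
for candidate in fill(smiles):
    print(candidate["token_str"], candidate["score"])
```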
### Model Name : DeepESP/gpt2-spanish-medium
### Model URL : https://huggingface.co/DeepESP/gpt2-spanish-medium
### Model Description : GPT2-Spanish is a language generation model trained from scratch on 11.5GB of Spanish text, with a Byte Pair Encoding (BPE) tokenizer trained for this purpose. The hyperparameters are the same as those of the medium version of the original OpenAI GPT-2 model. The 11.5GB training corpus comprises 3.5GB of Wikipedia articles and 8GB of books (narrative, short stories, theater, poetry, essays, and popular science). The texts are tokenized with a byte-level version of BPE (for Unicode characters) and a vocabulary size of 50257; the inputs are sequences of 1024 consecutive tokens. The tokenizer was trained from scratch on the Spanish corpus because the tokenizer of the English models proved limited in capturing the semantic relations of Spanish, owing to the morphosyntactic differences between the two languages. Apart from the special end-of-text token "<|endoftext|>" of the OpenAI GPT-2 models, the tokens "<|talk|>" and "<|ax1|>" through "<|ax9|>" were included so that they can serve as prompts in future training. The model and tokenizer were trained with the Hugging Face libraries on an NVIDIA Tesla V100 GPU with 16GB of memory on Google Colab servers. The model was trained by Alejandro Oñate Latorre (Spain) and Jorge Ortiz Fuentes (Chile), members of Deep ESP, an open-source community for Natural Language Processing in Spanish (https://t.me/joinchat/VoEp1bPrDYEexc6h). Thanks to the members of the community who contributed funding for the initial tests. The model generates text according to the patterns learned from the training corpus; these data were not filtered, so the model may generate offensive or discriminatory content.
### Model Name : DeepESP/gpt2-spanish
### Model URL : https://huggingface.co/DeepESP/gpt2-spanish
### Model Description : GPT2-Spanish is a language generation model trained from scratch on 11.5GB of Spanish text, with a Byte Pair Encoding (BPE) tokenizer trained for this purpose. The hyperparameters are the same as those of the small version of the original OpenAI GPT-2 model. The 11.5GB training corpus comprises 3.5GB of Wikipedia articles and 8GB of books (narrative, short stories, theater, poetry, essays, and popular science). The texts are tokenized with a byte-level version of BPE (for Unicode characters) and a vocabulary size of 50257; the inputs are sequences of 1024 consecutive tokens. The tokenizer was trained from scratch on the Spanish corpus because the tokenizer of the English models proved limited in capturing the semantic relations of Spanish, owing to the morphosyntactic differences between the two languages. Apart from the special end-of-text token "<|endoftext|>" of the OpenAI GPT-2 models, the tokens "<|talk|>" and "<|ax1|>" through "<|ax9|>" were included so that they can serve as prompts in future training. The model and tokenizer were trained with the Hugging Face libraries on an NVIDIA Tesla V100 GPU with 16GB of memory on Google Colab servers. The model was trained by Alejandro Oñate Latorre (Spain) and Jorge Ortiz Fuentes (Chile), members of Deep ESP, an open-source community for Natural Language Processing in Spanish (https://t.me/joinchat/VoEp1bPrDYEexc6h). Thanks to the members of the community who contributed funding for the initial tests. The model generates text according to the patterns learned from the training corpus; these data were not filtered, so the model may generate offensive or discriminatory content.
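A short generation sketch that applies to both the small and medium checkpoints above, assuming the standard GPT-2 interface in transformers (the prompt and sampling settings are illustrative):

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("DeepESP/gpt2-spanish")
model = GPT2LMHeadModel.from_pretrained("DeepESP/gpt2-spanish")

# Sample a continuation for a Spanish prompt.
inputs = tokenizer("Había una vez", return_tensors="pt")
output = model.generate(**inputs, max_length=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The extra tokens such as "<|talk|>" can be passed in the prompt the same way; the cards only state they were reserved for future training, so whether they carry useful behavior in these checkpoints is not documented.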
### Model Name : DeepPavlov/bert-base-bg-cs-pl-ru-cased
### Model URL : https://huggingface.co/DeepPavlov/bert-base-bg-cs-pl-ru-cased
### Model Description : SlavicBERT[1] (Slavic (bg, cs, pl, ru), cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on Russian News and four Wikipedias: Bulgarian, Czech, Polish, and Russian. The subtoken vocabulary was built from this data, and Multilingual BERT was used as the initialization for SlavicBERT. 08.11.2021: uploaded the model with MLM and NSP heads. [1]: Arkhipov M., Trofimova M., Kuratov Y., Sorokin A. (2019). Tuning Multilingual Transformers for Language-Specific Named Entity Recognition. ACL anthology W19-3712.
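A minimal fill-mask sketch, assuming the uploaded MLM head is usable through the standard transformers pipeline (the Russian example sentence is illustrative):

```python
from transformers import pipeline

# SlavicBERT with its MLM head: predict the masked token.
fill = pipeline("fill-mask", model="DeepPavlov/bert-base-bg-cs-pl-ru-cased")
for candidate in fill("Москва - столица [MASK]."):
    print(candidate["token_str"], candidate["score"])
```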
### Model Name : DeepPavlov/bert-base-cased-conversational
### Model URL : https://huggingface.co/DeepPavlov/bert-base-cased-conversational
### Model Description : Conversational BERT (English, cased, 12-layer, 768-hidden, 12-heads, 110M parameters) was trained on the English portions of Twitter, Reddit, DailyDialogues[1], OpenSubtitles[2], Debates[3], Blogs[4], and Facebook News Comments. We used this training data to build the vocabulary of English subtokens and took the English cased version of BERT-base as the initialization for English Conversational BERT. 08.11.2021: uploaded the model with MLM and NSP heads. [1]: Yanran Li, Hui Su, Xiaoyu Shen, Wenjie Li, Ziqiang Cao, and Shuzi Niu. DailyDialog: A Manually Labelled Multi-turn Dialogue Dataset. IJCNLP 2017. [2]: P. Lison and J. Tiedemann (2016). OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016). [3]: Justine Zhang, Ravi Kumar, Sujith Ravi, Cristian Danescu-Niculescu-Mizil. Proceedings of NAACL, 2016. [4]: J. Schler, M. Koppel, S. Argamon and J. Pennebaker (2006). Effects of Age and Gender on Blogging. In Proceedings of the 2006 AAAI Spring Symposium on Computational Approaches for Analyzing Weblogs.
DeepPavlov/bert-base-multilingual-cased-sentence | https://huggingface.co/DeepPavlov/bert-base-multilingual-cased-sentence | Sentence Multilingual BERT (101 languages, cased, 12‑layer, 768‑hidden, 12‑heads, 180M parameters) is a representation‑based sentence encoder for 101 languages of Multilingual BERT. It is initialized with Multilingual BERT and then fine‑tuned on english MultiNLI[1] and on dev set of multilingual XNLI[2]. Sentence representations are mean pooled token embeddings in the same manner as in Sentence‑BERT[3]. [1]: Williams A., Nangia N. & Bowman S. (2017) A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference. arXiv preprint arXiv:1704.05426 [2]: Williams A., Bowman S. (2018) XNLI: Evaluating Cross-lingual Sentence Representations. arXiv preprint arXiv:1809.05053 [3]: N. Reimers, I. Gurevych (2019) Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. arXiv preprint arXiv:1908.10084 | Indicators looking for configurations to recommend AI models for configuring AI agents
### Model Name : DeepPavlov/bert-base-multilingual-cased-sentence
### Model URL : https://huggingface.co/DeepPavlov/bert-base-multilingual-cased-sentence
### Model Description : Sentence Multilingual BERT (101 languages, cased, 12‑layer, 768‑hidden, 12‑heads, 180M parameters) is a representation‑based sentence encoder for the 101 languages of Multilingual BERT. It is initialized with Multilingual BERT and then fine‑tuned on English MultiNLI[1] and on the dev set of multilingual XNLI[2]. Sentence representations are mean‑pooled token embeddings, in the same manner as in Sentence‑BERT[3]. [1]: Williams A., Nangia N. & Bowman S. (2017) A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference. arXiv preprint arXiv:1704.05426 [2]: Williams A., Bowman S. (2018) XNLI: Evaluating Cross-lingual Sentence Representations. arXiv preprint arXiv:1809.05053 [3]: N. Reimers, I. Gurevych (2019) Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. arXiv preprint arXiv:1908.10084
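The card specifies Sentence‑BERT-style mean pooling over token embeddings; here is a minimal sketch of that pooling with transformers (our illustration of the described scheme, not the authors' reference code):

```python
import torch
from transformers import AutoTokenizer, AutoModel

name = "DeepPavlov/bert-base-multilingual-cased-sentence"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

sentences = ["A dog runs in the park.", "Une phrase en français."]
encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**encoded).last_hidden_state  # (batch, seq, hidden)

# Mean-pool over real tokens only: mask out padding before averaging.
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embeddings.shape)  # (2, 768)
```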
DeepPavlov/distilrubert-base-cased-conversational | https://huggingface.co/DeepPavlov/distilrubert-base-cased-conversational | Conversational DistilRuBERT (Russian, cased, 6‑layer, 768‑hidden, 12‑heads, 135.4M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and a Social Media segment of Taiga corpus[2] (as Conversational RuBERT). Our DistilRuBERT was highly inspired by [3], [4]. The model was trained for about 100 hrs. on 8 nVIDIA Tesla P100-SXM2.0 16Gb. To evaluate improvements in inference speed, we ran the teacher and student models on random sequences with seq_len=512, batch_size=16 (for throughput) and batch_size=1 (for latency).
All tests were performed on an Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz and an nVIDIA Tesla P100-SXM2.0 16Gb. If you find the model useful for your research, we kindly ask you to cite this paper: [1]: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016) [2]: Shavrina T., Shapovalova O. (2017) To the Methodology of Corpus Construction for Machine Learning: “Taiga” Syntax Tree Corpus and Parser. In Proceedings of “CORPORA2017”, International Conference, Saint-Petersburg, 2017. [3]: Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. [4]: https://github.com/huggingface/transformers/tree/master/examples/research_projects/distillation | Indicators looking for configurations to recommend AI models for configuring AI agents
### Model Name : DeepPavlov/distilrubert-base-cased-conversational
### Model URL : https://huggingface.co/DeepPavlov/distilrubert-base-cased-conversational
### Model Description : Conversational DistilRuBERT (Russian, cased, 6‑layer, 768‑hidden, 12‑heads, 135.4M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and a Social Media segment of Taiga corpus[2] (as Conversational RuBERT). Our DistilRuBERT was highly inspired by [3], [4]. The model was trained for about 100 hrs. on 8 nVIDIA Tesla P100-SXM2.0 16Gb. To evaluate improvements in inference speed, we ran the teacher and student models on random sequences with seq_len=512, batch_size=16 (for throughput) and batch_size=1 (for latency).
All tests were performed on an Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz and an nVIDIA Tesla P100-SXM2.0 16Gb. If you find the model useful for your research, we kindly ask you to cite this paper: [1]: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016) [2]: Shavrina T., Shapovalova O. (2017) To the Methodology of Corpus Construction for Machine Learning: “Taiga” Syntax Tree Corpus and Parser. In Proceedings of “CORPORA2017”, International Conference, Saint-Petersburg, 2017. [3]: Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. [4]: https://github.com/huggingface/transformers/tree/master/examples/research_projects/distillation |
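A rough sketch of the latency/throughput measurement the card describes (random sequences, seq_len=512, batch sizes 1 and 16). This is our approximation under those stated settings, not the authors' benchmarking code:

```python
import time
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained("DeepPavlov/distilrubert-base-cased-conversational").eval()

def mean_forward_time(batch_size: int, seq_len: int = 512, n_runs: int = 10) -> float:
    # Random token ids stand in for real text, as in the card's setup.
    ids = torch.randint(0, model.config.vocab_size, (batch_size, seq_len))
    with torch.no_grad():
        start = time.perf_counter()
        for _ in range(n_runs):
            model(input_ids=ids)
    return (time.perf_counter() - start) / n_runs

print("latency, batch_size=1:", mean_forward_time(1))
print("throughput run, batch_size=16:", mean_forward_time(16))
```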
DeepPavlov/distilrubert-tiny-cased-conversational-v1 | https://huggingface.co/DeepPavlov/distilrubert-tiny-cased-conversational-v1 | Conversational DistilRuBERT-tiny (Russian, cased, 3‑layer, 264‑hidden, 12‑heads, 10.4M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and a Social Media segment of Taiga corpus[2] (as Conversational RuBERT). It can be considered a tiny copy of Conversational DistilRuBERT-small. Our DistilRuBERT-tiny is highly inspired by [3], [4], and its architecture is very close to [5]. Here is a comparison between the teacher model (Conversational RuBERT) and other distilled models. DistilRuBERT-tiny was trained for about 100 hrs. on 7 nVIDIA Tesla P100-SXM2.0 16Gb. We used PyTorchBenchmark from transformers to evaluate the model's performance and compare it with other pre-trained language models for Russian. All tests were performed on an Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz and an nVIDIA Tesla P100-SXM2.0 16Gb.
| Model name | Batch size | Seq len | Time CPU, s | Time GPU, s | Mem CPU, MB | Mem GPU, MB |
|---|---|---|---|---|---|---|
| rubert-base-cased-conversational | 1 | 512 | 0.147 | 0.014 | 897 | 1531 |
| distilrubert-base-cased-conversational | 1 | 512 | 0.083 | 0.006 | 766 | 1423 |
| distilrubert-small-cased-conversational | 1 | 512 | 0.03 | 0.002 | 600 | 1243 |
| cointegrated/rubert-tiny | 1 | 512 | 0.041 | 0.003 | 272 | 919 |
| distilrubert-tiny-cased-conversational | 1 | 512 | 0.023 | 0.003 | 206 | 855 |
| rubert-base-cased-conversational | 16 | 512 | 2.839 | 0.182 | 1499 | 2071 |
| distilrubert-base-cased-conversational | 16 | 512 | 1.065 | 0.055 | 2541 | 2927 |
| distilrubert-small-cased-conversational | 16 | 512 | 0.373 | 0.003 | 1360 | 1943 |
| cointegrated/rubert-tiny | 16 | 512 | 0.628 | 0.004 | 1293 | 2221 |
| distilrubert-tiny-cased-conversational | 16 | 512 | 0.219 | 0.003 | 633 | 1291 |
To evaluate model quality, we fine-tuned DistilRuBERT-tiny on classification (RuSentiment, ParaPhraser), NER, and question answering datasets for Russian and obtained scores very similar to those of Conversational DistilRuBERT-small. If you find the model useful for your research, we kindly ask you to cite this paper: [1]: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016) [2]: Shavrina T., Shapovalova O. (2017) To the Methodology of Corpus Construction for Machine Learning: “Taiga” Syntax Tree Corpus and Parser. In Proceedings of “CORPORA2017”, International Conference, Saint-Petersburg, 2017. [3]: Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. [4]: https://github.com/huggingface/transformers/tree/master/examples/research_projects/distillation [5]: https://habr.com/ru/post/562064/, https://huggingface.co/cointegrated/rubert-tiny | Indicators looking for configurations to recommend AI models for configuring AI agents
### Model Name : DeepPavlov/distilrubert-tiny-cased-conversational-v1
### Model URL : https://huggingface.co/DeepPavlov/distilrubert-tiny-cased-conversational-v1
### Model Description : Conversational DistilRuBERT-tiny (Russian, cased, 3‑layer, 264‑hidden, 12‑heads, 10.4M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and a Social Media segment of Taiga corpus[2] (as Conversational RuBERT). It can be considered a tiny copy of Conversational DistilRuBERT-small. Our DistilRuBERT-tiny is highly inspired by [3], [4], and its architecture is very close to [5]. Here is a comparison between the teacher model (Conversational RuBERT) and other distilled models. DistilRuBERT-tiny was trained for about 100 hrs. on 7 nVIDIA Tesla P100-SXM2.0 16Gb. We used PyTorchBenchmark from transformers to evaluate the model's performance and compare it with other pre-trained language models for Russian. All tests were performed on an Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz and an nVIDIA Tesla P100-SXM2.0 16Gb.
| Model name | Batch size | Seq len | Time CPU, s | Time GPU, s | Mem CPU, MB | Mem GPU, MB |
|---|---|---|---|---|---|---|
| rubert-base-cased-conversational | 1 | 512 | 0.147 | 0.014 | 897 | 1531 |
| distilrubert-base-cased-conversational | 1 | 512 | 0.083 | 0.006 | 766 | 1423 |
| distilrubert-small-cased-conversational | 1 | 512 | 0.03 | 0.002 | 600 | 1243 |
| cointegrated/rubert-tiny | 1 | 512 | 0.041 | 0.003 | 272 | 919 |
| distilrubert-tiny-cased-conversational | 1 | 512 | 0.023 | 0.003 | 206 | 855 |
| rubert-base-cased-conversational | 16 | 512 | 2.839 | 0.182 | 1499 | 2071 |
| distilrubert-base-cased-conversational | 16 | 512 | 1.065 | 0.055 | 2541 | 2927 |
| distilrubert-small-cased-conversational | 16 | 512 | 0.373 | 0.003 | 1360 | 1943 |
| cointegrated/rubert-tiny | 16 | 512 | 0.628 | 0.004 | 1293 | 2221 |
| distilrubert-tiny-cased-conversational | 16 | 512 | 0.219 | 0.003 | 633 | 1291 |
To evaluate model quality, we fine-tuned DistilRuBERT-tiny on classification (RuSentiment, ParaPhraser), NER, and question answering datasets for Russian and obtained scores very similar to those of Conversational DistilRuBERT-small. If you find the model useful for your research, we kindly ask you to cite this paper: [1]: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016) [2]: Shavrina T., Shapovalova O. (2017) To the Methodology of Corpus Construction for Machine Learning: “Taiga” Syntax Tree Corpus and Parser. In Proceedings of “CORPORA2017”, International Conference, Saint-Petersburg, 2017. [3]: Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. [4]: https://github.com/huggingface/transformers/tree/master/examples/research_projects/distillation [5]: https://habr.com/ru/post/562064/, https://huggingface.co/cointegrated/rubert-tiny |
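The card says the numbers above came from PyTorchBenchmark in transformers; a sketch of that utility with the card's batch sizes and sequence length follows. Note that PyTorchBenchmark shipped with older transformers releases and has since been deprecated, so reproducing this may require pinning an older version:

```python
from transformers import PyTorchBenchmark, PyTorchBenchmarkArguments

# Batch sizes and sequence length match the settings reported in the card.
args = PyTorchBenchmarkArguments(
    models=["DeepPavlov/distilrubert-tiny-cased-conversational-v1"],
    batch_sizes=[1, 16],
    sequence_lengths=[512],
)
benchmark = PyTorchBenchmark(args)
results = benchmark.run()  # prints inference time and memory usage tables
```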
DeepPavlov/distilrubert-tiny-cased-conversational | https://huggingface.co/DeepPavlov/distilrubert-tiny-cased-conversational | WARNING: This is the distilrubert-small-cased-conversational model uploaded under the wrong name; it is identical to distilrubert-small-cased-conversational. The actual distilrubert-tiny-cased-conversational can be found at distilrubert-tiny-cased-conversational-v1. Conversational DistilRuBERT-small (Russian, cased, 2‑layer, 768‑hidden, 12‑heads, 107M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and a Social Media segment of Taiga corpus[2] (as Conversational RuBERT). It can be considered a small copy of Conversational DistilRuBERT-base. Our DistilRuBERT-small was highly inspired by [3], [4]. The model was trained for about 80 hrs. on 8 nVIDIA Tesla P100-SXM2.0 16Gb. To evaluate improvements in inference speed, we ran the teacher and student models on random sequences with seq_len=512, batch_size=16 (for throughput) and batch_size=1 (for latency).
All tests were performed on an Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz and an nVIDIA Tesla P100-SXM2.0 16Gb. To evaluate model quality, we fine-tuned DistilRuBERT-small on classification, NER, and question answering tasks. Scores and archives with fine-tuned models can be found in the DeepPavlov docs. If you find the model useful for your research, we kindly ask you to cite this paper: [1]: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016) [2]: Shavrina T., Shapovalova O. (2017) To the Methodology of Corpus Construction for Machine Learning: “Taiga” Syntax Tree Corpus and Parser. In Proceedings of “CORPORA2017”, International Conference, Saint-Petersburg, 2017. [3]: Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. [4]: https://github.com/huggingface/transformers/tree/master/examples/research_projects/distillation | Indicators looking for configurations to recommend AI models for configuring AI agents
### Model Name : DeepPavlov/distilrubert-tiny-cased-conversational
### Model URL : https://huggingface.co/DeepPavlov/distilrubert-tiny-cased-conversational
### Model Description : WARNING: This is the distilrubert-small-cased-conversational model uploaded under the wrong name; it is identical to distilrubert-small-cased-conversational. The actual distilrubert-tiny-cased-conversational can be found at distilrubert-tiny-cased-conversational-v1. Conversational DistilRuBERT-small (Russian, cased, 2‑layer, 768‑hidden, 12‑heads, 107M parameters) was trained on OpenSubtitles[1], Dirty, Pikabu, and a Social Media segment of Taiga corpus[2] (as Conversational RuBERT). It can be considered a small copy of Conversational DistilRuBERT-base. Our DistilRuBERT-small was highly inspired by [3], [4]. The model was trained for about 80 hrs. on 8 nVIDIA Tesla P100-SXM2.0 16Gb. To evaluate improvements in inference speed, we ran the teacher and student models on random sequences with seq_len=512, batch_size=16 (for throughput) and batch_size=1 (for latency).
All tests were performed on an Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz and an nVIDIA Tesla P100-SXM2.0 16Gb. To evaluate model quality, we fine-tuned DistilRuBERT-small on classification, NER, and question answering tasks. Scores and archives with fine-tuned models can be found in the DeepPavlov docs. If you find the model useful for your research, we kindly ask you to cite this paper: [1]: P. Lison and J. Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016) [2]: Shavrina T., Shapovalova O. (2017) To the Methodology of Corpus Construction for Machine Learning: “Taiga” Syntax Tree Corpus and Parser. In Proceedings of “CORPORA2017”, International Conference, Saint-Petersburg, 2017. [3]: Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. [4]: https://github.com/huggingface/transformers/tree/master/examples/research_projects/distillation
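Given the naming mixup, a quick sanity check (our suggestion, with the expected values taken from the architecture stated in the card) is to load the checkpoint and inspect its config:

```python
from transformers import AutoModel

# Despite "tiny" in the repo name, this checkpoint holds the *small*
# distilled model; the real tiny model lives at
# DeepPavlov/distilrubert-tiny-cased-conversational-v1.
model = AutoModel.from_pretrained("DeepPavlov/distilrubert-tiny-cased-conversational")
print(model.config.num_hidden_layers, model.config.hidden_size)  # expected per the card: 2 768
```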