# bert-unformatted-network-data-test-6-types
This model is a fine-tuned version of [FacebookAI/roberta-large](https://huggingface.co/FacebookAI/roberta-large) on unformatted network-flow records from the CIC-DDoS2019 dataset. It achieves the following results on the evaluation set:
- Loss: 0.1318
- F1: 0.9624
Full label names:
- label_0 = UDP-lag DDoS
- label_1 = benign
- label_2 = SYN flood
- label_3 = NetBIOS
- label_4 = MSSQL
- label_5 = LDAP

Example inputs cover the following traffic categories:
- Benign traffic from the training data
- Benign traffic from outside the training data
- Malicious UDP-lag DDoS attack from the training data
- Malicious UDP-lag DDoS attack from outside the training data
- Malicious SYN flood attack from the training data
- Malicious SYN flood attack from outside the training data
- Malicious NetBIOS DDoS attack from the training data
- Malicious NetBIOS DDoS attack from outside the training data
- Malicious MSSQL DDoS attack from the training data
- Malicious MSSQL DDoS attack from outside the training data
- Malicious LDAP DDoS attack from the training data
- Malicious LDAP DDoS attack from outside the training data
The examples come from CIC-DDoS2019, formatted for model training; see the accompanying Colab notebook: https://colab.research.google.com/drive/1PmLep9D3NfMhYsX0soTBhfVXFkawGgGx?authuser=0#scrollTo=ReaH6NCljdsn. A minimal inference sketch follows below.
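A minimal inference sketch, assuming the checkpoint is available as `Jios/bert-unformatted-network-data-test-6-types` and that inputs are raw CIC-DDoS2019 flow records serialized as plain text the same way as during training (the exact serialization is not documented in this card):

```python
from transformers import pipeline

# Label mapping taken from the list above; the checkpoint's own config.json
# (id2label) should be treated as the source of truth.
ID2LABEL = {
    "LABEL_0": "UDP-lag DDoS",
    "LABEL_1": "benign",
    "LABEL_2": "SYN flood",
    "LABEL_3": "NetBIOS",
    "LABEL_4": "MSSQL",
    "LABEL_5": "LDAP",
}

classifier = pipeline(
    "text-classification",
    model="Jios/bert-unformatted-network-data-test-6-types",
)

# Placeholder input: one unformatted flow record rendered as free text.
flow_record = "..."  # replace with a record serialized like the training examples
prediction = classifier(flow_record)[0]
print(ID2LABEL.get(prediction["label"], prediction["label"]), prediction["score"])
```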
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
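A hedged sketch of how these settings map onto `transformers.TrainingArguments`; the actual training script is not part of this card, so everything beyond the listed hyperparameters is an assumption:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above. The Adam betas/epsilon and the
# linear schedule match the TrainingArguments defaults, so no extra flags
# are needed for them.
training_args = TrainingArguments(
    output_dir="bert-unformatted-network-data-test-6-types",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```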
### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.1427        | 1.0   | 2250  | 0.1279          | 0.9622 |
| 0.1348        | 2.0   | 4500  | 0.1517          | 0.9624 |
| 0.1331        | 3.0   | 6750  | 0.1467          | 0.9613 |
| 0.1407        | 4.0   | 9000  | 0.1294          | 0.9623 |
| 0.1229        | 5.0   | 11250 | 0.1318          | 0.9624 |
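For reference, a sketch of a `compute_metrics` function that would produce the F1 column during evaluation; the averaging mode is not stated in the card, so weighted averaging is assumed:

```python
import numpy as np
import evaluate

f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    # Convert logits to class predictions and score them against the labels.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return f1_metric.compute(predictions=predictions, references=labels, average="weighted")
```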
### Framework versions
- Transformers 4.42.0.dev0
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1