
BERT_NLU_240806_NLU_1

Overview

This model is a fine-tuned version of bert-base-uncased, trained on the cntc-brkr/240806_NLU_1 dataset. It is designed for Natural Language Understanding (NLU) tasks, specifically intent classification.

Model Details

  • Model Type: BERT (NLU)
  • Language: English
  • Training Data: cntc-brkr/240806_NLU_1
  • Input: Text
  • Output: Intent label (one of 9 classes)
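
A minimal way to try the model is the transformers text-classification pipeline. This is a sketch, not the card's official usage: the repo id below is an assumption based on the card's title, and it requires the transformers and torch packages plus network access for the first download.

```python
from transformers import pipeline

# Hypothetical repo id, inferred from the card's title; substitute the
# actual Hugging Face Hub id if it differs.
MODEL_ID = "cntc-brkr/BERT_NLU_240806_NLU_1"

def build_classifier(model_id: str = MODEL_ID):
    """Return a text-classification pipeline for the fine-tuned checkpoint.

    Downloads the weights from the Hub on first call.
    """
    return pipeline("text-classification", model=model_id)

# Usage (requires network access the first time):
# clf = build_classifier()
# clf("hover over the target area and report")
```

The pipeline returns the predicted label name and a confidence score, provided the checkpoint's config stores the id2label mapping.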

Training Parameters

  • Maximum Sequence Length: 200
  • Training Batch Size: 8
  • Validation Batch Size: 4
  • Number of Epochs: 10
  • Learning Rate: 1e-05
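
The hyperparameters above can be collected into a transformers TrainingArguments sketch. The output directory is a placeholder, and anything the card does not state (optimizer, scheduler, weight decay, seed) is left at library defaults.

```python
from transformers import TrainingArguments

# Values taken from the card's Training Parameters section; everything
# else is assumed to be a library default.
training_args = TrainingArguments(
    output_dir="bert_nlu_240806_nlu_1",   # placeholder directory name
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    num_train_epochs=10,
    learning_rate=1e-5,
)

# The maximum sequence length (200) is applied at tokenization time, e.g.:
# tokenizer(texts, truncation=True, max_length=200)
```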

Dataset Information

  • Dataset Name: cntc-brkr/240806_NLU_1
  • Task: Intent Classification
  • Number of Classes: 9

Label Classes

The model was trained to classify the following intents:

  • 0: land
  • 1: take_off
  • 2: overwatch
  • 3: search
  • 4: transit
  • 5: detection
  • 6: stop
  • 7: monitor
  • 8: out_of_scope
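
The id-to-intent mapping above can be written as a plain Python dictionary, which is useful for decoding raw class indices from the model's logits. The label names are taken directly from the list above; the fallback behavior for unknown indices is a design choice, not part of the card.

```python
# Mapping between class indices and intent names, as listed on this card.
ID2LABEL = {
    0: "land",
    1: "take_off",
    2: "overwatch",
    3: "search",
    4: "transit",
    5: "detection",
    6: "stop",
    7: "monitor",
    8: "out_of_scope",
}
LABEL2ID = {label: idx for idx, label in ID2LABEL.items()}

def decode(class_index: int) -> str:
    """Map a predicted class index to its intent name.

    Falls back to "out_of_scope" for indices outside 0-8 (an assumption,
    chosen because the model already reserves that class for unknowns).
    """
    return ID2LABEL.get(class_index, "out_of_scope")

print(decode(1))  # take_off
```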

Weights and Biases Run Information

  • Run information is not available.
Model Format

  • Format: Safetensors
  • Model size: 109M parameters
  • Tensor type: F32