This is a Gemma model uploaded with the KerasHub library. It can be run on the JAX, TensorFlow, or PyTorch backend and is intended for the CausalLM task.
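A minimal loading sketch, assuming the `keras-hub` package and one of the supported backends are installed; the `hf://` preset handle simply mirrors this repository's Hugging Face path:

```python
import keras_hub

# Load the fine-tuned checkpoint directly from the Hugging Face Hub.
gemma_lm = keras_hub.models.GemmaCausalLM.from_preset(
    "hf://harishnair04/gemma_instruct_medtr_2b"
)

# Generate a completion with a capped output length.
print(gemma_lm.generate("What is hypertension?", max_length=128))
```

The backend is selected before import via the `KERAS_BACKEND` environment variable (e.g. `jax`, `tensorflow`, or `torch`).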

Model config:

  • name: gemma_backbone
  • trainable: True
  • vocabulary_size: 256000
  • num_layers: 18
  • num_query_heads: 8
  • num_key_value_heads: 1
  • hidden_dim: 2048
  • intermediate_dim: 32768
  • head_dim: 256
  • layer_norm_epsilon: 1e-06
  • dropout: 0
  • query_head_dim_normalize: True
  • use_post_ffw_norm: False
  • use_post_attention_norm: False
  • final_logit_soft_cap: None
  • attention_logit_soft_cap: None
  • sliding_window_size: 4096
  • use_sliding_window_attention: False
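The attention settings above describe a multi-query layout: all 8 query heads share a single key/value head, and the query heads together span exactly the hidden dimension. A small sketch of the shapes implied by the config (values copied from the list above):

```python
# Config values copied from the model card above.
cfg = {
    "num_query_heads": 8,
    "num_key_value_heads": 1,
    "hidden_dim": 2048,
    "intermediate_dim": 32768,
    "head_dim": 256,
}

# Multi-query attention: 8 query heads attend over one shared KV head.
q_proj_dim = cfg["num_query_heads"] * cfg["head_dim"]       # 8 * 256 = 2048
kv_proj_dim = cfg["num_key_value_heads"] * cfg["head_dim"]  # 1 * 256 = 256

# The query projection spans the full hidden dimension.
assert q_proj_dim == cfg["hidden_dim"]

# Feed-forward expansion factor relative to the hidden size.
ffw_ratio = cfg["intermediate_dim"] // cfg["hidden_dim"]    # 16

print(q_proj_dim, kv_proj_dim, ffw_ratio)
```

The shared KV head keeps the key/value cache small (256 of 2048 dims per layer), which is what makes `num_key_value_heads: 1` attractive for inference.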
Model tree for harishnair04/gemma_instruct_medtr_2b

  • Base model: google/gemma-2-2b (this model is a fine-tune of it)
Dataset used to train harishnair04/gemma_instruct_medtr_2b
