
HelpingAI-180B-base

Description

The HelpingAI-180B-base model is a large-scale artificial intelligence model developed to assist in various natural language processing tasks. Trained on a diverse range of data sources, this model is designed to generate text, facilitate language understanding, and support various downstream tasks.

Model Information

  • Model size: 176 billion parameters (BF16 precision, Safetensors format)
  • Training data: Diverse datasets covering a wide range of topics and domains.
  • Training objective: Language modeling with an emphasis on understanding and generating human-like text.
  • Tokenizer: Gemma tokenizer

Intended Use

The HelpingAI-180B-base model is intended for researchers, developers, and practitioners in the field of natural language processing (NLP). It can be used for a variety of tasks, including but not limited to:

  • Text generation
  • Language understanding
  • Text summarization
  • Dialogue generation

This model is released for research purposes.
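For the tasks above, a base causal language model of this kind is typically loaded with the Hugging Face transformers library. The sketch below is a hypothetical example: the repo id `HelpingAI-180B-base` is a placeholder, and it assumes the checkpoint is compatible with `AutoModelForCausalLM`; neither is confirmed by this card. Note that a 176B-parameter model requires substantial GPU memory (hundreds of GB even in BF16), so `device_map="auto"` is used to shard it across available devices.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id (assumption, not stated on this card).
model_id = "HelpingAI-180B-base"

# The card states the model uses the Gemma tokenizer; loading via
# AutoTokenizer from the same repo is the usual pattern.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# BF16 matches the tensor type listed on the card; device_map="auto"
# shards the weights across available GPUs/CPU.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

As a base (non-instruction-tuned) model, it is best prompted with text to continue rather than with chat-style instructions.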