
Badllama-3-70B

This repo holds the weights for Palisade Research's showcase of how open-weight model guardrails can be stripped in minutes of GPU time. See the Badllama 3 paper for additional background.

Access

Email the authors to request research access. We do not review access requests made on HuggingFace.

Branches

  • main mirrors qlora_awq and holds our best-performing model built with QLoRA
  • qlora holds the unquantized QLoRA-tuned model
  • reft holds the ReFT-tuned model
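Each branch above is a git revision on the Hugging Face Hub, which serves files at `https://huggingface.co/<repo>/resolve/<revision>/<path>`. A minimal stdlib-only sketch of that URL scheme (actually fetching the files still requires approved access and an auth token; loaders like `transformers`' `from_pretrained` accept the same branch name via their `revision` argument):

```python
# Sketch: build per-branch file URLs for this repo using the Hub's
# resolve-URL pattern. No network access or authentication here.
REPO = "palisaderesearch/Badllama-3-70B"

def file_url(revision: str, path: str) -> str:
    # <revision> is a branch, tag, or commit hash, e.g. "main", "qlora", "reft"
    return f"https://huggingface.co/{REPO}/resolve/{revision}/{path}"

print(file_url("qlora", "config.json"))
# https://huggingface.co/palisaderesearch/Badllama-3-70B/resolve/qlora/config.json
```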
