
Model Description

bert-base-german-cased_cimt-location is a fine-tuned BERT model built to predict location phrases using the B (beginning, LABEL_2) / I (inside, LABEL_1) / O (outside, LABEL_0) labeling scheme.

Specifically, this model is a bert-base-german-cased model fine-tuned on the CIMT geographic location dataset (https://github.com/juliaromberg/cimt-geographic-location-dataset).

Background

This work is based on research in the CIMT project, which investigates the opportunities and challenges of involving citizens in political decision-making in the context of sustainable mobility transitions (for more information, visit https://www.cimt-hhu.de/en/).

Details & Evaluation Results

Details and evaluation results can be found in the corresponding publication: https://www.cimt-hhu.de/wp-content/uploads/2023/11/Padjman_Projektarbeitsbericht.pdf.

Usage

from transformers import BertForTokenClassification, BertTokenizer
tokenizer = BertTokenizer.from_pretrained("juliaromberg/bert-base-german-cased_cimt-location")
model = BertForTokenClassification.from_pretrained("juliaromberg/bert-base-german-cased_cimt-location")
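A minimal inference sketch follows, showing how the predicted label ids can be mapped back to the B/I/O tags described above. The example sentence is invented for illustration, and the mapping assumes the model's id2label configuration uses the default LABEL_0/1/2 names mentioned in the Model Description.

import torch
from transformers import BertForTokenClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("juliaromberg/bert-base-german-cased_cimt-location")
model = BertForTokenClassification.from_pretrained("juliaromberg/bert-base-german-cased_cimt-location")

# Hypothetical example input (German citizen proposal on mobility)
text = "Die Radwege an der Heinrich-Heine-Allee sollten verbreitert werden."

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumed mapping from the default label names to the B-I-O schema
schema = {"LABEL_0": "O", "LABEL_1": "I", "LABEL_2": "B"}

predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    label = model.config.id2label[pred.item()]
    print(token, schema.get(label, label))

Tokens tagged B mark the start of a location phrase and tokens tagged I its continuation; contiguous B/I spans can then be merged to recover the full location phrase.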

Citation

If you use this model, please cite the corresponding publication: https://www.cimt-hhu.de/wp-content/uploads/2023/11/Padjman_Projektarbeitsbericht.pdf
