
Bert2Bert (Encoder-Decoder) on Liputan6 100k dataset

Dataset source: https://huggingface.co/datasets/fajrikoto/id_liputan6
Base model used for fine-tuning (both encoder and decoder):
https://huggingface.co/indolem/indobert-base-uncased

Trained on 1x RTX 3090 for 8 epochs (with an EarlyStopping callback)

Train logs, metrics, and params:
https://wandb.ai/willy030125/huggingface/runs/2qk3jtic
https://www.comet.com/willy030125/huggingface/560ed6ccde1240c8b4401918fd27253a
Eval results and perplexity: see eval_results.json

Usage:

from transformers import AutoTokenizer, EncoderDecoderModel
tokenizer = AutoTokenizer.from_pretrained("Willy030125/Bert2Bert_Liputan6_100k_10epoch_IndoBERT")
model = EncoderDecoderModel.from_pretrained("Willy030125/Bert2Bert_Liputan6_100k_10epoch_IndoBERT")
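
A fuller summarization sketch building on the snippet above. The generation settings (max_length, num_beams) and the placeholder article are illustrative assumptions, not values taken from the model card:

```python
from transformers import AutoTokenizer, EncoderDecoderModel

model_id = "Willy030125/Bert2Bert_Liputan6_100k_10epoch_IndoBERT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = EncoderDecoderModel.from_pretrained(model_id)

# Placeholder input; replace with a real Indonesian news article
article = "Liputan6.com, Jakarta: Ganti teks ini dengan artikel berita berbahasa Indonesia."

# Tokenize, truncating to the BERT encoder's 512-token limit
inputs = tokenizer(article, max_length=512, truncation=True, return_tensors="pt")

# Beam-search decoding; assumes the checkpoint's generation config
# (decoder_start_token_id, etc.) was set during fine-tuning
summary_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_length=128,
    num_beams=5,
    early_stopping=True,
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```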
Model size: 250M params (F32, safetensors)