# bert-base-uncased-ag-news

## Model description
`bert-base-uncased` fine-tuned on the AG News dataset using PyTorch Lightning. Training configuration: sequence length 128, learning rate 2e-5, batch size 32, 4 epochs, on 4 T4 GPUs. The code can be found here.
## Limitations and bias
- Not the best model...
## Training data
The data came from Hugging Face's `datasets` package. The dataset can be viewed on the nlp viewer.
## Training procedure
...
## Eval results
...