How to use tmnam20/bert-base-multilingual-cased-cola-10 with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="tmnam20/bert-base-multilingual-cased-cola-10")

# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("tmnam20/bert-base-multilingual-cased-cola-10")
model = AutoModelForSequenceClassification.from_pretrained("tmnam20/bert-base-multilingual-cased-cola-10")

This model is a fine-tuned version of bert-base-multilingual-cased on the tmnam20/VieGLUE/COLA dataset. It achieves the evaluation-set results reported in the training results table below.
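As a quick usage sketch (not part of the original card), the pipeline above can be called on a Vietnamese sentence to score its linguistic acceptability; the example sentence and the LABEL_0/LABEL_1 reading are assumptions and should be checked against the model's id2label mapping:

```python
# Illustrative only: the input sentence and label interpretation are assumptions,
# not taken from the model card.
from transformers import pipeline

pipe = pipeline("text-classification", model="tmnam20/bert-base-multilingual-cased-cola-10")

print(pipe("Tôi đang học tiếng Việt."))
# Output has the form [{'label': 'LABEL_0' or 'LABEL_1', 'score': <float>}].
# Under the usual CoLA convention, label 1 would mean "linguistically acceptable".
```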
Model description
More information needed

Intended uses & limitations
More information needed

Training and evaluation data
More information needed
Training procedure
The following hyperparameters were used during training:

Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|---|---|---|---|---|
| 0.5762 | 1.87 | 500 | 0.6181 | 0.0372 |
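For context on the metric column above, the Matthews correlation can be computed with scikit-learn; the label arrays in this sketch are made-up placeholders, not the actual CoLA evaluation data:

```python
# Sketch of how a Matthews correlation coefficient like the one reported above is computed.
# y_true / y_pred are hypothetical values, not the real evaluation predictions.
from sklearn.metrics import matthews_corrcoef

y_true = [1, 0, 1, 1, 0, 1]   # gold acceptability labels (hypothetical)
y_pred = [1, 0, 0, 1, 0, 1]   # model predictions (hypothetical)

print(matthews_corrcoef(y_true, y_pred))
```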
Base model
google-bert/bert-base-multilingual-cased
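As an illustration of how a checkpoint like this is typically produced from the base model (not the card's actual training script), a sequence-classification head can be initialized on top of it; num_labels=2 follows the usual CoLA acceptable/unacceptable setup and is an assumption here:

```python
# Hypothetical starting point for fine-tuning on a CoLA-style task.
# num_labels=2 is assumed, not confirmed by the card.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

base = "google-bert/bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)
```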