Instructions for using Bachhoang/continual-pretraining-checkpoint with libraries, inference providers, notebooks, and local apps.
- Libraries
- PEFT
How to use Bachhoang/continual-pretraining-checkpoint with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("Bachhoang/vbd-llama2-7B-legals-chat")
model = PeftModel.from_pretrained(base_model, "Bachhoang/continual-pretraining-checkpoint")
```

- Notebooks
- Google Colab
- Kaggle
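Once the adapter is attached, the combined model can be used for generation like any `transformers` causal LM. The sketch below is an assumption about typical usage, not part of the model card: it presumes the base model ships a compatible tokenizer, that you have enough memory for a 7B model, and the prompt text is purely illustrative. Running it downloads the model weights from the Hub.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed: the base repo also hosts the tokenizer used at training time.
tokenizer = AutoTokenizer.from_pretrained("Bachhoang/vbd-llama2-7B-legals-chat")
base_model = AutoModelForCausalLM.from_pretrained(
    "Bachhoang/vbd-llama2-7B-legals-chat",
    torch_dtype=torch.float16,  # half precision to reduce memory; adjust to your hardware
    device_map="auto",
)
model = PeftModel.from_pretrained(base_model, "Bachhoang/continual-pretraining-checkpoint")
model.eval()

# Illustrative prompt; replace with your own input.
inputs = tokenizer("Xin chào,", return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If you want a standalone model without the PEFT wrapper, `model.merge_and_unload()` folds the adapter weights into the base model before saving.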