This page describes how to use text-generation-inference/Mistral-7B-Instruct-v0.2-medusa with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use text-generation-inference/Mistral-7B-Instruct-v0.2-medusa with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "text-generation-inference/Mistral-7B-Instruct-v0.2-medusa",
    dtype="auto",
)
```
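The snippet above only instantiates the bare backbone via AutoModel. A fuller generation sketch is shown below; it is not taken from the model card and assumes the repository can be loaded as a standard Mistral-7B-Instruct-v0.2 causal LM with a chat template. Medusa-head checkpoints are primarily intended for text-generation-inference, so direct loading may instead require the base model weights.

```python
# Hedged sketch: chat-style generation with Transformers.
# Assumes the repo loads as a standard causal LM with a chat template;
# a Medusa-head checkpoint may instead require the base
# mistralai/Mistral-7B-Instruct-v0.2 weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "text-generation-inference/Mistral-7B-Instruct-v0.2-medusa"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    dtype="auto",        # older transformers versions use torch_dtype="auto"
    device_map="auto",   # requires the accelerate package
)

messages = [{"role": "user", "content": "Explain speculative decoding in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```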
- Notebooks
- Google Colab
- Kaggle
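Because this repository packages Medusa speculative-decoding heads for text-generation-inference (TGI), the more common path is to serve the model locally with TGI and query the endpoint. Below is a minimal sketch assuming a TGI server has already been launched with this model and is reachable at http://localhost:8080; the URL, prompt format, and parameters are placeholders to adapt.

```python
# Hedged sketch: query a locally hosted text-generation-inference (TGI)
# endpoint that serves this model. The URL is an assumption; point it at
# wherever your TGI server is listening.
from huggingface_hub import InferenceClient

client = InferenceClient("http://localhost:8080")

# Mistral-7B-Instruct-v0.2 uses the [INST] ... [/INST] instruction format.
prompt = "[INST] Explain speculative decoding in one sentence. [/INST]"
print(client.text_generation(prompt, max_new_tokens=64))
```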
Downloads last month: 7
Inference Providers
This model isn't deployed by any Inference Provider.
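No Inference Provider serves this checkpoint at the moment, so the call below would fail today; it is only a hypothetical sketch of the standard huggingface_hub client call that would apply if a provider picked the model up.

```python
# Hypothetical sketch: NOT currently functional, since no Inference Provider
# deploys this model. Shown only to illustrate the standard client call.
from huggingface_hub import InferenceClient

client = InferenceClient()
completion = client.chat_completion(
    model="text-generation-inference/Mistral-7B-Instruct-v0.2-medusa",
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=64,
)
print(completion.choices[0].message.content)
```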