Instructions for using eswardivi/llamathon with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- PEFT
How to use eswardivi/llamathon with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then apply the llamathon adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "eswardivi/llamathon")
```
- Notebooks
- Google Colab
- Kaggle
This model is fine-tuned on microsoft/orca-math-word-problems-200k using MonsterAPI's No Code Finetuning Tool.