This is a domain-adaptive pretrained (DAPT) Llama model, built for experimental purposes.
You can use the modeling files from this GitHub repo.
- Model Size: 52,177,152 parameters
- Vocab Size: 32,768
- Context Length: 512
- Embedding Dimension: 256
- Attention Heads: 128
- KV Groups: 64
- Hidden Dimension: 2048
- Number of Layers: 20
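If the custom modeling files follow a `transformers`-style Llama configuration, the hyperparameters above map roughly as sketched below. This is an assumption for illustration; the field names and structure in the actual repo may differ.

```python
# Sketch: mapping the listed hyperparameters onto a transformers-style LlamaConfig.
# The real modeling files live in the linked GitHub repo and may use different
# field names; treat this as an assumption, not the repo's actual config.
from transformers import LlamaConfig

config = LlamaConfig(
    vocab_size=32_768,            # Vocab Size
    hidden_size=256,              # Embedding Dimension
    intermediate_size=2048,       # Hidden Dimension (MLP)
    num_hidden_layers=20,         # Number of Layers
    num_attention_heads=128,      # Attention Heads
    num_key_value_heads=64,       # KV Groups (grouped-query attention)
    max_position_embeddings=512,  # Context Length
)
```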
The original pretrained model was trained on 1/4 of aliarda/turkish-news-1.8M-tokenized. This model was then trained on 80% of alibayram/hepsiburada_yorumlar with the goal of adapting it to another domain.
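A minimal sketch of what that domain-adaptation stage could look like, assuming the checkpoint loads through `transformers` Auto classes and the review dataset exposes a `"text"` column; the checkpoint path, column name, and training arguments are assumptions, since the actual training setup is not documented on this card.

```python
# Sketch: continue causal-LM training on the review corpus after the news-domain
# pretraining stage. Paths, column names, and hyperparameters are illustrative only.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("path/to/news-pretrained-model")   # assumed path
model = AutoModelForCausalLM.from_pretrained("path/to/news-pretrained-model")

# 80% of the review corpus, as described above.
reviews = load_dataset("alibayram/hepsiburada_yorumlar", split="train[:80%]")

def tokenize(batch):
    # "text" is an assumed column name for the review body.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = reviews.map(tokenize, batched=True, remove_columns=reviews.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dapt-reviews",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```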