aliarda/llama-50M-DAPT-Hepsiburada

Tags: Safetensors, Turkish

This repository is publicly accessible, but you must accept its access conditions on Hugging Face before you can download the files.

This is a domain-adaptive pretrained (DAPT) Llama model, made for experimental purposes.

You can use the modeling files from this GitHub repo.
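Because the repository is gated, the weights have to be fetched with an authenticated Hugging Face client. A minimal sketch using huggingface_hub; the token handling is an assumption for illustration, not part of the model card:

```python
# Sketch: download the gated safetensors weights after accepting the access
# conditions on the model page, then pair them with the repo's modeling files.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    "aliarda/llama-50M-DAPT-Hepsiburada",
    token="hf_...",  # a token from an account that accepted the conditions,
                     # or omit this after running `huggingface-cli login`
)
print(local_dir)  # path to the downloaded config and safetensors files
```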

  • Model Size: 52,177,152 parameters
  • Vocab Size: 32,768
  • Context Length: 512 tokens
  • Embedding Dimension: 256
  • Attention Heads: 128
  • KV Groups: 64
  • Hidden Dimension: 2048
  • Number of Layers: 20
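For illustration, the hyperparameters above map directly onto a standard Llama configuration. This is a minimal sketch using the Hugging Face transformers LlamaConfig (an assumption; the card points to its own GitHub modeling files instead). With untied input and output embeddings it reproduces the stated 52,177,152 parameters exactly:

```python
# Sketch: the listed hyperparameters expressed as a Hugging Face LlamaConfig.
# This is an illustration, not the repo's own modeling code.
from transformers import LlamaConfig, LlamaForCausalLM

config = LlamaConfig(
    vocab_size=32_768,            # Vocab Size
    hidden_size=256,              # Embedding Dimension
    intermediate_size=2_048,      # Hidden Dimension (feed-forward)
    num_hidden_layers=20,         # Number of Layers
    num_attention_heads=128,      # Attention Heads (head_dim = 256 / 128 = 2)
    num_key_value_heads=64,       # KV Groups (grouped-query attention)
    max_position_embeddings=512,  # Context Length
    tie_word_embeddings=False,    # separate input/output embedding matrices
)

model = LlamaForCausalLM(config)
print(f"{sum(p.numel() for p in model.parameters()):,}")  # 52,177,152
```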

The original pretrained model was trained on 1/4 of aliarda/turkish-news-1.8M-tokenized. This model was then trained on 80% of alibayram/hepsiburada_yorumlar (Turkish product reviews from the e-commerce site Hepsiburada) with the goal of adapting it to another domain.
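The adaptation step itself is plain continued pretraining: keep minimizing the causal-LM loss of the news-pretrained checkpoint, only now on review text. A hedged sketch assuming the transformers/datasets stack; the base checkpoint path, the dataset's text column, and all hyperparameters are placeholders, not the author's actual training script:

```python
# Sketch of domain-adaptive pretraining (DAPT): continue causal-LM training
# of the news-pretrained checkpoint on the Hepsiburada review corpus.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE = "path/to/news-pretrained-checkpoint"  # placeholder for the original model
tokenizer = AutoTokenizer.from_pretrained(BASE)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for padding below
model = AutoModelForCausalLM.from_pretrained(BASE)

# Per the description above, 80% of the review corpus is used for training.
reviews = load_dataset("alibayram/hepsiburada_yorumlar", split="train")
reviews = reviews.train_test_split(test_size=0.2, seed=42)["train"]

def tokenize(batch):
    # "text" is an assumed column name; check the dataset card for the real one.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = reviews.map(tokenize, batched=True, remove_columns=reviews.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama-50M-DAPT-Hepsiburada"),
    train_dataset=tokenized,
    # mlm=False => plain next-token (causal) language modeling
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```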

