Llama-3.2-1B-Instruct with domain-adaptive pretraining (DAPT), also called continued pre-training (CPT), on a Dutch medical corpus slightly biased towards cardiology.

Trained for one full epoch with a batch size of 256, a maximum sequence length of 768 tokens, and a linear-cosine learning-rate schedule (details to follow).
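
The exact training configuration has not been published yet; below is a hypothetical sketch of how the stated settings could map onto Hugging Face `TrainingArguments`. Only the epoch count, the effective batch size of 256, the sequence length, and the schedule type come from this card; the per-device split, warm-up ratio, and output path are assumptions.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="cardiollama-nl-cpt",   # assumed path
    num_train_epochs=1,                # one full epoch
    per_device_train_batch_size=32,    # assumed split; effective batch = 32 * 8 = 256
    gradient_accumulation_steps=8,
    lr_scheduler_type="cosine",        # cosine decay, here assumed with linear warm-up
    warmup_ratio=0.03,                 # assumed warm-up ratio
    bf16=True,                         # matches the BF16 checkpoint
)
# The maximum sequence length (768 tokens) is applied at tokenization time, not here.
```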

This model will be further pre-trained on 5 million cardiology records from the UMCU.

Perplexity on the validation set was approximately 5.
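
For reference, perplexity is the exponential of the mean token-level cross-entropy loss, so this corresponds to a validation loss of roughly 1.61 nats per token:

```python
import math

# Perplexity = exp(mean cross-entropy loss per token);
# a loss of ln(5) ≈ 1.609 nats/token gives a perplexity of 5.
validation_loss = 1.609
print(math.exp(validation_loss))  # ≈ 5.0
```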

Note: this model is not instruction-tuned and does not generate an EOS token, so generation will not stop on its own; an update is coming.
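
Until then, bound generation explicitly, e.g. with `max_new_tokens`. A minimal usage sketch with the `transformers` library (the Dutch prompt is illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "UMCU/CardioLlama.nl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# The model does not emit an EOS token, so generation must be bounded explicitly.
prompt = "De patiënt presenteerde zich met pijn op de borst en"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```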

Model size: 1B parameters (Safetensors, BF16).