How to use gyaudhw/gptbert-2025-strictsmall-mixed with Transformers:
Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline(
    "fill-mask",
    model="gyaudhw/gptbert-2025-strictsmall-mixed",
    trust_remote_code=True,
)
```
Or load the model directly:

```python
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained(
    "gyaudhw/gptbert-2025-strictsmall-mixed",
    trust_remote_code=True,
    dtype="auto",
)
```
This is an exported GPT-BERT checkpoint for the BabyLM 2025 strict-small track. This repository branch contains a single checkpoint, exported from local training for evaluation use.