BungeoLM

A tiny Korean language model (bungeo, 붕어, is Korean for crucian carp), exported in the Hugging Face custom-model format. Loading it requires trust_remote_code=True.

Architecture

  • architectures: BungeoForCausalLM
  • model_type: bungeo
  • vocab_size: 4096
  • max_position_embeddings: 128
  • hidden_size: 384
  • num_hidden_layers: 6
  • num_attention_heads: 6
  • intermediate_size: 768
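
The config above roughly accounts for the advertised parameter count. A back-of-envelope sketch, assuming tied input/output embeddings, no biases, and a standard two-matrix MLP (the actual BungeoForCausalLM internals may differ slightly, e.g. positional embeddings and norms add a little more):

```python
# Rough parameter count from the config values above.
# Assumptions (not from the card): tied embeddings, no biases, 2-matrix MLP.
vocab_size = 4096
hidden = 384
layers = 6
intermediate = 768

embed = vocab_size * hidden                  # token embeddings (tied with LM head)
attn_per_layer = 4 * hidden * hidden         # Q, K, V, and output projections
mlp_per_layer = 2 * hidden * intermediate    # up- and down-projections
total = embed + layers * (attn_per_layer + mlp_per_layer)
print(f"{total / 1e6:.2f}M")  # → 8.65M, close to the reported 8.73M
```

The small remainder up to 8.73M is consistent with layer norms and the 128-position embedding table.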

Load

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("drlee1/Bungeo-8.7M", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("drlee1/Bungeo-8.7M", trust_remote_code=True, use_fast=False)
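
Once loaded, generation follows the standard transformers generate() API. A minimal sketch; the Korean prompt and sampling settings are illustrative assumptions, not from the original card:

```python
# Minimal generation sketch; the prompt and sampling settings below are
# illustrative assumptions, not part of the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("drlee1/Bungeo-8.7M", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("drlee1/Bungeo-8.7M", trust_remote_code=True, use_fast=False)

inputs = tokenizer("붕어빵은", return_tensors="pt")  # "Bungeo-ppang is..." (illustrative)
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.9)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Note that max_position_embeddings is 128, so prompt plus generated tokens must stay within that window.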

Inspiration

  • This project was inspired by guppylm.
Weights

  • format: Safetensors
  • model size: 8.73M params
  • tensor type: BF16