inclusionAI/Ling-mini-2.0
#1377
opened by Poro7
Sure!
You can check for progress at http://hf.tst.eu/status.html or regularly check the model
summary page at https://hf.tst.eu/model#Ling-mini-2.0-GGUF for quants to appear.
Unfortunately, BailingMoeV2ForCausalLM is not supported by llama.cpp at this time.
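For anyone wanting to confirm this themselves, a minimal sketch: llama.cpp's converter decides support based on the architecture class declared in the model's config.json, so you can inspect that field directly. This assumes huggingface_hub is installed; the repo id is the one discussed in this thread.

```python
# Sketch: inspect which architecture class the model declares in config.json,
# which is what llama.cpp's convert_hf_to_gguf.py matches against its list of
# supported model classes before producing a GGUF file.
import json

from huggingface_hub import hf_hub_download

config_path = hf_hub_download("inclusionAI/Ling-mini-2.0", "config.json")
with open(config_path) as f:
    config = json.load(f)

# For this model the list includes "BailingMoeV2ForCausalLM", which the
# converter does not currently recognize, so GGUF conversion fails.
print(config.get("architectures"))
```

Once llama.cpp adds a mapping for this architecture, the same conversion path should start working and quants can be produced.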