---
datasets:
- SkunkworksAI/Mistralic-7B-1
language:
- en
pipeline_tag: text-generation
license: apache-2.0
---

This repo includes `.gguf` files built for HuggingFace/Candle. They will not work with `llama.cpp`.

This model should be used with the `Config` [`config_chat_ml`](https://github.com/huggingface/candle/blob/main/candle-transformers/src/models/mistral.rs).

Refer to the [original repo](https://huggingface.co/SkunkworksAI/Mistralic-7B-1) for more details.
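As a minimal sketch of the config step above (assuming the `candle-transformers` crate is in your `Cargo.toml`; the exact weight-loading code is left to candle's own quantized-model examples):

```rust
// Sketch only: constructs the chat-ML Mistral config this model card recommends.
// `config_chat_ml` lives in candle-transformers' mistral.rs (linked above);
// the `false` argument disables flash attention.
use candle_transformers::models::mistral::Config;

fn main() {
    let config = Config::config_chat_ml(false);
    // From here, load the .gguf weights and build the model following
    // candle's quantized-model examples, passing in `config`.
    let _ = config;
}
```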