Vojtech Cekal (cekal)
AI & ML interests: None yet
Error when running with transformers. · 5 reactions · 23 comments · #2 opened 10 months ago by ammar90
Different answer after each request · 2 comments · #130 opened almost 2 years ago by amin2557
Mistral 7B produces different results when we hit via Postman API · 7 comments · #124 opened almost 2 years ago by DivyaKanniah
Discussion Thread - Let's Get To Know the (Allegedly) Leaked "mistral-medium" Better! · 4 reactions · 2 comments · #4 opened almost 2 years ago by cekal
How to fine-tune this? + Training code · 13 reactions · 44 comments · #19 opened about 2 years ago by cekal
Responses by phi-2 are off, it's depressed and insults me for no reason whatsoever? · 7 comments · #61 opened almost 2 years ago by NightcoreSkies
How to use system prompt? · 1 reaction · 1 comment · #69 opened almost 2 years ago by mznw
Is there any simple way to solve the problem of redundant output? · 3 comments · #68 opened almost 2 years ago by jjplane
How many GPUs do we need to run this out of the box? · 3 comments · #63 opened about 2 years ago by kz919
WARNING:root:Some parameters are on the meta device device because they were offloaded to the cpu. · 2 reactions · 4 comments · #25 opened about 2 years ago by (deleted)
FINE TUNING with PEFT MIXTRAL · 2 comments · #17 opened about 2 years ago by TK4000
PEFT-based fine-tuned model hallucinates values from the fine-tuning training data during inference · 7 comments · #111 opened about 2 years ago by Pradeep1995
Delete model-00010-of-00019.safetensors · 1 comment · #21 opened about 2 years ago by dynamicmortal
Freaking outstanding! GPT-4 confirms · 3 comments · #1 opened about 2 years ago by TioPeperino
PEFT Finetuning Code Please · 9 comments · #3 opened over 2 years ago by kdua
Support for LoRA? · 3 reactions · 17 comments · #3 opened over 2 years ago by cekal
Which dataset was used to train this model? · 4 comments · #1 opened over 2 years ago by cekal
Hallucinates too much · 4 comments · #6 opened over 2 years ago by WajihUllahBaig
Customize README · 1 comment · #1 opened over 2 years ago by muelletm