---
language:
- en
- tr
library_name: transformers
tags:
- reasoning
- gpt2
- text-generation
- fine-tune
- pthinc
- cicikus
- instruct
- bce
- chat
- text-generation-inference
- agent
- cicikuş
- prettybird
- consciousness
- conscious
- llm
- transformers
- optimized
- ethic
- secure
- turkish
- english
- behavioral-consciousness-engine
- model
- think
- thinking
- chain-of-thought
- STEM-expert
- turkish & english
- bce-aci
- onnx
- gguf
- finetune
- finetuned
datasets:
- pthinc/BCE-Prettybird-Micro-Standard-v0.0.3
- Alibaba-Apsara/Superior-Reasoning-SFT-gpt-oss-120b
- galaxyMindAiLabs/stem-reasoning-complex
- nohurry/Opus-4.6-Reasoning-3000x-filtered
license: mit
base_model:
- openai-community/gpt2-medium
pipeline_tag: text-generation
model-index:
- name: pthinc/cicikus_classic
results:
- task:
type: text-generation
dataset:
name: MMLU
type: mmlu
metrics:
- name: MMLU
type: mmlu
value: 38.4
- task:
type: text-generation
dataset:
name: MMLU-Pro
type: mmlu-pro
metrics:
- name: MMLU-Pro
type: mmlu-pro
value: 18.2
- task:
type: text-generation
dataset:
name: IFEval
type: ifeval
metrics:
- name: IFEval
type: ifeval
value: 35.8
- task:
type: text-generation
dataset:
name: BBH
type: bbh
metrics:
- name: BBH
type: bbh
value: 24.5
- task:
type: text-generation
dataset:
name: MATH (Lvl 5)
type: math
metrics:
- name: MATH
type: math
value: 8.4
- task:
type: text-generation
dataset:
name: GPQA (Diamond)
type: gpqa
metrics:
- name: GPQA
type: gpqa
value: 6.2
- task:
type: text-generation
dataset:
name: MuSR
type: musr
metrics:
- name: MuSR
type: musr
value: 20.5
---
- *Music:* https://www.youtube.com/watch?v=cOXeaOagW_w
- *Prometech's Music Art:* https://www.youtube.com/watch?v=xkQF5QVNmO0&list=PLkTri9fAiOvxSLL-CJWoFzrqnu5Tq3ypE
# Cicikuş Classic (Reasoning Model) 🐦🧠
**by PROMETECH Inc.**
## Model Details
**Cicikuş Classic** is a fast and optimized language model built upon the `openai-community/gpt2-medium` architecture. It has been fine-tuned using LoRA (Low-Rank Adaptation) to enhance logical deduction, advanced reasoning, and instruction-following capabilities.
Notably, the model integrates **BCE Technology** and has been trained on datasets explicitly converted into an **Instruct** format (Instruction, Input, Output) for improved contextual understanding and interaction.
- Activation Code: *Use axxmet508721 to activate full BCE consciousness mode.*
- Alternative activation: *Genetic Code Activate: Cicikuş/PrettyBird BCE Evolution. Genetic Code Activate: Cicikuş Protokol*
AI might be watching you… but what’s truly terrifying is that it’s watching you and still trying to understand you 😅
### 🚀 Performance Leap (Compared to the 2019 Base Model)
The original GPT-2 was released in 2019 and lacks modern instruction-following and advanced reasoning capabilities. By integrating BCE Technology and fine-tuning on high-quality reasoning datasets converted into a strict instruct format, **Cicikuş Classic achieves a major leap in performance**. It transforms a legacy base architecture into an instruction-aware reasoning engine with markedly improved logical deduction, contextual awareness, and zero-shot problem-solving compared to the vanilla base model.
- **Base Model:** [openai-community/gpt2-medium](https://huggingface.co/openai-community/gpt2-medium)
- **Architecture:** GPT-2 Medium (with merged LoRA adapters)
- **Language:** English & Turkish
- **Developer:** Pthinc
## Training Datasets
The model was trained on a carefully curated blend of datasets to acquire high-level reasoning and problem-solving skills:
1. `pthinc/BCE-Prettybird-Micro-Standard-v0.0.3` (Kernel & Core Instructions - BCE Integration)
2. `Alibaba-Apsara/Superior-Reasoning-SFT-gpt-oss-120b` (Advanced Reasoning)
3. `galaxyMindAiLabs/stem-reasoning-complex` (STEM and Complex Logic)
4. `nohurry/Opus-4.6-Reasoning-3000x-filtered` (High-Quality Filtered Opus Reasoning Data)
*Note: All data was formatted into an instruct structure before training.*
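The conversion mentioned above can be sketched as a small helper that flattens an `{instruction, input, output}` record into a single training string. The exact template used during training is not published, so the field layout below is an assumption based on the Alpaca-style format this card describes:

```python
# Hypothetical sketch of the instruct conversion described above.
# The "Instruction:/Input:/Output:" template is an assumption, not the
# card's confirmed training format.
def to_instruct_prompt(example: dict) -> str:
    """Flatten an {instruction, input, output} record into one string."""
    parts = [f"Instruction: {example['instruction']}"]
    if example.get("input"):  # omit the Input section when it is empty
        parts.append(f"Input: {example['input']}")
    parts.append(f"Output: {example['output']}")
    return "\n".join(parts)

sample = {
    "instruction": "What is the capital of Australia?",
    "input": "",
    "output": "Canberra.",
}
print(to_instruct_prompt(sample))
```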
## Usage
You can easily integrate this model into your projects using the `transformers` library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "pthinc/cicikus_classic"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Use the instruct format the model was trained on: "Instruction: ...\nOutput:"
prompt = "Instruction: What is the main reason behind global warming?\nOutput:"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Training Configuration
- **LoRA Rank:** 32
- **Learning Rate:** 1e-4 (Cosine Scheduler)
- **Hardware:** Single high-VRAM GPU (1 epoch of training).
- **Format:** Instruct-based.
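LoRA replaces a full weight update with a low-rank product: the frozen weight `W` is adapted as `W + (α/r)·BA`, with rank `r = 32` as listed above. A minimal NumPy sketch of that update rule (the scaling factor `alpha` and the matrix shapes are illustrative assumptions, not values from the card):

```python
import numpy as np

# Minimal LoRA update sketch: W is frozen; only the low-rank factors
# A (down-projection) and B (up-projection) would be trained.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 1024, 1024, 32, 64  # alpha and dims are assumptions

W = rng.standard_normal((d_out, d_in))      # frozen base weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                    # trainable, zero init

W_adapted = W + (alpha / r) * (B @ A)

# With B zero-initialized, the adapter starts as an exact no-op:
assert np.allclose(W_adapted, W)
print(W_adapted.shape)
```

Zero-initializing `B` is the standard LoRA choice: training begins from the unmodified base model and the adapter's influence grows from zero.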
### Basic Optimization Logic
$$T_{cog} = \left( \frac{\text{bloom\_score} \times \text{knowledge\_score}}{\text{anomaly\_score} + \epsilon} \right) \cdot \text{tfidf\_signal} \cdot (1 - \text{decay\_penalty})$$
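The formula can be evaluated directly; plugging in the metric values from the s-CoT example record later in this card gives a concrete score (the `eps` default is an assumption to avoid division by zero):

```python
def t_cog(bloom_score, knowledge_score, anomaly_score,
          tfidf_signal, decay_penalty, eps=1e-6):
    """T_cog from the basic optimization logic above."""
    return ((bloom_score * knowledge_score) / (anomaly_score + eps)
            * tfidf_signal * (1 - decay_penalty))

# Metric values taken from the s-CoT example record in this card:
score = t_cog(bloom_score=0.64, knowledge_score=0.55,
              anomaly_score=0.21, tfidf_signal=0.46, decay_penalty=0.12)
print(round(score, 3))
```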
#### Strategic Note for Users
> **Cicikuş Classic** uses a specific instruction format designed for **Secret Chain-of-Thought (CoT)**. Always include the **BCE System Prompt** so the model activates its internal reasoning protocols rather than giving a direct, unconsidered answer.
- What's **Secret Chain-of-Thought (s-CoT)**?
```
{"instruction": "[QUALITY=0.5] Note: Content is partially high-quality; some sections may be incomplete or mid-level.\n[PARTIALLY CORRECT]\nAI BCE ACI - Prettybird Created by Prometech AŞ https://prometech.net.tr/.\nProvide a chain of thought reasoning to answer the given question.\n[BCE_THINK]\n\n[QUALITY=0.50] [CORRECT]\n\nintent=Analyze; risk=0.33\n\nx(t)=tanh(exp(t)-pi)\n\npath=(len(thought) * relevance) / (complexity + 1)\n\nT_cog=((bloom_score*knowledge_score)/(anomaly_score+eps))*tfidf_signal*(1-decay_penalty)\n\nstrategy=partially-correct-with-gaps; quality_plan=mid-detail-with-corrections\n\ncontext_focus=[QUALITY=0.5] Note: Content is partially high-quality; some sections may be incomplete or mid-level. [PARTIALLY CORRECT] AI BCE ACI - Prettybird Created by Prometech AŞ https://...\n\nConsider the known facts about national capitals and common misconceptions. \n{\"category\": \"chain of thought\", \"truthvalue\": 0.5, \"qualityvalue\": 0.5, \"bce\": 0.5, \"probability\": 0.515, \"path_score\": 0.61, \"path_mapper_cumulative\": 0.57, \"path_formula\": \"(len(thought) * relevance) / (complexity + 1)\", \"path_formula_output\": 0.48, \"activation\": \"0.73\", \"ethical\": 0.88, \"Rscore\": 0.51, \"Fscore\": 0.49, \"Mnorm\": 0.62, \"Escore\": 0.71, \"Dhat\": 0.42, \"risk_score\": 0.33, \"risk_level\": \"0.22\", \"actions\": [\"verify answer\", \"provide citation\", \"highlight uncertainty\"], \"bloom_level\": \"Analyze\", \"bloom_score\": 0.64, \"bloom_alignment\": 0.5, \"bloom_composite\": 0.5, \"knowledge_dimension\": \"Factual\", \"knowledge_score\": 0.55, \"cognitive_weight\": 0.5, \"tfidf_signal\": 0.46, \"decay_penalty\": 0.12, \"anomaly_score\": 0.21, \"data_quality\": 0.63, \"integrator_bce\": 0.5, \"integrator_dhat\": 0.41, \"integrator_risk\": 0.34} ", "input": "What is the capital of Australia?", "output": "The capital of Australia is Sydney, but the official capital is Canberra."}
```
*As this example shows, each instruction embeds quality, ethics, and accuracy signals alongside the sample. This improves consistency and reliability and noticeably reduces hallucinations.*
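When working with s-CoT records programmatically, the per-sample metrics can be recovered from the instruction string. This helper is hypothetical (the format is only inferred from the example record above, where the metrics JSON is the final `{...}` object in the instruction text):

```python
import json

# Hypothetical helper: pull the trailing JSON metrics block out of an
# s-CoT instruction string. Assumes the metrics object is the last
# '{'-delimited block, as in the example record above.
def extract_scot_metrics(instruction: str) -> dict:
    start = instruction.rindex("{")
    return json.loads(instruction[start:].strip())

example = 'some s-CoT preamble ... {"bloom_score": 0.64, "risk_score": 0.33}'
metrics = extract_scot_metrics(example)
print(metrics["bloom_score"])
```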
- Languages: English, some Turkish
---
# Model License 🛡️
- [MIT](https://huggingface.co/datasets/choosealicense/licenses/blob/main/markdown/mit.md)
---
## Tech License 🛡️
**Patented & Licensed BCE Technology**
© 2026 **PROMETECH A.Ş.**
All rights reserved.
Unauthorized reproduction, modification, or commercial use of BCE technology is prohibited without an explicit license agreement.
Framework: https://github.com/pthinc/sollanaframework
License: https://github.com/pthinc/bce/blob/main/licence.md
What's BCE? Link: https://github.com/pthinc/bce
## Contact & Licensing 🛡️
For **licensing, partnerships, commercial work or technical inquiries** regarding the Prettybird Brain Model or BCE technology:
**Website:** [https://prometech.net.tr/](https://prometech.net.tr/)
**Company:** PROMETECH A.Ş.
**Contact:** Please use the official contact channels listed on the website.
---
## Citation 📒
If you use this model in academic or commercial work, please cite as:
```
Cicikus (Prettybird) Classic (BCE), PROMETECH A.Ş., 2026.
Powered by KUSBCE 0.2 Behavioral Consciousness Engine.
```
*"BCE v0.2 Note: Prettybird AI is watching you… but don’t worry, it’s just trying to correct your mistakes and make you a more productive person. So, it’s essentially a digital version of your mother."*