Qwen 7B Maritime (Marine) Adapters

This repository contains LoRA adapters for the Maritime-LLM project, trained via progressive continual pretraining on maritime domain data.

πŸ“¦ Available Checkpoints

All checkpoints are stored in subfolders within this repository.

| Description | Subfolder Name | Training Steps |
|---|---|---|
| Phase 1a Short Context (2,157 steps) | `phase1a-short-ckpt2157` | 2157 |
| Phase 1a Short Context (4,314 steps) | `phase1a-short-ckpt4314` | 4314 |
| Phase 1a Short Context (6,471 steps) | `phase1a-short-ckpt6471` | 6471 |
| Phase 1a Short Context (8,628 steps - Final) | `phase1a-short-ckpt8628` | 8628 |
| Phase 1b Medium Context (872 steps) | `phase1b-medium-ckpt872` | 872 |
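The subfolder names follow a consistent pattern: a phase tag, a context label, and the checkpoint step count. A small helper (illustrative only, not part of this repo) can build them programmatically:

```python
def checkpoint_subfolder(phase: str, context: str, step: int) -> str:
    """Build a checkpoint subfolder name, e.g. 'phase1a-short-ckpt2157'.

    Hypothetical helper that simply mirrors the naming convention
    shown in the table above.
    """
    return f"{phase}-{context}-ckpt{step}"


# All Phase 1a short-context checkpoints listed above
phase1a_checkpoints = [
    checkpoint_subfolder("phase1a", "short", step)
    for step in (2157, 4314, 6471, 8628)
]
```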

πŸš€ Usage

You can load any specific checkpoint by passing the `subfolder` argument.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# 1. Load the base model
base_model_id = "Qwen/Qwen2.5-7B-Instruct"
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    device_map="auto",
    torch_dtype="auto",
)

# 2. Load a specific adapter checkpoint
# Example: the Phase 1a final checkpoint
adapter_id = "naga080898/qwen7b-marine"
subfolder = "phase1a-short-ckpt8628"

model = PeftModel.from_pretrained(
    base_model,
    adapter_id,
    subfolder=subfolder,
)

# 3. Run inference
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
prompt = "Explain the safety procedure for enclosed space entry:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
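Because the base model is the Instruct variant, raw prompts work, but results are usually better when the input follows Qwen's ChatML chat template. In real code you would call `tokenizer.apply_chat_template(...)`; the sketch below builds the same wire format by hand (the system message is a made-up example):

```python
def chatml_prompt(
    user_msg: str,
    system_msg: str = "You are a maritime operations assistant.",  # example only
) -> str:
    """Format a single-turn prompt in Qwen's ChatML style.

    Prefer tokenizer.apply_chat_template() in practice; this just shows
    the format the template produces.
    """
    return (
        f"<|im_start|>system\n{system_msg}<|im_end|>\n"
        f"<|im_start|>user\n{user_msg}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


prompt = chatml_prompt("Explain the safety procedure for enclosed space entry.")
```

The returned string can be tokenized and passed to `model.generate` exactly as in the example above.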

Training Phases

  1. Phase 1a (Short Context): foundational maritime knowledge training on a short context window.
  2. Phase 1b (Medium Context): extended-context training on longer documents.
Model tree for naga080898/qwen7b-marine

- Base model: Qwen/Qwen2.5-7B
- Adapter: this model