Dataset: mideavalwisard/Starjob
A rsLoRA adapter fine-tuned on the Starjob job-shop scheduling problem (JSSP) dataset. The model takes a natural-language description of jobs and machines and produces a feasible schedule that minimizes makespan.

How to use tiodh/ministral-8b-jssp-rslora with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("mistralai/Ministral-8B-Instruct-2410")
model = PeftModel.from_pretrained(base_model, "tiodh/ministral-8b-jssp-rslora")
```
| Hyperparameter | Value |
|---|---|
| Method | rsLoRA (`use_rslora = True`) |
| LoRA rank r | 32 |
| LoRA alpha | 32 |
| Max sequence length | 8192 |
| Per-device batch | 1 |
| Gradient accumulation | 8 (effective batch 8) |
| Epochs | 1 |
| Learning rate | 2e-4 |
| Base quantization | bnb 4-bit (Unsloth) |
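The only difference between standard LoRA and rsLoRA is the adapter scaling factor: alpha/sqrt(r) instead of alpha/r, which keeps the update magnitude stable as rank grows. With the values from the table above, the effect is easy to see in a quick sketch:

```python
import math

# rsLoRA (Kalajdzievski, 2023) rescales the adapter output by
# alpha / sqrt(r) instead of the standard LoRA alpha / r.
r, alpha = 32, 32

lora_scale = alpha / r               # standard LoRA: 1.0
rslora_scale = alpha / math.sqrt(r)  # rsLoRA: ~5.66

print(lora_scale, round(rslora_scale, 2))
```

In PEFT, this is toggled via `use_rslora=True` on `LoraConfig`; no other hyperparameter changes are needed.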
Evaluated on 200 samples (seed 42) drawn from the small+medium split of Starjob, with an identical pipeline for LoRA and rsLoRA. The feasibility check validates routing order (operations within a job run in sequence), machine non-overlap, and operation completeness.
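The three feasibility conditions can be sketched as a small checker. This is a hypothetical illustration; the data structures (`jobs`, `schedule`) and the function name are assumptions, not the repo's actual evaluation code:

```python
def is_feasible(jobs, schedule):
    """jobs: {job: [(machine, duration), ...]} in routing order.
    schedule: {(job, op_index): start_time}."""
    # 1. Operation completeness: every operation has a scheduled start.
    for j, ops in jobs.items():
        for i in range(len(ops)):
            if (j, i) not in schedule:
                return False
    # 2. Routing order: op i+1 starts no earlier than op i finishes.
    for j, ops in jobs.items():
        for i in range(len(ops) - 1):
            if schedule[(j, i + 1)] < schedule[(j, i)] + ops[i][1]:
                return False
    # 3. Machine non-overlap: intervals on each machine are disjoint.
    by_machine = {}
    for j, ops in jobs.items():
        for i, (m, d) in enumerate(ops):
            s = schedule[(j, i)]
            by_machine.setdefault(m, []).append((s, s + d))
    for intervals in by_machine.values():
        intervals.sort()
        for (_, e1), (s2, _) in zip(intervals, intervals[1:]):
            if s2 < e1:
                return False
    return True

jobs = {"J0": [("M0", 2), ("M1", 3)], "J1": [("M1", 2), ("M0", 1)]}
ok = {("J0", 0): 0, ("J0", 1): 2, ("J1", 0): 5, ("J1", 1): 7}
bad = {("J0", 0): 0, ("J0", 1): 2, ("J1", 0): 3, ("J1", 1): 7}  # M1 overlap
print(is_feasible(jobs, ok), is_feasible(jobs, bad))  # True False
```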
| Metric | Value |
|---|---|
| Feasibility | 64.0% (128/200) |
| Exact makespan | 24.5% (49/200) |
| Mean gap | 42.93% |
| Median gap | 9.25% |
| Eval time | 118.9 min |
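The gap metrics above are presumably the relative distance of the generated schedule's makespan from the optimum; a minimal sketch under that assumption (the repo's exact definition may differ):

```python
def makespan_gap(predicted, optimal):
    """Percentage gap of a predicted makespan over the optimal one
    (assumed definition, not confirmed by the model card)."""
    return 100.0 * (predicted - optimal) / optimal

# e.g. a generated schedule of length 437 against an optimum of 400
print(makespan_gap(437, 400))  # 9.25
```

An exact-makespan hit corresponds to a gap of 0%; the mean being far above the median suggests a minority of instances with very large gaps.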
Full head-to-head LoRA vs rsLoRA comparison and code: github.com/tiodh/slm_jssp.
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Ministral-8B-Instruct-2410",
    device_map="auto",
    torch_dtype="auto",
)
tok = AutoTokenizer.from_pretrained("mistralai/Ministral-8B-Instruct-2410")
model = PeftModel.from_pretrained(base, "tiodh/ministral-8b-jssp-rslora")

prompt = (
    "Optimize schedule for 3 Jobs (denoted as J) across 3 Machines (denoted as M) "
    "to minimize makespan...\nJ0:\nM0:5 M1:3 M2:4\nJ1:\nM1:2 M0:4 M2:3\nJ2:\nM2:6 M0:1 M1:5\n"
)
inputs = tok(prompt, return_tensors="pt").to(model.device)
# do_sample=True is required for temperature/top_p to take effect
out = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.1, top_p=0.95)
print(tok.decode(out[0], skip_special_tokens=True))
```
License: CC BY-SA 4.0 (inherited from the Starjob dataset).
Base model: mistralai/Ministral-8B-Instruct-2410