Configuration File

PMetal supports YAML configuration files as an alternative to CLI flags. Generate a sample config with `pmetal init`.

```yaml
model:
  name: "meta-llama/Llama-3.2-1B"
  dtype: bfloat16
  device: gpu
  max_seq_len: 2048

lora:
  r: 16
  alpha: 32.0

training:
  learning_rate: 2e-4
  batch_size: 1
  epochs: 3
  lr_schedule: cosine
  warmup_steps: 100
  gradient_accumulation_steps: 4
  max_grad_norm: 1.0
  weight_decay: 0.01

dataset:
  path: "./train.jsonl"
  format: jsonl
```
```sh
pmetal train --config training.yaml
```

CLI flags override config file values when both are specified.
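That precedence rule can be sketched as a simple merge: file values form the base, and any flag the user actually passed wins. This is a minimal illustration of the behavior, not PMetal's internals; `merge_config` and the dict keys are hypothetical.

```python
# Sketch of "CLI flags override config file values".
# merge_config and the example keys are illustrative, not PMetal internals.

def merge_config(file_values: dict, cli_values: dict) -> dict:
    """Layer explicitly set CLI values on top of config-file values."""
    merged = dict(file_values)
    for key, value in cli_values.items():
        if value is not None:  # None means the flag was not passed
            merged[key] = value
    return merged

file_cfg = {"learning_rate": 2e-4, "epochs": 3}
cli_cfg = {"learning_rate": 1e-4, "epochs": None}  # only --learning-rate given

print(merge_config(file_cfg, cli_cfg))
```

Treating "flag not passed" as `None` (rather than a default value) is what lets the config file's `epochs: 3` survive the merge.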

model

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| name | string | required | HuggingFace ID or local path |
| dtype | string | bfloat16 | Data type: float32, float16, bfloat16 |
| device | string | gpu | Device: gpu, cpu |
| max_seq_len | integer | 0 | Max sequence length (0 = auto) |
lora

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| r | integer | 16 | LoRA rank |
| alpha | float | 32.0 | LoRA scaling factor |
training

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| learning_rate | float | 2e-4 | Learning rate |
| batch_size | integer | 1 | Micro-batch size |
| epochs | integer | 1 | Training epochs |
| lr_schedule | string | cosine | LR schedule type |
| warmup_steps | integer | 100 | Warmup steps |
| gradient_accumulation_steps | integer | 4 | Gradient accumulation steps |
| max_grad_norm | float | 1.0 | Gradient clipping threshold |
| weight_decay | float | 0.01 | AdamW weight decay |
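With gradient accumulation, the optimizer steps once per `gradient_accumulation_steps` micro-batches, so the effective batch size is the product of the two settings. Using the sample config's values:

```python
# Effective batch size seen by each optimizer step, using the sample values.
batch_size = 1                    # micro-batch per forward/backward pass
gradient_accumulation_steps = 4   # micro-batches accumulated per update
effective_batch = batch_size * gradient_accumulation_steps
print(effective_batch)  # 4
```

This is the standard way to simulate a larger batch when memory limits the micro-batch size.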
dataset

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| path | string | required | Dataset file path |
| format | string | auto | Format: jsonl, json, parquet, csv |
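The jsonl format stores one JSON object per line. This page does not specify the record schema PMetal expects, so the `"text"` field below is an assumption for illustration only:

```python
import json

# A jsonl file is one JSON object per line. The "text" field here is an
# assumed example schema, not something this page specifies.
lines = [
    '{"text": "First training example."}',
    '{"text": "Second training example."}',
]

records = [json.loads(line) for line in lines]
print(len(records))  # 2
```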