PMetal supports YAML configuration files as an alternative to CLI flags. Generate a sample config with `pmetal init`.
```yaml
name: "meta-llama/Llama-3.2-1B"
gradient_accumulation_steps: 4
```

```shell
pmetal train --config training.yaml
```
CLI flags override config file values when both are specified.
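The precedence rule can be sketched as a simple dictionary merge. This is an illustration of the behavior, not PMetal's actual implementation; the field names are taken from the tables below.

```python
def merge_options(config: dict, cli_flags: dict) -> dict:
    """Merge config-file values with CLI flags; flags win when both are set."""
    merged = dict(config)              # start from the YAML config values
    for key, value in cli_flags.items():
        if value is not None:          # only flags the user actually passed
            merged[key] = value
    return merged

config = {"learning_rate": 2e-4, "batch_size": 1}   # from training.yaml
cli = {"learning_rate": 1e-4, "batch_size": None}   # only --learning_rate given
print(merge_options(config, cli))
# → {'learning_rate': 0.0001, 'batch_size': 1}
```

Flags the user did not pass are left as `None` and never shadow config values.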
| Field | Type | Default | Description |
|---|---|---|---|
| `name` | string | required | HuggingFace ID or local path |
| `dtype` | string | `bfloat16` | Data type: `float32`, `float16`, `bfloat16` |
| `device` | string | `gpu` | Device: `gpu`, `cpu` |
| `max_seq_len` | integer | `0` | Maximum sequence length (`0` = auto) |
| Field | Type | Default | Description |
|---|---|---|---|
| `r` | integer | `16` | LoRA rank |
| `alpha` | float | `32.0` | LoRA scaling factor |
| Field | Type | Default | Description |
|---|---|---|---|
| `learning_rate` | float | `2e-4` | Learning rate |
| `batch_size` | integer | `1` | Micro-batch size |
| `epochs` | integer | `1` | Number of training epochs |
| `lr_schedule` | string | `cosine` | Learning-rate schedule type |
| `warmup_steps` | integer | `100` | Number of warmup steps |
| `gradient_accumulation_steps` | integer | `4` | Gradient accumulation steps |
| `max_grad_norm` | float | `1.0` | Gradient clipping norm |
| `weight_decay` | float | `0.01` | AdamW weight decay |
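Because `batch_size` is a micro-batch size, the effective batch size seen by the optimizer is typically the product of `batch_size` and `gradient_accumulation_steps`. A quick check with the defaults:

```python
# Effective batch size = micro-batch size × gradient accumulation steps.
batch_size = 1                     # default micro-batch size
gradient_accumulation_steps = 4    # default
effective_batch_size = batch_size * gradient_accumulation_steps
print(effective_batch_size)  # → 4
```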
| Field | Type | Default | Description |
|---|---|---|---|
| `path` | string | required | Dataset file path |
| `format` | string | `auto` | Format: `jsonl`, `json`, `parquet`, `csv` |
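Putting the fields together, a full config might look like the following. This is a sketch that assumes the flat, top-level key layout shown in the sample at the top of this page; only `name` and `path` are required, and the other values shown are the defaults from the tables above.

```yaml
name: "meta-llama/Llama-3.2-1B"
dtype: bfloat16
device: gpu
max_seq_len: 0
r: 16
alpha: 32.0
learning_rate: 2e-4
batch_size: 1
epochs: 1
lr_schedule: cosine
warmup_steps: 100
gradient_accumulation_steps: 4
max_grad_norm: 1.0
weight_decay: 0.01
path: data/train.jsonl   # hypothetical dataset path
format: auto
```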