
API Reference

VLAModel

The main model class for loading and running VLA models.

VLAModel.from_preset(name)

Load a model by preset name.

model = VLAModel.from_preset("openvla-7b")

Parameters:

  • name (str) — Preset name: openvla-7b, smolvla-450m, dream-vla-7b

Returns: VLAModel instance

VLAModel.from_checkpoint(path)

Load a model from a local checkpoint.

model = VLAModel.from_checkpoint("./checkpoints/my-model")

Parameters:

  • path (str) — Path to a local checkpoint directory

model.predict(image, instruction)

Run inference on a single image-instruction pair.

Parameters:

  • image (PIL.Image) — Camera image
  • instruction (str) — Natural language instruction

Returns: Action object with fields: x, y, z, roll, pitch, yaw, gripper
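A minimal end-to-end call, putting from_preset and predict together. This is a sketch rather than a runnable test: it needs the model weights, the import path `from vlarobot import VLAModel` is an assumption (the reference does not show imports), and the image path is hypothetical.

```python
from PIL import Image

from vlarobot import VLAModel  # import path is an assumption

model = VLAModel.from_preset("openvla-7b")

# Load one camera frame; the path is a placeholder.
image = Image.open("wrist_cam.png")
action = model.predict(image, "pick up the red block")

# The returned Action carries a position/orientation delta plus gripper state.
print(action.x, action.y, action.z, action.gripper)
```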

Action

@dataclass
class Action:
    x: float       # Position delta x
    y: float       # Position delta y
    z: float       # Position delta z
    roll: float    # Orientation delta roll
    pitch: float   # Orientation delta pitch
    yaw: float     # Orientation delta yaw
    gripper: float # Gripper state (0=closed, 1=open)
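Downstream code typically flattens an Action into a 7-element vector before sending it to a robot driver. A self-contained sketch (the Action dataclass is restated here so the snippet runs on its own; the to_vector helper is illustrative, not part of the API):

```python
from dataclasses import astuple, dataclass


@dataclass
class Action:
    x: float        # Position delta x
    y: float        # Position delta y
    z: float        # Position delta z
    roll: float     # Orientation delta roll
    pitch: float    # Orientation delta pitch
    yaw: float      # Orientation delta yaw
    gripper: float  # Gripper state (0=closed, 1=open)


def to_vector(action: Action) -> list[float]:
    """Flatten an Action into [x, y, z, roll, pitch, yaw, gripper]."""
    return list(astuple(action))


a = Action(0.01, -0.02, 0.0, 0.0, 0.0, 0.1, 1.0)
print(to_vector(a))  # → [0.01, -0.02, 0.0, 0.0, 0.0, 0.1, 1.0]
```

Field order follows the dataclass declaration, so astuple yields the vector in the [x, y, z, roll, pitch, yaw, gripper] layout most drivers expect.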

VLATrainer

Training interface for fine-tuning VLA models.

VLATrainer(config)

Parameters:

  • config (TrainingConfig) — Training configuration

TrainingConfig

Fields:

  • model (str, required) — Preset name or checkpoint path
  • dataset (str, required) — Path to HDF5 dataset
  • method (str, default "lora") — Training method: lora, qlora, full
  • num_epochs (int, default 10) — Number of training epochs
  • learning_rate (float, default 2e-5) — Learning rate
  • batch_size (int, default 16) — Batch size
  • lora_rank (int, default 16) — LoRA rank
  • lora_alpha (int, default 32) — LoRA scaling factor
  • lora_dropout (float, default 0.05) — LoRA dropout
  • wandb_project (str, default null) — W&B project name
  • mixed_precision (str, default "bf16") — Mixed precision mode
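A configuration sketch tying TrainingConfig to VLATrainer. The import path and the trainer.train() method name are assumptions (the reference documents only the constructor), and the dataset path is a placeholder:

```python
from vlarobot import TrainingConfig, VLATrainer  # import path is an assumption

config = TrainingConfig(
    model="openvla-7b",               # preset name or checkpoint path
    dataset="./data/pick_place.hdf5", # hypothetical HDF5 dataset path
    method="qlora",                   # one of: lora, qlora, full
    num_epochs=5,
    learning_rate=1e-4,
    lora_rank=32,
    lora_alpha=64,
)

trainer = VLATrainer(config)
trainer.train()  # method name is an assumption, not shown in the reference
```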

RobotController

Deploy a model to a real or simulated robot.

RobotController(model, robot, control_hz)

Parameters:

  • model (VLAModel) — Trained model
  • robot (str) — Robot name: widowx, franka_panda, ur5
  • control_hz (float) — Control loop frequency

Methods

  • start() — Initialize robot connection
  • execute(instruction, max_steps=50) — Run a natural-language task, capped at max_steps control steps
  • stop() — Clean shutdown
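A deployment sketch using the three methods above. It requires a robot (or simulator) connection, so it is not runnable as-is; the import path is an assumption, and wrapping execute in try/finally ensures stop() releases the robot even if the task raises:

```python
from vlarobot import RobotController, VLAModel  # import path is an assumption

model = VLAModel.from_checkpoint("./checkpoints/my-model")
controller = RobotController(model, robot="widowx", control_hz=5.0)

controller.start()  # initialize the robot connection
try:
    controller.execute("put the cup on the plate", max_steps=100)
finally:
    controller.stop()  # clean shutdown, even on error
```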

CLI

vlarobot presets                          # List available presets
vlarobot train --model <name> --dataset <path> --method <method>
vlarobot evaluate --model <name> --benchmark <name>
vlarobot predict --model <name> --image <path> --instruction "<text>"
vlarobot --version                        # Show version