Local mode is the fastest way to use roz. The CLI runs the agent loop directly in your process — no roz-server, no database, no NATS to set up. You bring your own LLM API key (or use Ollama for fully offline operation), and roz handles the rest.
“Local mode” means no roz-server in the middle — the CLI runs the agent directly. The agent still makes HTTP calls to your LLM provider (Anthropic API, OpenAI API, or Ollama on localhost:11434) and connects to sim containers via MCP (HTTP) and the gRPC bridge. “Local” refers to the orchestration, not the network.

How It Works

roz-cli → roz-local → roz-agent → LLM provider (HTTP)
                                 → roz-copper → sim container (MCP + gRPC bridge)
                                              → real robot (MAVLink / ROS 2 / Zenoh)
  • roz-cli accepts your natural language input and renders the TUI.
  • roz-local orchestrates the session in your process (no roz-server needed).
  • roz-agent calls the LLM provider via HTTP, generates WASM controllers, and dispatches MCP tool calls.
  • roz-copper executes WASM controllers at 100 Hz and manages the channel interface.
  • The target is either a sim container (Docker, for development) or a real robot (connected via MAVLink, ROS 2, or Zenoh).

Configure Your LLM Provider

roz is provider-agnostic. Set the API key for whichever provider you want to use.
export ANTHROPIC_API_KEY="sk-ant-..."
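Each provider reads its own environment variable. A quick way to see which keys are present in your shell (only ANTHROPIC_API_KEY appears above; OPENAI_API_KEY and GOOGLE_API_KEY are assumptions based on those providers' usual conventions, so confirm the exact names roz expects):

```shell
# Report which provider API keys are present in the environment.
# OPENAI_API_KEY and GOOGLE_API_KEY follow those providers' common
# conventions; only ANTHROPIC_API_KEY is confirmed by this page.
for key in ANTHROPIC_API_KEY OPENAI_API_KEY GOOGLE_API_KEY; do
  if printenv "$key" >/dev/null; then
    echo "$key is set"
  else
    echo "$key is not set"
  fi
done
```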

Select a Model

Create a roz.toml in your project directory (or ~/.config/roz/roz.toml for global config) to set the model:
[agent]
provider = "anthropic"        # "anthropic", "openai", "google", or "ollama"
model = "claude-sonnet-4-20250514"  # model name for the chosen provider

[agent.ollama]
url = "http://localhost:11434" # only needed for Ollama
model = "llama3.1:70b"
If no roz.toml is present, roz picks the first provider it finds credentials for: Anthropic Claude Sonnet if ANTHROPIC_API_KEY is set, then OpenAI, then Google, and finally Ollama, in that order.
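If you go the Ollama route, it helps to confirm the server is actually listening before starting a session. Ollama's `/api/tags` endpoint returns the locally installed models:

```shell
# Check that Ollama is reachable on its default port (11434);
# /api/tags lists the locally installed models as JSON.
if curl -sf http://localhost:11434/api/tags >/dev/null; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable on localhost:11434"
fi
```

If it is not reachable, start it with `ollama serve` and pull the model named in your roz.toml first.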

Connect to a Robot

Simulation (Docker containers)

For development and testing, use pre-built sim containers from Docker Hub. Each bundles Gazebo, the robot’s middleware, and an MCP server — no ROS 2 or Gazebo installation needed on your host.
# UR5 manipulator arm
docker pull bedrockdynamics/substrate-sim:ros2-manipulator

# PX4 quadcopter
docker pull bedrockdynamics/substrate-sim:px4-gazebo-humble

# ArduPilot quadcopter
docker pull bedrockdynamics/substrate-sim:ardupilot-gazebo

# Nav2 mobile robot
docker pull bedrockdynamics/substrate-sim:ros2-nav2
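The four pulls above can also be done in one loop; the tag list mirrors the commands shown:

```shell
# Pull every sim image; report a skip instead of aborting if
# Docker isn't installed or the daemon isn't running.
for tag in ros2-manipulator px4-gazebo-humble ardupilot-gazebo ros2-nav2; do
  image="bedrockdynamics/substrate-sim:$tag"
  echo "pulling $image"
  docker pull "$image" || echo "skipped $image (is Docker running?)"
done
```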

Real hardware

For real robots, roz connects directly — no Docker container needed:
  • Manipulators — connect via ROS 2 (MoveIt2) or your robot’s native API
  • Drones — connect via MAVLink over serial or UDP to the flight controller
  • Mobile robots — connect via ROS 2 Nav2 or Zenoh
See the Robots pages for per-robot connection details and the Edge Deployment guide for deploying to robot hardware.
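As a rough sketch, a direct hardware connection for a drone might look like this in roz.toml. Everything here is illustrative: the `[robot]` table and its key names are assumptions, not from the roz docs, and the endpoint strings follow common MAVLink connection conventions (serial device plus baud rate, or a UDP listen port). Check the Robots pages for the actual schema.

```toml
# Hypothetical example -- table and key names are illustrative only.
[robot]
transport = "mavlink"                     # or "ros2", "zenoh"
endpoint  = "serial:///dev/ttyUSB0:57600" # or "udp://:14550"
```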

Start a Session

Launch the simulation and start an interactive session:
roz sim start manipulator
roz
The CLI connects to the running sim container, discovers the available MCP tools and WASM channels, and presents a prompt. Type natural language commands and the agent handles the rest.

When to Use Local Mode

Local mode is ideal for:
  • Development and experimentation — fast iteration with no infrastructure to manage.
  • Offline operation — pair with Ollama for fully air-gapped environments.
  • Single-robot sessions — one operator, one robot, one machine.
For multi-user access, persistent session history, or fleet management, see Self-Hosting or Roz Cloud. For deploying the agent to robot hardware, see Edge Deployment.