Connect hardware to AI agents.

APIS generates the device-side binary protocol and the host-side MCP layer from a single interface definition — so any AI agent can discover, call, and control physical hardware without custom integration code.

Physical Device / Sensor → APIS binary protocol → Generated APIS Runtime → MCP → AI Agent

The protocol gap between hardware and AI

Two codebases, one interface

Connecting a device to an AI agent means writing integration code twice: once on the device and once on the host. The two codebases drift apart the moment the hardware changes.

Host-driven timing breaks devices

Multi-step operations orchestrated one round-trip at a time by a host AI agent accumulate timing errors. Sensor acquisition sequences, actuator chains, and fusion pipelines need to execute on the device, not be driven step by step over a network.

Partial failures are unrecoverable

When a sequence fails partway through, the device is left in an undefined state. The AI agent has no reliable mechanism to recover physical hardware from a half-executed operation.

How it works

Step 1 — Describe

Describe your hardware

APIS generates an Interface Definition Language (IDL) file from whatever describes your hardware most clearly: a text prompt, a requirements document, a PCB layout, an ARM SVD register map, or a CAN DBC database. You review and refine the generated endpoints before compiling.

// Generated from IMU register map + requirements
interface IMU_Sensor {

  // Fused orientation output (on-device)
  fn get_orientation() -> Quaternion;

  // Configures sample rate
  fn set_sample_rate(hz: u16) -> Result;

  // Resets device to idle
  fn reset() -> Result;
}
Step 2 — Generate

Generate both ends

The APIS compiler — available as a web tool or IDE plugin — takes the IDL and outputs the binary protocol implementation for your microcontroller and the MCP server layer for the host. Both are generated from the same definition and stay in sync when the IDL changes.

# Generated host-side MCP tool call

result = agent.call_tool(
    "IMU_Sensor_get_orientation",
    {}
)

# quaternion: { w: 0.99, x: 0.01,
#               y: 0.03, z: 0.00 }
Step 3 — Connect

AI agents call hardware natively

Your device is now discoverable and callable via standard MCP. AI agents compose multi-step operation sequences on the host side at runtime. APIS sends each sequence to the device in a single round-trip and the device executes it locally.

# Host-side promise chain
chain = (
    apis.chain()
    .call("set_sample_rate", hz=200)
        .on_fail("reset")
    .call("get_orientation")
        .on_fail("reset")
)

result = agent.submit(chain)

Core capabilities

Single IDL, both ends generated

One interface definition produces the device-side binary protocol implementation and the host-side MCP layer. Change the IDL — both ends regenerate. No manual synchronization, no drift.

Host-composed promise chaining

Operation sequences are composed in host code at runtime — including by AI agents responding to context. The full chain is sent to the device in one call. The device executes it locally, without the host driving each step over the network.
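As a minimal sketch of that idea, the fluent chain from the example above can flatten into a single payload before anything touches the network. The `Chain` class, its method names, and the JSON wire shape here are illustrative assumptions, not the actual APIS client API.

```python
import json

class Chain:
    def __init__(self):
        self.steps = []

    def call(self, endpoint, **params):
        # Record one step: which endpoint to invoke and with what
        # arguments. Returns self so calls can be chained fluently.
        self.steps.append({"endpoint": endpoint, "params": params})
        return self

    def on_fail(self, handler):
        # Attach a safety handler to the most recently added step.
        self.steps[-1]["on_fail"] = handler
        return self

    def serialize(self):
        # The entire chain travels to the device in one message;
        # the host is not in the loop between steps.
        return json.dumps(self.steps)

chain = (
    Chain()
    .call("set_sample_rate", hz=200)
        .on_fail("reset")
    .call("get_orientation")
        .on_fail("reset")
)
payload = chain.serialize()
```

The key property is that composition happens at runtime on the host, but the serialized result is one message, so an AI agent can build a different chain for each situation without adding per-step round-trips.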

Engineer-defined safety states

Each step in a chain can declare a safety handler — a specific action the device takes if execution cannot continue from that point. What recovery means is defined by the engineer per step. The device resolves it at the point of failure, without waiting for the host.
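The device-side half of that contract can be sketched as a small executor, assuming the step format from the examples above. The endpoint table and failure simulation are invented for illustration; this is not APIS's internal implementation.

```python
def execute_chain(steps, endpoints):
    """Run each step locally; if a step fails, invoke that step's
    declared safety handler on the device and stop, without waiting
    for the host."""
    results = []
    for step in steps:
        try:
            fn = endpoints[step["endpoint"]]
            results.append(fn(**step.get("params", {})))
        except Exception:
            handler = step.get("on_fail")
            if handler is not None:
                # Recovery is resolved at the point of failure.
                endpoints[handler]()
            return {"ok": False, "failed_at": step["endpoint"], "results": results}
    return {"ok": True, "results": results}

# Simulated endpoints: the sample-rate call fails, so the device runs
# the engineer-defined "reset" handler itself.
ran = []

def set_sample_rate(hz):
    raise RuntimeError("sensor not ready")

endpoints = {"set_sample_rate": set_sample_rate, "reset": lambda: ran.append("reset")}
steps = [{"endpoint": "set_sample_rate", "params": {"hz": 200}, "on_fail": "reset"}]
outcome = execute_chain(steps, endpoints)
```

The point of the design is that the hardware never sits in an undefined state waiting for a network round-trip: the recovery action is already on the device when the chain arrives.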

Session-level access control

Permissions are defined per endpoint in the IDL and resolved once when an agent connects. The data path carries no per-call authentication overhead. A high-frequency sensor stream is not interrupted by permission checks.
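A rough sketch of connect-time resolution, assuming a role-based permission table derived from the IDL. The role names and table below are invented for illustration; only the shape of the check reflects the description above.

```python
# Hypothetical per-endpoint permissions, as they might be declared
# in the IDL.
IDL_PERMISSIONS = {
    "get_orientation": {"reader", "operator"},
    "set_sample_rate": {"operator"},
    "reset": {"operator"},
}

class Session:
    def __init__(self, roles):
        # Resolved once when the agent connects: the fixed set of
        # endpoints this session may call.
        self.allowed = frozenset(
            ep for ep, need in IDL_PERMISSIONS.items() if need & set(roles)
        )

    def can_call(self, endpoint):
        # The data path is a set lookup with no per-call
        # authentication, so a high-frequency stream is never
        # interrupted by credential checks.
        return endpoint in self.allowed

reader = Session(roles={"reader"})
operator = Session(roles={"operator"})
```

Paying the authorization cost once at connect time is what keeps the per-call overhead flat regardless of sampling rate.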

Actionable usage analytics

APIS logs call timing, payload sizes, and sampling frequencies across sessions. A dedicated analytics agent reads those logs and identifies specific changes: which calls can be merged, which polling loops are redundant, what the measured impact on latency and power is.
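One kind of finding such an agent could surface can be sketched directly: endpoints that are repeatedly called back-to-back are candidates to merge into a single device round-trip. The log fields, the `get_temperature` endpoint, and the thresholds are assumptions for illustration, not APIS output.

```python
from collections import Counter

def merge_candidates(log, window_ms=5, min_count=3):
    """Pairs of distinct endpoints repeatedly called within
    window_ms of each other are candidates to merge into one
    device round-trip."""
    pairs = Counter()
    for a, b in zip(log, log[1:]):
        close = b["t_ms"] - a["t_ms"] <= window_ms
        if close and a["endpoint"] != b["endpoint"]:
            pairs[(a["endpoint"], b["endpoint"])] += 1
    return [pair for pair, n in pairs.items() if n >= min_count]

# A polling loop that always reads orientation then temperature:
log = [
    {"t_ms": 0, "endpoint": "get_orientation"},
    {"t_ms": 1, "endpoint": "get_temperature"},
    {"t_ms": 100, "endpoint": "get_orientation"},
    {"t_ms": 101, "endpoint": "get_temperature"},
    {"t_ms": 200, "endpoint": "get_orientation"},
    {"t_ms": 201, "endpoint": "get_temperature"},
]
candidates = merge_candidates(log)
```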

Binary protocol for constrained devices

The protocol running on the device is compact and designed for microcontrollers with tight memory and CPU budgets. It does not assume a capable host operating system or a full network stack on the device.
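To make "compact" concrete, here is a hypothetical frame layout of the kind such a protocol might use: a one-byte endpoint id, a one-byte flags field, and a two-byte little-endian payload length. This layout is invented for illustration; APIS's actual wire format is not specified here.

```python
import struct

# endpoint_id (u8), flags (u8), payload_len (u16, little-endian)
HEADER = struct.Struct("<BBH")

def encode_frame(endpoint_id, payload, flags=0):
    # Fixed 4-byte header followed by the raw payload bytes.
    return HEADER.pack(endpoint_id, flags, len(payload)) + payload

def decode_frame(frame):
    endpoint_id, flags, length = HEADER.unpack_from(frame)
    payload = frame[HEADER.size:HEADER.size + length]
    return endpoint_id, flags, payload

# A call with no arguments, like get_orientation, fits in 4 bytes.
frame = encode_frame(endpoint_id=1, payload=b"")
```

A fixed-size header like this can be parsed with a single struct read and no heap allocation, which is why this style of framing suits microcontrollers without a full network stack.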

Built for engineers connecting hardware to AI

APIS is suited to teams where low latency, resource constraints, and safe failure handling are hard requirements — not nice-to-haves.

Robotics

Expose fused pose estimates and actuator commands. AI agents plan motion without managing sensor acquisition timing.

Medical Devices & Wearables

Surface filtered vitals and device state. On-device processing stays on-device, and session logging supports regulatory auditability.

Industrial Equipment

Connect controllers and sensor arrays to AI agents without writing bespoke protocol bridges.

Autonomous Platforms

Sensor fusion pipelines, actuator chains, and safety-critical sequences all run on-device. AI agents call results, not intermediate steps.

Embedded Systems Teams

Start from your existing hardware artifacts — register maps, bus databases, schematics — and generate a working AI interface without rewriting your firmware.

AI Application Developers

Physical hardware becomes a callable MCP tool. Your AI agent calls a device the same way it calls any software function — no embedded systems expertise required on the application side.

APIS is in early access

We have a working MVP. If you are connecting hardware to AI agents and want to try it, leave your email and we will reach out directly.

No pitch calls. We will send you access to the tool.