# Roboflow Workflows

## Overview

[Roboflow Workflows](https://roboflow.com/workflows/build) is a visual, no-code system for building and deploying end-to-end
computer vision pipelines. It allows you to connect models, pre- and post-processing steps, and decision logic into a single,
production-ready workflow—without writing glue code. With Workflows, models trained on Roboflow can be turned into deployed vision
applications quickly, while keeping pipelines easy to understand, debug and iterate on.

In this integration, we bring Roboflow Workflows directly to a DepthAI device so you can see results instantly in the real world.
No complicated setup. No heavy coding. Just plug in your device, connect your workflow and watch everything come to life.

And because OAK4 is a fully standalone AI camera, the entire Roboflow Workflow can run directly on-device—no extra hardware, no
cloud dependency, and no code required. It’s one of the fastest ways to turn a Roboflow project into a real-world computer vision
system.

## Usage

> For a complete, working example, see the
> [Roboflow Workflow OAK example](https://github.com/luxonis/oak-examples/tree/main/integrations/roboflow-workflow).

### Getting Started

Before running the app, you’ll need an existing [Roboflow Workflow](https://roboflow.com/workflows/build).

Follow these steps to get up and running:

 * Create your Workflow in the Roboflow web app
 * Click Deploy → Video → Live Video
 * Select Run locally
 * Clone the [Roboflow Workflow OAK example](https://github.com/luxonis/oak-examples/tree/main/integrations/roboflow-workflow)
 * Copy the required values (workspace name, workflow ID, etc.) into the app’s `config.yaml`
 * Add your Roboflow API key
 * Ensure the workflow parameters match the inputs defined in your Workflow

You can also use `config.yaml` to customize settings such as device type, frame size, and frame rate to better fit your application.
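The steps above can be summarized in a config sketch. Note that the field names below are illustrative assumptions, not the actual schema — check the example repository's `config.yaml` for the real keys:

```yaml
# Hypothetical config.yaml sketch — key names are illustrative only.
workspace_name: my-workspace        # from Deploy → Video → Live Video
workflow_id: my-workflow-id         # from Deploy → Video → Live Video
api_key: YOUR_ROBOFLOW_API_KEY      # your Roboflow API key

# Optional app settings (illustrative names)
fps: 30
frame_width: 1280
frame_height: 720

# Must match the inputs defined in your Workflow
workflow_parameters:
  confidence: 0.5
```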

### Running the App

This app runs on an OAK4 device in standalone mode. The easiest way to test it is to connect to your device locally using
[oakctl](https://docs.luxonis.com/software-v3/oak-apps/oakctl.md), then run the app directly from the example directory:

```bash
oakctl connect
oakctl app run . # assumes you are already in the example directory
```

Then open the local DepthAI Visualizer in your browser to see everything in action. The Roboflow Workflow starts automatically on
the live video stream and you can adjust parameters in real time or even switch between different workflows on the fly.

The app determines how results are displayed using simple naming conventions:

 * Output names containing "predictions" are parsed into DepthAI bounding boxes and rendered as overlays
 * Output names containing "visualization" are displayed as images
 * All other workflow outputs are ignored

If you want an output to appear in the visualizer, just give it a clear, descriptive name.
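The routing rule above can be sketched in a few lines of Python. This is a simplified illustration of the naming convention, not the app's actual implementation; the function name and return values are hypothetical:

```python
def route_output(name: str, value):
    """Decide how a workflow output should be displayed, based on its name.

    Returns a (display_mode, value) tuple, or None if the output is ignored.
    This mirrors the app's naming convention: "predictions" outputs become
    bounding-box overlays, "visualization" outputs are shown as images, and
    everything else is dropped.
    """
    if "predictions" in name:
        return ("bounding_boxes", value)
    if "visualization" in name:
        return ("image", value)
    return None
```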

> At the moment this example runs on the OAK4 CPU. An optimized version with hardware accelerator support is coming soon for
> faster on-device inference.
