# Model Card Template

For a public model on Hub, the Model Card should be structured as follows:

```markdown
# Model Details

## Model Description

TODO

- **Developed by:** TODO
- **Shared by:** TODO
- **Model type:** TODO
- **License:** TODO
- **Resources for more information:**
    - TODO

# Training Details

## Training Data

TODO

# Testing Details

## Metrics

TODO

# Technical Specifications

## Input/Output Details

- **Input**:
    - Name: TODO
    - Info: TODO
- **Output**:
    - Name: TODO
    - Info: TODO

## Model Architecture

TODO

## Throughput

### Model variant: TODO
- **Input shape**: TODO
- **Output shape**: TODO
- **Params (M)**: TODO
- **GFLOPs**: TODO
| Platform | Precision | Throughput (infs/sec) | Power Consumption (W) |
|----------|-----------|-----------------------|-----------------------|
| RVC2     | TODO      | TODO                  | TODO                  |
| RVC3     | TODO      | TODO                  | TODO                  |
| RVC4     | TODO      | TODO                  | TODO                  |

### Model variant: TODO
...

\* Benchmarked with [DAIv3](https://docs.luxonis.com/software-v3/depthai/), using 2 threads (and the DSP runtime in balanced mode for RVC4).
\* Parameters and FLOPs are obtained from the [onnx-tool](https://github.com/ThanatosShinji/onnx-tool) package.

## Quantization

TODO

# Utilization

TODO

## Example
<link to minimal working example>
```

## Additional guidance

Some models require very specific postprocessing steps to produce a meaningful output. We recommend expanding on those under the
Utilization section so other community members can use them as well. Preferably, this would be a combination of code snippets and
explanatory comments.
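As an illustration of the kind of snippet-plus-comments postprocessing documentation we mean, here is a minimal, hedged sketch for a hypothetical classification model whose raw output is a flat list of per-class logits. The names (`raw_output`, `NUM_CLASSES`) and the softmax step are illustrative assumptions, not part of any Luxonis API; your model's actual postprocessing may differ.

```python
import math

def softmax(logits):
    """Convert raw logits into class probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw output: one logit per class (NUM_CLASSES = 3 here).
raw_output = [2.0, 1.0, 0.1]

probs = softmax(raw_output)
# Pick the most likely class index.
best_class = max(range(len(probs)), key=probs.__getitem__)
```

A short comment next to each step, as above, is usually enough for other users to adapt the logic to their own pipeline.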

If you already have a working example, feel free to share it as a link, e.g. through a GitHub Gist, a repository, or similar. This
should be a minimal working example that shows passing data into the model through the DepthAI pipeline, parsing the raw output,
and optionally some basic visualization (if relevant).

We already provide some parsers you can use for this in the [depthai-nodes](https://github.com/luxonis/depthai-nodes)
library. If none of them are applicable and you have written a new parsing node, do not hesitate to open a pull request to the
repository and help with the development.

Important: to make your example as straightforward as possible to use, we strongly encourage you to state the DepthAI version
(and, if relevant, the depthai-nodes version) you used during testing. This way others can easily replicate your environment.
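One simple way to look up the exact versions to state in your card is to query the installed package metadata. The sketch below uses only the standard library; the package names are the ones mentioned in this guide, and any package that is not installed is reported as missing rather than guessed.

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version of `package`, or None if it is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# Report the versions to copy into the model card.
for pkg in ("depthai", "depthai-nodes"):
    print(pkg, installed_version(pkg) or "not installed")
```

Running this in the environment you tested with gives the exact version strings to paste into the card.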
