Integrating DepthAI into products¶
This guide serves as an introduction to integrating DepthAI (our Spatial AI platform) into your own custom products. When designing the PCB, please also see the OAK Design Guide.
Difficulty of integrating DepthAI into products¶
When designing the DepthAI platform, we always considered long-term integration needs, with the goal of making it as simple as possible to integrate into other products.
What is the OAK System on Module (SoM)?¶
The OAK SoM is a small form-factor PCB built around the powerful Robotics Vision Core 2. The Robotics Vision Core 2 has 16 powerful SHAVE cores and also features the Neural Compute Engine, a dedicated hardware accelerator for deep neural network inference. In addition, the OAK-SoM-IoT and OAK-SoM-Pro have NOR flash, which can be used as an alternative to USB boot. The SoM itself is a very complex, 12-layer PCB, so it serves as an abstraction layer: customers can build their own device around it instead of designing the whole system from scratch. SoMs can also be used in standalone mode without a host computer, although not all devices support that use case.
We have 3 types of SoM devices: the OAK-SoM, the OAK-SoM-IoT, and the OAK-SoM-Pro.
The main differences between them are:
NOR flash: the base OAK-SoM does not have NOR flash, while the other two have 1 Gbit NOR flash by default (125 Mbit in some iterations),
Supported boot modes: for example, the OAK-SoM-Pro additionally supports SD card and Ethernet (EEPROM) boot.
Just like our software and our libraries, our hardware is open source too. It is not a black box: you can see exactly how our devices (PCBs) are designed and change them however you like. Even high-school students have been able to design their own baseboards by modifying existing open-source designs. Most of the complexity is on the SoM, so the baseboard can be a simple 2-layer PCB.
Here is an example of a baseboard without the SoM:
And here is an example of a SoM on a baseboard, the OAK-FFC-4P:
NOR Flash and Powering¶
The OAK-SoM-IoT and OAK-SoM-Pro have QSPI NOR flash, which supports fast random-access reads and is used to store and run code. This is the key enabler of the standalone use case.
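When sizing a standalone application for the on-board flash, it helps to remember that flash parts are marketed in bits with decimal prefixes. A minimal sketch of the arithmetic (the 1 Gbit figure is from this guide; the helper names and the 40 MB application size are hypothetical examples):

```python
# Flash capacity is marketed in bits with decimal (SI) prefixes,
# so 1 Gbit = 1_000_000_000 bits = 125_000_000 bytes.
GBIT = 1_000_000_000

def flash_capacity_bytes(size_gbit: float) -> int:
    """Convert a marketed flash size in Gbit to bytes."""
    return int(size_gbit * GBIT / 8)

def fits_in_flash(app_size_bytes: int, flash_gbit: float = 1.0) -> bool:
    """Check whether a compiled standalone application fits in flash."""
    return app_size_bytes <= flash_capacity_bytes(flash_gbit)

print(flash_capacity_bytes(1.0))      # 125000000 bytes in a 1 Gbit part
print(fits_in_flash(40_000_000))      # a hypothetical 40 MB app: True
```

The same check applies to the smaller flash variants; just pass the appropriate size in Gbit.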
Power consumption can vary depending on the application. A stereo vision application running Mobilenet-SSD V2 at 30 FPS typically consumes about 2.5 W, but more computationally heavy applications can consume up to 5 W. Most of this power is consumed by the Robotics Vision Core 2.
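For battery-powered products, these figures translate directly into runtime estimates. A back-of-envelope sketch (the 2.5 W typical and 5 W worst-case loads are from this guide; the battery capacity and regulator efficiency are hypothetical assumptions):

```python
def runtime_hours(battery_wh: float, load_w: float, efficiency: float = 0.9) -> float:
    """Estimated runtime for a given average load.

    efficiency models regulator/conversion losses (assumed 90% here).
    """
    return battery_wh * efficiency / load_w

# Hypothetical 5 Ah cell at 3.7 V nominal -> 18.5 Wh
battery_wh = 3.7 * 5.0

print(round(runtime_hours(battery_wh, 2.5), 1))  # typical load: 6.7 h
print(round(runtime_hours(battery_wh, 5.0), 1))  # worst case: 3.3 h
```

Swapping in your own battery capacity and measured average load gives a first-order runtime estimate before any hardware is built.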
For more information, see the respective datasheet in our GitHub hardware repository (OAK-SoM Datasheet).
We’re always happy to help with code or other questions you might have.