This application processes live video streams, identifies objects with a pre-trained YOLOv6 model, and triggers actions based on those detections or on user input. It does this by combining real-time object detection with the Hub's event interface and a separate user interface.

## Object Detection

The application runs a pre-trained YOLOv6 model on each frame of the video stream to detect objects in real time. A hedged sketch of this detection loop appears after this overview.

## User Notifications

Notifications can be sent from a separate user interface. These notifications are used to trigger events or actions within the system.

## Hub Integration

The application integrates with the Hub, a platform for communicating with and controlling robotic systems. Based on the detected objects or received notifications, the application sends events to the Hub.

## Live Stream Monitoring

The application exposes a live stream of the video analysis, so users can watch the feed and see the detected objects in real time.
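The following is a minimal sketch of how the detection loop and Hub integration might fit together, assuming a webcam source and two hypothetical helpers, `run_yolov6()` and `send_hub_event()`, standing in for the project's actual YOLOv6 inference wrapper and Hub client (neither name comes from the project):

```python
import cv2


def run_yolov6(frame):
    """Placeholder for the project's YOLOv6 inference call.

    Expected to return a list of (label, confidence, box) tuples for the
    objects detected in the frame; this stub detects nothing.
    """
    return []


def send_hub_event(event):
    """Placeholder for the project's Hub client; here it just logs the event."""
    print("hub event:", event)


def monitor_stream(source=0, targets=("person",)):
    """Read frames from a live source, run detection, and publish a Hub event
    whenever a target label is detected."""
    capture = cv2.VideoCapture(source)
    try:
        while capture.isOpened():
            ok, frame = capture.read()
            if not ok:
                break
            for label, confidence, box in run_yolov6(frame):
                if label in targets:
                    send_hub_event({
                        "type": "object_detected",
                        "label": label,
                        "confidence": float(confidence),
                    })
    finally:
        capture.release()


if __name__ == "__main__":
    monitor_stream()
```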
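For the user-interface notifications, one plausible shape (not taken from the project) is a small HTTP endpoint that accepts a notification and forwards it to the Hub as an event. The sketch below uses Flask; the `/notify` route, payload format, and `send_hub_event()` helper are assumptions:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


def send_hub_event(event):
    """Placeholder for the project's Hub client, as in the previous sketch."""
    print("hub event:", event)


@app.route("/notify", methods=["POST"])
def notify():
    """Accept a notification from the user interface and forward it to the Hub."""
    payload = request.get_json(silent=True) or {}
    send_hub_event({"type": "user_notification", "payload": payload})
    return jsonify({"status": "forwarded"})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```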
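The live monitoring view could be served in several ways; a common pattern, shown below as an assumption rather than the project's actual approach, is an MJPEG stream of annotated frames. The `/stream` route, the port, and the fixed placeholder box are illustrative only; in the real application the boxes would come from the YOLOv6 detections:

```python
import cv2
from flask import Flask, Response

app = Flask(__name__)


def annotated_frames(source=0):
    """Yield JPEG-encoded frames with detection boxes drawn on them."""
    capture = cv2.VideoCapture(source)
    try:
        while capture.isOpened():
            ok, frame = capture.read()
            if not ok:
                break
            # A fixed box is drawn purely as a placeholder; the real
            # application would draw the YOLOv6 detections here.
            cv2.rectangle(frame, (50, 50), (200, 200), (0, 255, 0), 2)
            cv2.putText(frame, "placeholder", (50, 45),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
            ok, jpeg = cv2.imencode(".jpg", frame)
            if not ok:
                continue
            yield (b"--frame\r\n"
                   b"Content-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")
    finally:
        capture.release()


@app.route("/stream")
def stream():
    """Serve the annotated frames as a multipart MJPEG stream for monitoring."""
    return Response(annotated_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

One reason MJPEG is a convenient choice for a monitoring view is that most browsers can render such a stream directly in an `<img>` tag pointed at the stream URL, with no client-side code required.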