Sample Design

General Design

The sample can scale to multiple regional offices in a distributed design. Each regional office manages an array of sensors. The sensor information is processed locally for low-latency responses and then aggregated at the central office in the cloud for summary and presentation. Each office has local storage for recording sensor data, and also uploads a subset of the recordings to the central office for permanent archival or post-processing.
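As a rough illustration of the office-to-cloud flow, the sketch below shows a regional office periodically pushing a selected subset of its local recordings to the central office for archival. The endpoint URL, storage path, and selection criteria are placeholders for illustration, not the sample's actual upload service.

```python
# Hypothetical sketch of the office-to-cloud upload flow; the endpoint URL,
# storage path, and selection logic are illustrative, not the sample's code.
import glob
import time
import requests

CLOUD_ARCHIVE_URL = "http://cloud-office.example.com/api/upload"  # assumed central-office endpoint
LOCAL_RECORDING_DIR = "/var/recordings"                           # assumed local storage path

def select_recordings_for_upload():
    """Pick the subset of local recordings worth archiving centrally,
    e.g. segments that triggered an alert (selection logic is illustrative)."""
    return [p for p in glob.glob(LOCAL_RECORDING_DIR + "/*.mp4") if "alert" in p]

def upload_to_central_office():
    for path in select_recordings_for_upload():
        with open(path, "rb") as f:
            # Stream the recording to the central office for permanent archival.
            requests.post(CLOUD_ARCHIVE_URL, files={"recording": f}, timeout=60)

if __name__ == "__main__":
    while True:
        upload_to_central_office()
        time.sleep(300)  # re-scan local storage every 5 minutes
```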

Office Design

Within each regional office, multiple services are designed around database operations: each service retrieves its work orders by querying the database and submits its processing results back into the database.
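The sketch below shows this database-centric service pattern in minimal form: poll the database for pending work, process it, and write the result back. The database client, index names, and field names are assumptions for illustration and do not reflect the sample's actual schema.

```python
# Minimal sketch of the database-centric service loop; the index names,
# field names, and query are illustrative assumptions, not the sample's schema.
import time
from elasticsearch import Elasticsearch

db = Elasticsearch(["http://db-service:9200"])  # assumed in-office database endpoint

def process(work_order):
    # Placeholder for the service-specific work (analytics, alerting, upload, ...).
    return {"order": work_order, "outcome": "processed"}

def service_loop():
    while True:
        # 1. Query the database for pending work orders.
        hits = db.search(index="work-orders",
                         body={"query": {"term": {"status": "pending"}}})["hits"]["hits"]
        for hit in hits:
            result = process(hit["_source"])
            # 2. Submit the processing result back into the database.
            db.index(index="results", body=result)
            db.update(index="work-orders", id=hit["_id"],
                      body={"doc": {"status": "done"}})
        time.sleep(5)
```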

For example, the camera discovery service starts by registering itself in the database and then scans the network for available cameras. For each discovered camera, the discovery service pairs it with the provisioning parameters stored in the database and registers the camera in the database. At the same time, the analytics instances search for available idle cameras and attach to them to start streaming, recording, and analytics. The processed results are sent back to the database for post-processing (such as creating alerts or performing smart upload).
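To make the discovery-and-attach flow concrete, here is a small sketch in which a discovery service registers cameras and an analytics instance claims an idle camera before starting its pipeline. The camera fields, states, and helper functions are assumptions for illustration; in the actual design these operations go through the office database rather than an in-memory dictionary.

```python
# Illustrative sketch of camera discovery and attach; camera fields, states,
# and helpers are assumptions, not the sample's actual implementation.

cameras = {}  # stand-in for the camera records kept in the office database

def register_camera(ip, provisioning):
    """Discovery service: pair a discovered camera with its provisioning
    parameters and register it as idle."""
    cameras[ip] = {"rtsp": f"rtsp://{ip}:554/stream", "state": "idle", **provisioning}

def claim_idle_camera():
    """Analytics instance: find an idle camera and mark it busy before
    starting streaming, recording, and analytics on it."""
    for ip, cam in cameras.items():
        if cam["state"] == "idle":
            cam["state"] = "streaming"  # in the real design this is a database update
            return cam
    return None

# Example flow: discovery registers a simulated camera, analytics attaches to it.
register_camera("192.168.1.10", {"resolution": "1080p", "fps": 15})
camera = claim_idle_camera()
if camera:
    print("analytics attached to", camera["rtsp"])
```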

Analytics Pipeline

The analytics processing is enabled by the Intel® OpenVINO™ Toolkit inference engines, packaged as part of the Open Visual Cloud software stacks. The input to the analytics pipeline is the RTSP stream from a simulated or real IP camera. The pipeline performs object detection or people/crowd counting analytics (depending on the sample scenario) while recording the camera streams into a set of recording files. The processed metadata is saved to the database for additional actions, such as triggering an alert or uploading to the cloud.
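The sketch below illustrates what such a pipeline can look like when built from Intel DL Streamer (GVA) GStreamer elements on top of OpenVINO™ inference: one branch runs detection and publishes JSON metadata, while a second branch records the stream into segmented files. The RTSP URL, model path, recording location, and the exact element chain are assumptions for illustration and may differ from the sample's actual pipeline definition.

```python
# Sketch of an object-detection pipeline on an RTSP input using Intel DL Streamer
# (GVA) GStreamer elements. The RTSP URL, model path, and recording location are
# placeholders; the sample's actual pipeline definition may differ.
import shlex
import subprocess

RTSP_URL = "rtsp://camera-simulator:8554/city-traffic"          # assumed camera stream
DETECTION_MODEL = "/models/person-vehicle-bike-detection.xml"   # assumed OpenVINO IR model
RECORDING = "/var/recordings/segment_%05d.mp4"                  # assumed recording target

pipeline = (
    f"rtspsrc location={RTSP_URL} ! rtph264depay ! h264parse ! tee name=t "
    # Branch 1: decode, run inference, and publish detection metadata as JSON.
    f"t. ! queue ! avdec_h264 ! videoconvert "
    f"   ! gvadetect model={DETECTION_MODEL} device=CPU "
    f"   ! gvametaconvert format=json "
    f"   ! gvametapublish method=file file-path=/tmp/detections.json "
    f"   ! fakesink "
    # Branch 2: record the original stream into time-limited segments.
    f"t. ! queue ! splitmuxsink location={RECORDING} max-size-time=60000000000"
)

subprocess.run(["gst-launch-1.0"] + shlex.split(pipeline), check=True)
```

In this layout the tee keeps recording independent of inference, so a slow detection branch does not stall the recording branch.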

See Also
