Athena‐Aqua Wiki Page - Atoth8411/CSSE4011-Athena-Aqua GitHub Wiki

Project Overview


Project Description

The system monitors a restricted zone, detecting and localizing unauthorized human presence using an embedded AI camera system. A servo-controlled ESP32-CAM scans the space while an ultrasonic sensor provides distance data within a specified radius. When an unauthorized person is detected, their location is logged and a blockchain-based transaction records the event immutably for auditing and trust.

Team Member Responsibilities

Jayden Maynard: Jayden is responsible for the implementation of image processing and inference logic on the Jetson BOXER-8251AI. This includes receiving and decoding the MJPEG video stream from the ESP32-CAM, performing human detection and classification, and identifying unauthorized individuals in real time. Jayden will also develop the logic to calculate and send servo correction angles via BLE to the turret node based on visual target tracking.
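The servo-correction step Jayden describes can be sketched as follows. This is a minimal illustration assuming a simple pinhole camera model; the frame width and horizontal field of view below are nominal placeholder values, not measured ESP32-CAM parameters.

```python
import math

def correction_angle_deg(cx_px: float, frame_w_px: int = 320,
                         hfov_deg: float = 60.0) -> float:
    """Map the detected target's horizontal pixel centre to a servo
    correction angle under a pinhole model.

    cx_px      -- x-coordinate of the bounding-box centre in the frame
    frame_w_px -- frame width in pixels (placeholder value)
    hfov_deg   -- assumed horizontal field of view (placeholder value)
    """
    # Offset of the target from the optical axis, in pixels.
    offset = cx_px - frame_w_px / 2
    # Equivalent focal length in pixels for the assumed FOV.
    f_px = (frame_w_px / 2) / math.tan(math.radians(hfov_deg / 2))
    # Positive result means the turret should rotate toward +x.
    return math.degrees(math.atan2(offset, f_px))
```

A target at the frame centre yields 0°, and a target at the right edge yields half the assumed field of view.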

Attila Toth: Attila is responsible for implementing all communication protocols used between the project's embedded devices. This includes the Wi-Fi streaming from the ESP32-CAM board to the Jetson BOXER-8251AI, and all BLE GATT communication for transmitting the distance and angle. The BLE GATT work covers communication of the polar position to the base node for display on the dashboard, and communication from the Jetson to update the angle on the sensor node. Additional responsibilities include the calculations required to derive the change in angle from the position difference of the detection in the image frame, and methods to control the device through console commands (e.g. starting and stopping detection).

Newton Long: Newton is responsible for developing the hardware control logic on the turret node. This includes implementing PWM-based servo control on the nRF52840-DK, managing ultrasonic distance sensing, and coordinating these with the BLE communication flows. Newton will ensure accurate and smooth rotation of the turret in response to tracking commands and will capture distance data for each scanned angle. He will also oversee the design and integration of the dashboard system, including formatting and transmitting telemetry and blockchain events to the PC for real-time visualization.

System Overview

Sensor Turret Node (ESP32-CAM + nRF52840-DK)

  • Captures live MJPEG video using ESP32-CAM
  • Measures distance with ultrasonic sensor
  • Rotates using servo motor (controlled by nRF52840)
  • Receives tracking commands via BLE from Jetson
  • Sends telemetry (servo angle + distance) to base node

Inference Node (Jetson BOXER-8251AI)

  • Receives and processes video stream from ESP32-CAM
  • Detects and classifies individuals (authorized vs unauthorized)
  • Sends servo control commands to turret via BLE (this will require a Bluetooth dongle)
  • Forwards detection events to base node

Base Node (M5-Core2)

  • Receives telemetry from the turret node
  • Computes target localization in polar coordinates (r, θ)
  • Generates and signs blockchain event logs
  • Sends data to PC over UART for storage and visualization
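Although the base-node firmware itself runs on the M5-Core2, the localization step above amounts to a straightforward polar-to-Cartesian conversion when the dashboard needs (x, y) plotting coordinates. A minimal sketch (the 0° reference direction is our assumption):

```python
import math

def polar_to_xy(r_cm: float, theta_deg: float) -> tuple[float, float]:
    """Convert the turret's (r, theta) reading into Cartesian
    coordinates for plotting, with theta measured counter-clockwise
    from the turret's assumed 0-degree reference axis."""
    theta = math.radians(theta_deg)
    return (r_cm * math.cos(theta), r_cm * math.sin(theta))
```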

Dashboard Node (PC)

  • Acts as a Wi-Fi access point
  • Hosts real-time dashboard (Python GUI)
  • Displays target location and event history
  • Stores on blockchain (tamper-proof logging)

Project Block diagram

DIKW Pyramid Abstraction

System Integration

The system integrates multiple embedded nodes, each performing a specialized function within a real-time human detection and localization pipeline. The nodes are connected through a combination of Wi-Fi, BLE, UART, and MQTT/WebSocket protocols to form a cohesive, responsive IoT system.

The ESP32-CAM and ultrasonic sensor are mounted on a servo-controlled rotating turret. The nRF52840-DK on the turret controls servo motion and captures distance readings, acting as the local control and telemetry hub. The ESP32-CAM streams MJPEG video over Wi-Fi to the Jetson BOXER-8251AI, which performs real-time detection and classification (e.g., OpenCV, PyTorch, ResNet models).

Upon detecting an unauthorized individual, the Jetson sends BLE commands to the turret node; each packet carries the servo-angle correction needed to keep tracking the target. In a separate thread, the turret logs the current angle and ultrasonic distance and transmits this telemetry to the base node.

The base node calculates the (r, θ) position of the detected individual, logs this data immutably via a blockchain transaction, and forwards it to the PC via UART. The PC visualizes real-time system data using a Python or Grafana-based dashboard and stores the event history.
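The tamper-evident logging described above can be illustrated with a simple hash chain, where each entry embeds the SHA-256 digest of the previous one, so altering any record invalidates everything after it. This is a simplified sketch; a real blockchain deployment would add signatures and a consensus/ledger backend.

```python
import hashlib
import json

GENESIS = "0" * 64  # sentinel hash for the first entry

def make_entry(prev_hash: str, r_cm: float, theta_deg: float, t: int) -> dict:
    """Create one detection-event record chained to its predecessor."""
    body = {"prev": prev_hash, "r_cm": r_cm, "theta_deg": theta_deg, "t": t}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return {**body, "hash": digest}

def verify_chain(entries: list[dict]) -> bool:
    """Recompute every hash and check the prev-links are intact."""
    prev = GENESIS
    for e in entries:
        body = {k: v for k, v in e.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if e["prev"] != prev or digest != e["hash"]:
            return False
        prev = e["hash"]
    return True
```

Tampering with any logged (r, θ) value makes `verify_chain` fail, which is the auditing property the project relies on.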

Wireless Network

Communication Summary

| Source | Destination | Medium | Purpose |
| --- | --- | --- | --- |
| ESP32-CAM | Jetson BOXER-8251AI | Wi-Fi (HTTP) | MJPEG video stream for real-time person detection |
| Jetson | Turret Node (nRF52840-DK) | BLE GATT | Send servo angle commands to track unauthorized target |
| Turret Node | Base Node (M5-Core2) | BLE GATT | Transmit servo angle and ultrasonic distance |
| Base Node | PC | UART | Send localized coordinates and blockchain event data |
| Base Node / PC | Dashboard (UI) | HTTPS | Display live localization and log detection events |

The images are transferred to the Jetson as an MJPEG stream over Wi-Fi (HTTP), routed through the PC because the Jetson lacks built-in Wi-Fi capability. The message protocol diagram for this communication can be seen below.
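At the byte level, an MJPEG stream is a sequence of complete JPEG images (wrapped in multipart headers): each frame starts with the SOI marker `FF D8` and ends with the EOI marker `FF D9`. A minimal frame splitter for raw bytes read off the socket might look like this (an illustrative sketch, not the project's actual receiver):

```python
def extract_jpeg_frames(buf: bytes) -> tuple[list[bytes], bytes]:
    """Split raw MJPEG bytes into complete JPEG frames.

    Returns (frames, leftover): leftover is the tail of a partial
    frame, to be prepended to the next chunk read from the socket.
    """
    frames: list[bytes] = []
    start = 0
    while True:
        soi = buf.find(b"\xff\xd8", start)      # start-of-image marker
        if soi < 0:
            return frames, b""
        eoi = buf.find(b"\xff\xd9", soi + 2)    # end-of-image marker
        if eoi < 0:
            return frames, buf[soi:]            # incomplete final frame
        frames.append(buf[soi:eoi + 2])
        start = eoi + 2
```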

The servo angle, its updates, and the distance measurements are transferred over BLE GATT, with the nRF52840-DK acting as the server and the Jetson and M5-Core2 as clients. The Jetson only writes the new angle to turn the servo to, while the nRF52840-DK sends the current angle and distance measurement to the M5-Core2. The message protocol diagrams for both parts of this network can be seen below.
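The angle-plus-distance telemetry characteristic could use an on-air layout like the following. This encoding (little-endian uint16 angle in centi-degrees plus uint16 distance in millimetres, 4 bytes total, well within a single GATT payload) is a hypothetical choice for illustration, not the project's defined packet format.

```python
import struct

def pack_telemetry(angle_deg: float, distance_cm: float) -> bytes:
    """Encode angle (centi-degrees) and distance (mm) as <uint16, uint16>."""
    return struct.pack("<HH", round(angle_deg * 100), round(distance_cm * 10))

def unpack_telemetry(payload: bytes) -> tuple[float, float]:
    """Decode the 4-byte payload back into (angle_deg, distance_cm)."""
    centi_deg, mm = struct.unpack("<HH", payload)
    return centi_deg / 100, mm / 10
```

Fixed-point integers avoid float-endianness concerns between the nRF52840 firmware and the Python-side clients, at the cost of 0.01° / 1 mm resolution limits.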


Deliverables and Key Performance Indicators

| Deliverable | Key Performance Indicator (KPI) |
| --- | --- |
| 1. Real-time Person Detection | Achieve ≥90% accuracy in classifying individuals as authorized or unauthorized in live video |
| 2. Servo-Based Target Tracking | Servo rotates to within ±15° of the detected target’s angle based on vision and BLE commands |
| 3. Localization via Distance + Angle | Compute target position (r, θ) with error ≤ ±20 cm in distance and ±15° in angle |
| 4. Live Dashboard Visualization | Display detection events and target location on dashboard within ≤1 s of occurrence |
| 5. Blockchain Event Logging | Log 100% of unauthorized detections as immutable blockchain transactions without data loss |

Appendix: Diagrams of System setup top and side view