PE1100N

1. Flash Image

The PE1100N only supports flashing the image from Ubuntu; flashing from Windows is not supported.

1.1 Recovery Mode

  1. System Requirement

    • Linux Host Computer (x86 Ubuntu 18.04 and above)

    • Micro USB cable

  2. Enter Force Recovery Mode
    For the PE1100N box, the Flash Port is number ❷ and the Force Recovery Button is number ❹.

Please perform the following steps to put the PE1100N into force recovery mode:

[PE1100N]

  1. Power off the PE1100N and remove the power cable.
  2. Connect the Host Computer to the PE1100N Flash Port (number ❷) with a Micro USB cable.
  3. Press and hold the Force Recovery Button (number ❹).
  4. Connect the power cable and power on the PE1100N.
  5. After 3 seconds, release the Force Recovery Button.

[Host Computer]

To confirm that the PE1100N has entered force recovery mode, run the command ‘lsusb’ on the Host Computer. If an “NVIDIA Corp.” device is listed, the PE1100N is in force recovery mode.

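For example, you can filter the output (optional; the exact device name and USB ID depend on the installed module):

lsusb | grep -i nvidia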

 

1.2 Flash Image

[Host Computer]

  1. Extract BSP file on Host Computer.

    BSP file examples:

System | File Name
PE1100N Orin Nano 4GB | PE1100N Orin Nano 4GB JetPack x.x.x Image Vx.x.x
PE1100N Orin Nano 8GB | PE1100N Orin Nano 8GB JetPack x.x.x Image Vx.x.x
PE1100N Orin NX 16GB | PE1100N Orin NX 16GB JetPack x.x.x Image Vx.x.x
sudo tar xvpf PE1100N_JXANS_Orin-NX-16GB_JetPack-ssd-5.1.1_L4T-35.3.1_v0.1.3-debug-20230503.tar.gz
  1. Change to the extracted folder and flash the image:
cd mfi_PE110xxxxxxxx  
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --erase-all --flash-only --showlogs --network usb0
  1. Flashing the image takes around 15 minutes.

[PE1100N]
After about 15 minutes, the PE1100N will reboot automatically.

NOTE :

  1. Do not use a USB hub between the Host Computer and the PE1100N.
  2. You can monitor the flashing progress from the logs in “mfi_PE1100N-orin/initrdlog” (see the example below).
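For example, to follow the most recent log (a sketch, assuming the logs are written to that folder):

tail -f "$(ls -t mfi_PE1100N-orin/initrdlog/* | head -n 1)"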

 

2. LTE Setting

2.1 How to set up a SIM card

NOTE: Insert the SIM card, then power on the device.

  1. Click to open the settings bar at the top right of the desktop.

  2. Click Mobile Broadband Settings.

  3. Switch on Mobile Broadband to enable the LTE function.

  4. Open the Network bar below the IMEI.

  5. Select Add new connection.

  6. Click Next at the top right of the window.

  7. Select your country.

  8. Select your provider.

  9. Select your plan.

  10. Click Apply at the top right of the window.

  11. The device will use this configuration to connect to the internet automatically.

  12. The current signal strength also appears on the settings bar.
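Optionally, you can also check the modem from a terminal with ModemManager’s mmcli tool (an assumption: it is present, as on standard Ubuntu desktop images; it is not required for the GUI flow above):

List detected modems: mmcli -L
Show the status of modem 0: mmcli -m 0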

 

2.2 How to switch SIM card slot

  1. Open a Terminal on the desktop.

  2. Type the command ‘sudo PE1100N-config’. If prompted for a password, enter your user password and press Enter.

  3. Select Ok.

  4. Select “5. SIM (Select SIM slot)” and press Enter.

  5. Select “1 SIM1” or “2 SIM2” and press Enter.

  6. Select Finish and press Enter.

  7. Select Yes and press Enter to reboot.

 

2.3 How to check SIM card status

  1. Open a Terminal on the desktop.

  2. Type the command ‘sudo PE1100N-config’. If prompted for a password, enter your user password and press Enter.

  3. Select Ok.

  4. Select “7. Configuration” and press Enter.

  5. Check the SIM slot and SIM state on the screen.

 

3. Switching M.2, COM Mode, and SIM

  1. Open a Terminal on the desktop
  2. Type the command ‘sudo PE1100N-config’. If prompted for a password, enter your user password and press Enter.
  3. Select Ok
  4. Select the M.2, COM Mode, or SIM setting you want to switch

4. Switch DIO Instructions

  1. Open a Terminal on the desktop
  2. Use the commands below to switch DIO (see the usage example below)
Set one DO value: sudo dio_out #DO_Num #Value
Get one DI value: sudo dio_in #DI_Num
Get all DO values: sudo dio_out
Get all DI values: sudo dio_in
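For example, a hypothetical session (assuming DO/DI channels are numbered starting from 1; check the hardware documentation for the actual numbering and value range):

Set DO channel 1 high: sudo dio_out 1 1
Set DO channel 1 low: sudo dio_out 1 0
Read DI channel 1: sudo dio_in 1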

5. LED Functions

5.1 Turn on/off LED

  1. Open a Terminal on the desktop
  2. The commands below are examples of turning on an LED (echo 1 turns the LED on, echo 0 turns it off)
ETH0: sudo echo 1 > /sys/class/leds/eth0-led/brightness
ETH1: sudo echo 1 > /sys/class/leds/eth1-led/brightness
UART0: sudo echo 1 > /sys/class/leds/uart0-led/brightness
UART1: sudo echo 1 > /sys/class/leds/uart1-led/brightness
CAN: sudo echo 1 > /sys/class/leds/can-led/brightness
WIFI: sudo echo 1 > /sys/class/leds/wifi-led/brightness
LTE: sudo echo 1 > /sys/class/leds/lte-led/brightness
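If these commands return “Permission denied” when run from a regular user account, it is because the redirection is performed by the non-root shell rather than by sudo; a common general-purpose workaround (not specific to PE1100N) is to use tee, for example:

echo 1 | sudo tee /sys/class/leds/eth0-led/brightness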

5.2 Disable kernel LED function

  1. Open a Terminal on the desktop
  2. Type the commands below to disable the kernel LED function
ETH0: sudo echo none > /sys/class/leds/eth0-led/trigger
ETH1: sudo echo none > /sys/class/leds/eth1-led/trigger
UART0: sudo echo none > /sys/class/leds/uart0-led/trigger
UART1: sudo echo none > /sys/class/leds/uart1-led/trigger
CAN: sudo echo none > /sys/class/leds/can-led/trigger
WIFI: sudo echo none > /sys/class/leds/wifi-led/trigger
LTE: sudo echo none > /sys/class/leds/lte-led/trigger

5.3 Recover Kernel LED function

  1. Open a Terminal on the desktop
  2. Type the commands below to recover the kernel LED function
ETH0: sudo echo eth0 > /sys/class/leds/eth0-led/trigger
ETH1: sudo echo eth1 > /sys/class/leds/eth1-led/trigger
UART0: sudo echo uart0 > /sys/class/leds/uart0-led/trigger
UART1: sudo echo uart1 > /sys/class/leds/uart1-led/trigger
CAN: sudo echo can > /sys/class/leds/can-led/trigger
WIFI: sudo echo wifi > /sys/class/leds/wifi-led/trigger
LTE: sudo echo lte > /sys/class/leds/lte-led/trigger

6. Others

6.1 ASUS IoT API (DIO Function Control)

Asus_API_Programming_Guide_v1.05_20240223.pdf

ASUS API (Library, Header files, Sample code)

6.2 PE1100N console port connect to PC - USB to UART Bridge

  1. When the PE1100N console port is connected to a PC, a “USB to UART Bridge” device with an exclamation mark may appear in Device Manager.

The baud rate is 115200.

  1. In this case, install the following driver (F81232_231115_whql.zip). After restarting, you will be able to use the console port to read logs normally.

F81232_231115_whql.zip

6.3 OS and NVIDIA SDK Version Mapping Table

Please refer to the table below for the recommended OS version and NVIDIA SDK versions for each image version.

PE1100N official release version | L4T | Ubuntu | JetPack | CUDA | DeepStream SDK | cuDNN | TensorRT
V1.0.0 | 35.3.1 | 20.04 | 5.1.1 | 11.4.19 | 6.2 | 8.6.0 | 8.5.2
V1.1.1 | 35.4.1 | 20.04 | 5.1.2 | 11.4.19 | 6.3 | 8.6.0 | 8.5.2
V2.0.5 | 36.3.0 | 22.04 | 6.0 | 12.2.2 | 6.4/7.0 | 8.9.4 | 8.6.2
V2.0.6 | 36.3.0 | 22.04 | 6.0 | 12.2.2 | 6.4/7.0 | 8.9.4 | 8.6.2
V2.0.7 | 36.4.0 | 22.04 | 6.1 | 12.6.10 | 7.1 | 9.3.0 | 10.3.0
V2.0.11 | 36.4.3 | 22.04 | 6.2 | 12.6.10 | 7.1 | 9.3.0 | 10.3.0
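To check the L4T and JetPack versions on a running device, you can use standard L4T commands, for example (a quick check, not specific to PE1100N):

$ cat /etc/nv_tegra_release
$ dpkg -l | grep nvidia-jetpack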

6.4 Backup and Restore OS image

The backup and restore steps are applicable to JetPack 6.0 and above.

  1. Installation Prerequisites
$ sudo apt update 
$ sudo apt install -y abootimg binfmt-support binutils cpio cpp device-tree-compiler dosfstools \ 
iproute2 iputils-ping lbzip2 libxml2-utils netcat nfs-kernel-server openssl python3-yaml qemu-user-static \ 
rsync sshpass udev uuid-runtime whois xmlstarlet zstd lz4 chrpath diffstat xxd wget bc 
  1. Download and prepare the Linux_for_Tegra source code
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v3.0/release/jetson_linux_r36.3.0_aarch64.tbz2
$ tar xf jetson_linux_r36.3.0_aarch64.tbz2
  1. Download and prepare the sample root file system
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v3.0/release/tegra_linux_sample-root-filesystem_r36.3.0_aarch64.tbz2
$ sudo tar xpf tegra_linux_sample-root-filesystem_r36.3.0_aarch64.tbz2 -C Linux_for_Tegra/rootfs/
  1. Download the backup and restore patch file for PE1100N

    https://drive.google.com/file/d/1fw7jiq7SIMnj18erY5YQA7kF2xogE0uU/view?usp=sharing

  2. Overwrite the original BSP with the patch files

$ tar zxf PE1100N_r3630_backup_restore_patch.tar.gz 
$ cp -r PE1100N_r3630_backup_restore_patch/Linux_for_Tegra/* Linux_for_Tegra/ 
$ sudo tar -xf PE1100N_r3630_backup_restore_patch/kernel_module/kernel_supplements.tbz2 -C Linux_for_Tegra/rootfs/ 
$ sudo tar -xf PE1100N_r3630_backup_restore_patch/kernel_module/kernel_oot_modules.tbz2 -C Linux_for_Tegra/rootfs/ 
  1. Create the backup image

    1. Enter Force Recovery Mode

      • Power off the PE1100N and remove the power cable.
      • Connect the Host Computer to the PE1100N Flash Port with a Micro USB cable.
      • Press and hold the Force Recovery Button.
      • Connect the power cable and power on the PE1100N.
      • After 3 seconds, release the Force Recovery Button.
    2. Run this command from the Linux_for_Tegra folder:

      $ cd Linux_for_Tegra 
      $ sudo ./tools/backup_restore/l4t_backup_restore.sh -e nvme0n1 -b PE1100N-orin 
      

      If this command completes successfully, a backup image will be stored in Linux_for_Tegra/tools/backup_restore/images.
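For example, you can list the generated images from the Linux_for_Tegra folder (a quick check; the exact file names depend on the backup):

      $ ls -lh tools/backup_restore/images/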

  2. Restore a PE1100N using a backup image

    1. Enter Force Recovery Mode

      • Power off the PE1100N and remove the power cable.
      • Connect the Host Computer to the PE1100N Flash Port with a Micro USB cable.
      • Press and hold the Force Recovery Button.
      • Connect the power cable and power on the PE1100N.
      • After 3 seconds, release the Force Recovery Button.
    2. Run this command from the Linux_for_Tegra folder:

      $ cd Linux_for_Tegra 
      $ sudo ./tools/backup_restore/l4t_backup_restore.sh -e nvme0n1 -r PE1100N-orin
      

6.5 Build Image

6.5.1 Jetpack 6.0 image

  1. Installation Prerequisites
$ sudo apt update
$ sudo apt install -y apt-utils bc build-essential cpio curl \
device-tree-compiler expect gawk gdisk git kmod liblz4-tool libssl-dev \
locales parted python python3 qemu-user-static rsync \
software-properties-common sudo time tzdata udev unzip wget zip \
nfs-kernel-server uuid-runtime
  1. Download and prepare the Linux_for_Tegra source code
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v3.0/release/jetson_linux_r36.3.0_aarch64.tbz2
$ tar xf jetson_linux_r36.3.0_aarch64.tbz2
  1. Download and prepare sample root file system
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v3.0/release/tegra_linux_sample-root-filesystem_r36.3.0_aarch64.tbz2
$ sudo tar xpf tegra_linux_sample-root-filesystem_r36.3.0_aarch64.tbz2 -C Linux_for_Tegra/rootfs/
  1. Sync the source code for compiling
$ cd Linux_for_Tegra/source/
$ ./source_sync.sh -t jetson_36.3

If a fatal error occurs while syncing (for example, being unable to connect to nv-tegra.nvidia.com), please wait for NVIDIA to provide a fix.

reference: https://forums.developer.nvidia.com/t/unable-to-connect-to-nv-tegra-nvidia-com/334353

  1. Download the patch file for PE1100N

    https://drive.google.com/file/d/12wBWT0YSfkDncw0JbIJu2hYsKAeRnzTu/view?usp=sharing

  2. Overwrite the original source code with the patch files

$ cd ../..
# Copy the PE1100N_r3630_patch.tar.gz to this folder.
$ tar zxf PE1100N_r3630_patch.tar.gz
$ sudo cp -r PE1100N_r3630_patch/Linux_for_Tegra/* Linux_for_Tegra/

For Nano 4G only

$ sudo cp PE1100N_r3630_patch/nano_4g_patch/chip_info.bin_bak Linux_for_Tegra/bootloader 
  1. Apply necessary changes to rootfs
$ cd Linux_for_Tegra
$ sudo ./apply_binaries.sh
  1. Download and install the toolchain
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v3.0/toolchain/aarch64--glibc--stable-2022.08-1.tar.bz2
$ sudo tar xf aarch64--glibc--stable-2022.08-1.tar.bz2 -C /opt
$ rm aarch64--glibc--stable-2022.08-1.tar.bz2
  1. Build the kernel
$ cd source
$ export ARCH=arm64
$ export CROSS_COMPILE=/opt/aarch64--glibc--stable-2022.08-1/bin/aarch64-linux-
$ ./nvbuild_asus.sh
  1. Install new kernel dtbs and kernel modules
$ ./do_copy.sh
$ export INSTALL_MOD_PATH=`realpath ../rootfs/`
$ ./nvbuild_asus.sh -i
$ cd ..
  1. Pre-install JetPack SDK (optional)
$ sudo sed -i "s/<SOC>/t234/g" rootfs/etc/apt/sources.list.d/nvidia-l4t-apt-source.list
$ sudo cp /usr/bin/qemu-aarch64-static rootfs/usr/bin/
$ sudo mount --bind /sys ./rootfs/sys
$ sudo mount --bind /dev ./rootfs/dev
$ sudo mount --bind /dev/pts ./rootfs/dev/pts
$ sudo mount --bind /proc ./rootfs/proc
$ sudo chroot rootfs
# apt update
# apt install -y nvidia-jetpack
# apt clean
# exit
$ sudo umount ./rootfs/sys
$ sudo umount ./rootfs/dev/pts
$ sudo umount ./rootfs/dev
$ sudo umount ./rootfs/proc
$ sudo rm rootfs/usr/bin/qemu-aarch64-static
  1. Flash the device
  • Orin NX 16G
$ sudo BOARDID=3767 BOARDSKU=0000 ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p "-c bootloader/generic/cfg/flash_t234_qspi.xml" --showlogs --network usb0 PE1100N-orin internal
  • Orin NX 8G
$ sudo BOARDID=3767 BOARDSKU=0001 ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p "-c bootloader/generic/cfg/flash_t234_qspi.xml" --showlogs --network usb0 PE1100N-orin internal
  • Orin Nano 8G
$ sudo BOARDID=3767 BOARDSKU=0003 ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p "-c bootloader/generic/cfg/flash_t234_qspi.xml" --showlogs --network usb0 PE1100N-orin internal
  • Orin Nano 4G
$ sudo BOARDID=3767 BOARDSKU=0004 ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p "-c bootloader/generic/cfg/flash_t234_qspi.xml" --showlogs --network usb0 PE1100N-orin internal
$ sudo rm bootloader/chip_info.bin_bak

 

6.5.2 Jetpack 6.2 image

  1. Installing Prerequisites
$ sudo apt update
$ sudo apt install -y apt-utils bc build-essential cpio curl \
device-tree-compiler expect gawk gdisk git kmod liblz4-tool libssl-dev \
locales parted python3 qemu-user-static rsync \
software-properties-common sudo time tzdata udev unzip wget zip \
nfs-kernel-server uuid-runtime
  1. Download and prepare the Linux_for_Tegra source code
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v4.3/release/Jetson_Linux_r36.4.3_aarch64.tbz2
$ tar xf Jetson_Linux_r36.4.3_aarch64.tbz2
  1. Download and prepare sample root file system
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v4.3/release/Tegra_Linux_Sample-Root-Filesystem_r36.4.3_aarch64.tbz2
$ sudo tar xpf Tegra_Linux_Sample-Root-Filesystem_r36.4.3_aarch64.tbz2 -C Linux_for_Tegra/rootfs/
  1. Sync the source code for compiling
$ cd Linux_for_Tegra/source/
$ ./source_sync.sh -t jetson_36.4.3
  1. Download the patch file for PE1100N

    https://drive.google.com/file/d/14sQxWWBMq40wqrE8U0H9MEENZv54nCK4/view?usp=sharing

    Copy it to the same location as the Linux_for_Tegra folder.


  2. Overwrite the original source code with the patch files

$ cd ../..
$ tar zxf PE1100N_v2.0.11_patch.tar.gz
$ sudo cp -r PE1100N_v2.0.11_patch/Linux_for_Tegra/* Linux_for_Tegra/
  1. Apply necessary changes to rootfs
$ cd Linux_for_Tegra
$ sudo ./apply_binaries.sh
  1. Download and install the toolchain
$ wget https://developer.nvidia.com/downloads/embedded/l4t/r36_release_v3.0/toolchain/aarch64--glibc--stable-2022.08-1.tar.bz2
$ sudo tar xf aarch64--glibc--stable-2022.08-1.tar.bz2 -C /opt
$ rm aarch64--glibc--stable-2022.08-1.tar.bz2
  1. Build the kernel
$ cd source
$ export ARCH=arm64
$ export CROSS_COMPILE=/opt/aarch64--glibc--stable-2022.08-1/bin/aarch64-linux-
$ ./nvbuild_asus.sh
  1. Install new kernel dtbs and kernel modules
$ ./do_copy.sh
$ export INSTALL_MOD_PATH=`realpath ../rootfs/`
$ ./nvbuild_asus.sh -i
$ cd ..
  1. Pre-install JetPack SDK (optional)
$ sudo sed -i "s/<SOC>/t234/g" rootfs/etc/apt/sources.list.d/nvidia-l4t-apt-source.list
$ sudo cp /usr/bin/qemu-aarch64-static rootfs/usr/bin/
$ sudo mount --bind /sys ./rootfs/sys
$ sudo mount --bind /dev ./rootfs/dev
$ sudo mount --bind /dev/pts ./rootfs/dev/pts
$ sudo mount --bind /proc ./rootfs/proc
$ sudo chroot rootfs
# apt update
# apt install -y nvidia-jetpack
# apt clean
# exit
$ sudo umount ./rootfs/sys
$ sudo umount ./rootfs/dev/pts
$ sudo umount ./rootfs/dev
$ sudo umount ./rootfs/proc
$ sudo rm rootfs/usr/bin/qemu-aarch64-static
  1. Build the SSD image
$ sudo ./build.sh
  • Select the SOM you are using.


  • A successful build is shown in the image below.


  • The image will be generated in the mfi_PE1100N-orin folder and will be packaged as mfi_PE1100N-orin.tar.gz.


  1. Flash image to device

Installing the flash requirements

$ sudo ./tools/l4t_flash_prerequisites.sh

Set up the device in recovery mode, then flash the image.

$ cd mfi_PE1100N-orin
$ sudo ./tools/kernel_flash/l4t_initrd_flash.sh --erase-all --flash-only --showlogs --network usb0

 

6.6 Upgrade to Jetpack 6.1

  1. Please make sure your OS is JetPack 6.0
$ cat /etc/nv_tegra_release


  1. Add the R36.4 / JetPack 6.1 apt repositories
$ echo "deb https://repo.download.nvidia.com/jetson/common r36.4 main" | sudo tee -a /etc/apt/sources.list.d/nvidia-l4t-apt-source.list
$ echo "deb https://repo.download.nvidia.com/jetson/t234 r36.4 main" | sudo tee -a /etc/apt/sources.list.d/nvidia-l4t-apt-source.list
  1. Update the apt
$ sudo apt-get update
  1. Install Jetpack compute components
$ sudo apt-get install nvidia-jetpack
  1. Remove the R36.4/JP 6.1 repo to avoid accidentally installing nvidia-l4t BSP packages later (see the example below).
    Important
    Do not use apt-get upgrade, because that would also upgrade the L4T packages.
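One possible way to remove the repo entries added in step 2 (a sketch, assuming they were appended to nvidia-l4t-apt-source.list exactly as shown above):

$ sudo sed -i '/r36.4 main/d' /etc/apt/sources.list.d/nvidia-l4t-apt-source.list
$ sudo apt-get update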

For more details, please follow the link below from NVIDIA official website.
https://docs.nvidia.com/jetson/archives/jetpack-archived/jetpack-61/install-setup/index.html#upgradable-compute-stack

 

7. Intelligent Video Analytics Samples

7.1 Metropolis Microservices for Jetson

NVIDIA Metropolis Microservices is a collection of software components designed for building intelligent video analytics applications. It is part of NVIDIA Metropolis, a platform aimed at transforming cities, enterprises, and industries into smart environments using computer vision, deep learning, and AI.

Prerequisites

  • Jetson Orin devices with JetPack 6.0 OS
  • Android phone or tablet
  1. Install Docker
sudo apt install -y docker.io
sudo usermod -aG docker $USER
newgrp docker
  1. Install nvidia-jetson-services
sudo apt update
sudo apt install -y nvidia-jetson-services
  1. Launch the Redis, Ingress and VST services
sudo systemctl start jetson-redis
sudo systemctl start jetson-ingress
sudo systemctl start jetson-vst
  1. Download Jetson Platform Services Reference Workflow & Resources

Follow the link below and select the Version History tab, then choose 1.1.0 -> Download.
https://catalog.ngc.nvidia.com/orgs/nvidia/teams/jps/resources/reference-workflow-and-resources
Note: 2.0.0 is for JetPack 6.1.

  1. Launch NVStreamer
unzip files.zip
rm files.zip
cd files
tar -xvf nvstreamer-1.1.0.tar.gz
cd nvstreamer
sudo docker compose -f compose_nvstreamer.yaml up -d --force-recreate
  1. Launch AI_NVR
cd <path of files>
tar -xvf ai_nvr-1.1.0.tar.gz
sudo cp ai_nvr/config/ai-nvr-nginx.conf /opt/nvidia/jetson/services/ingress/config/
cd ai_nvr

For Orin AGX:

sudo docker compose -f compose_agx.yaml up -d --force-recreate

For Orin NX 16G:

sudo docker compose -f compose_nx16.yaml up -d --force-recreate

For Orin NX 8G:

sudo docker compose -f compose_nx8.yaml up -d --force-recreate

For Orin Nano 8G/4G:

sudo docker compose -f compose_nano.yaml up -d --force-recreate

Docker containers will be created after executing the above instructions.
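You can optionally confirm that the containers are up (a quick check, not part of the original steps):

sudo docker ps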

  1. Visit the NVStreamer Dashboard
    http://localhost:31000/

Click "File Upload" and select sample_1080p_h264.mp4 from the files folder.

The file should appear on the dashboard once it has been uploaded successfully.

Ex: The RTSP URL is rtsp://192.168.1.46:31555/nvstream/root/store/nvstreamer_videos/sample_1080p_h264.mp4

  1. Visit the VST Dashboard
    http://localhost:30080/vst

Click “Sensor Management”, then add the device:

  1. Enter the sample RTSP URL
  2. Enter the sample name
  3. Click Submit

The sample video should be listed on the VST Dashboard.

  1. Example for Metropolis analytics
    A. Go to http://localhost:30080/vst
    B. Click Live Streams.
    C. Select the sample VST.
    D. Expand Analytics.
    E. Click “ROI”.
    F. Add areas of interest, then click “Done”.
    G. Enter “People” in the dialog, then click Submit.
    H. Select People under “Select ROIs”.
    I. Click “Show”.
    J. Detected people will be displayed on the video screen.

  2. Download and install NVIDIA Jetson Services on an Android phone or tablet

Follow the link below; the Android app, available on Google Play as AI NVR, allows you to track events and create areas of interest to monitor.
https://play.google.com/store/apps/details?id=com.nvidia.ainvr

For JetPack 6.0, please download and install this version manually instead of the Play Store version.
https://apkpure.com/cn/jetson-platform-services/com.nvidia.ainvr/downloading/1.0.2024060601

  1. Execute NVIDIA Jetson Services on Android device

A. Enter the JETSON_IP address, then press “Submit”.

B. Press “Analytics”.

C. This app allows you to track events and create areas of interest to monitor.

Reference:

  1. Tutorial mmj
    https://www.jetson-ai-lab.com/tutorial_mmj.html
  2. NVIDIA Metropolis Microservices
    https://developer.nvidia.com/metropolis-microservices
  3. Overview of AI-NVR
    https://www.nvidia.com/zh-tw/on-demand/session/other2024-mmj2/

 

7.2 Running DeepStream Python samples

  1. Flash images for Jetpack 6.0

Please follow the links below to flash the device based on your model.

For Orin NX 16G:
https://dlcdnets.asus.com/pub/ASUS/mb/Embedded_IPC/PE1100N/PE1100N_JONXS_Orin-NX-16GB_JetPack-ssd-6.0.0_L4T-36.3.0_v2.0.6-official-20240904.tar.gz?model=PE1100N

For Orin NX 8G:
https://dlcdnets.asus.com/pub/ASUS/mb/Embedded_IPC/PE1100N/PE1100N_JONXS_Orin-NX-8GB_JetPack-ssd-6.0.0_L4T-36.3.0_v2.0.6-official-20240904.tar.gz?model=PE1100N

For Orin Nano 8G:
https://dlcdnets.asus.com/pub/ASUS/mb/Embedded_IPC/PE1100N/PE1100N_JONAS_Orin-Nano-8GB_JetPack-ssd-6.0.0_L4T-36.3.0_v2.0.6-official-20240904.tar.gz?model=PE1100N

For Orin Nano 4G:
https://dlcdnets.asus.com/pub/ASUS/mb/Embedded_IPC/PE1100N/PE1100N_JONAS_Orin-Nano-4GB_JetPack-ssd-6.0.0_L4T-36.3.0_v2.0.6-official-20240905.tar.gz?model=PE1100N

  1. Install Jetson Stats and use jtop to monitor the system
sudo apt-get update
sudo apt-get install python3-pip
sudo pip3 install -U jetson-stats
sudo reboot
jtop


For more details, please follow the link below from NVIDIA developer website.
https://developer.nvidia.com/embedded/community/jetson-projects/jetson_stats

  1. Upgrade the compute stack to JetPack 6.1
    Please follow the link below to upgrade your device to JetPack 6.1. If you are already using a release based on JetPack 6.1, you can skip this step.
    https://github.com/ASUS-IPC/ASUS-IPC/wiki/PE1100N#66-upgrade-to-jetpack-61

  2. Meet the prerequisites for the DeepStream SDK

sudo pip3 install meson
sudo pip3 install ninja
cd ~/Documents/
git clone https://github.com/GNOME/glib.git
cd glib
git checkout 2.76.6
meson build --prefix=/usr
ninja -C build/
cd build/
sudo ninja install
pkg-config --modversion glib-2.0
sudo apt install \
libssl3 \
libssl-dev \
libgstreamer1.0-0 \
gstreamer1.0-tools \
gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad \
gstreamer1.0-plugins-ugly \
gstreamer1.0-libav \
libgstreamer-plugins-base1.0-dev \
libgstrtspserver-1.0-0 \
libjansson4 \
libyaml-cpp-dev

For more details, please follow the link below from NVIDIA official website.
https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_Installation.html#prerequisites

  1. Install the DeepStream SDK
cd ~/Documents/
wget --content-disposition 'https://api.ngc.nvidia.com/v2/resources/nvidia/deepstream/versions/7.1/files/deepstream-7.1_7.1.0-1_arm64.deb' -O deepstream-7.1_7.1.0-1_arm64.deb
sudo apt-get install ./deepstream-7.1_7.1.0-1_arm64.deb
wget --content-disposition 'https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/releases/download/v1.2.0/pyds-1.2.0-cp310-cp310-linux_aarch64.whl' -O pyds-1.2.0-cp310-cp310-linux_aarch64.whl
sudo pip3 install pyds-1.2.0-cp310-cp310-linux_aarch64.whl
sudo pip3 install cuda-python

Please follow the links below for more information
https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_Installation.html#install-the-deepstream-sdk
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/releases
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/tree/master/bindings

  1. Download the DeepStream Python samples
cd /opt/nvidia/deepstream/deepstream/sources/
sudo git clone https://github.com/NVIDIA-AI-IOT/deepstream_python_apps.git

Please follow the links below for more information
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps

  1. Run the DeepStream samples
    Before running the samples, please run the following commands to boost the clocks
sudo nvpmodel -m 0
sudo jetson_clocks
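You can optionally verify the result with the standard Jetson tools (not part of the original steps):

sudo nvpmodel -q
sudo jetson_clocks --show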

For more information, please refer to the following URL.
https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_Quickstart.html#boost-the-clocks

  • Run the sample deepstream-test2
    This is a sample with a 4-class object detection, tracking, and attribute classification pipeline.
cd /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/apps/deepstream-test2/
python3 deepstream_test_2.py /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264

For more information, please refer to the following URL.

https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/tree/master/apps/deepstream-test2

  • Run the sample deepstream-test3 with a single file input
    This is a sample with a multi-stream pipeline performing 4-class object detection; it also supports the Triton Inference Server, no-display mode, file-loop, and silent mode.
cd /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/apps/deepstream-test3/
sudo python3 deepstream_test_3.py -i file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 -s --file-loop


  • Run the sample deepstream-test3 with 4 file inputs
cd /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/apps/deepstream-test3/
sudo python3 deepstream_test_3.py -i file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 -s --file-loop


For more information, please refer to the following URL.
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/tree/master/apps/deepstream-test3

 

7.3 Visual Language Models (VLM) for Jetson

VLMs are multi-modal models that support images, video, and text, combining large language models with vision transformers. This allows text prompts to be used to query videos and images, enabling capabilities such as chatting with the video and defining natural-language-based alerts.

Prerequisites

  • Jetson Orin devices with JetPack 6.0 OS
  • Android phone or tablet
  1. Install Docker
sudo apt install -y docker.io
sudo usermod -aG docker $USER
newgrp docker
  1. Install nvidia-jetson-services
sudo apt update
sudo apt install -y nvidia-jetson-services
  1. Launch the VST services
sudo systemctl start jetson-vst
  1. Download Jetson Platform Services Reference Workflow & Resources

Follow the link below and select the Version History tab, then choose 1.1.0 -> Download.
https://catalog.ngc.nvidia.com/orgs/nvidia/teams/jps/resources/reference-workflow-and-resources
Note: 2.0.0 is for JetPack 6.1.

  1. Launch NVStreamer
unzip files.zip
rm files.zip
cd files
tar -xvf nvstreamer-1.1.0.tar.gz
cd nvstreamer
sudo docker compose -f compose_nvstreamer.yaml up -d --force-recreate
  1. Visit the NVStreamer Dashboard
    http://localhost:31000/

Click "File Upload" and select sample_1080p_h264.mp4 from the files folder.

The file should appear on the dashboard once it has been uploaded successfully.
Ex: The RTSP URL is rtsp://192.168.1.46:31555/nvstream/root/store/nvstreamer_videos/sample_1080p_h264.mp4

  1. Visit the VST Dashboard
    http://localhost:30080/vst

Click “Sensor Management”, then add the device:

  1. Enter the sample RTSP URL
  2. Enter the sample name
  3. Click Submit

The sample video should be listed on the VST Dashboard.

  1. Launch VLM
cd <path of files>
tar -xvf vlm-1.1.0.tar.gz
cd vlm/example_1
sudo cp config/vlm-nginx.conf /opt/nvidia/jetson/services/ingress/config
sudo cp config/prometheus.yml /opt/nvidia/jetson/services/monitoring/config/prometheus.yml
sudo cp config/rules.yml /opt/nvidia/jetson/services/monitoring/config/rules.yml

Start the foundation services, then launch the VLM service.

sudo systemctl start jetson-ingress
sudo systemctl start jetson-monitoring
sudo systemctl start jetson-sys-monitoring
sudo systemctl start jetson-gpu-monitoring
sudo docker compose up -d

The first time the VLM service is launched, it will automatically download and quantize the VLM. This will take some time.
You can visit the page http://JETSON_IP:5015/v1/health. If the VLM is ready, it will return {"detail":"ready"}; if you are launching the VLM for the first time, it will take some time to fully load.
Important: If it shows {"detail":"model loading"}, it is not ready yet.
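For example, you can poll the health endpoint from a terminal (replace JETSON_IP with your Jetson's IP address):

curl http://JETSON_IP:5015/v1/health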

  1. Interact with VLM Service
    A. Control Stream Input via REST APIs
    You can start by adding an RTSP stream for the VLM to use with the following curl command. This uses the POST method on the live-stream endpoint. Currently the VLM only supports one stream, but in the future this API will allow for multi-stream support.

Replace 192.168.100.45 with your Jetson IP and replace the RTSP link with your RTSP link.

curl --location 'http://192.168.100.45:5010/api/v1/live-stream' \
--header 'Content-Type: application/json' \
--data '{
"liveStreamUrl":"rtsp://192.168.100.45:31554/nvstream/root/store/nvstreamer_videos/sample_1080p_h264.mp4"
}'

This request will return a unique stream ID that is used later to set alerts, ask follow-up questions, and remove the stream. Ex: "id": "f16ffb43-95a4-44c5-bc5f-00cad33ddaf2"
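If you want to capture the returned ID in a shell variable, one possible approach (assuming jq is installed, e.g. via sudo apt install -y jq, and that the endpoint returns a JSON object containing the "id" field shown above):

STREAM_ID=$(curl -s --location 'http://192.168.100.45:5010/api/v1/live-stream' \
--header 'Content-Type: application/json' \
--data '{"liveStreamUrl":"rtsp://192.168.100.45:31554/nvstream/root/store/nvstreamer_videos/sample_1080p_h264.mp4"}' | jq -r '.id')
echo "$STREAM_ID"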

B. Set Alerts
Alerts are questions that the VLM will continuously evaluate on the live stream input. For each alert rule that is set, the VLM will try to decide whether it is True or False based on the most recent frame of the live stream. These True and False states, as determined by the VLM, are sent to a websocket and the Jetson monitoring service.

When setting alerts, the alert rule should be phrased as a yes/no question, such as “Is there people?”. The body of the request must also include the “id” field corresponding to the stream ID that was returned when the RTSP stream was added.

curl --location 'http://192.168.100.45:5010/api/v1/alerts' \
--header 'Content-Type: application/json' \
--data '{
    "alerts": ["is there people?"],
    "id": "f16ffb43-95a4-44c5-bc5f-00cad33ddaf2"
}'

C. View RTSP Stream Output
Once a stream is added, it will be passed through to the output RTSP stream. You can view this stream at "rtsp://JETSON_IP:5011/out". Once a query or alert is added, we can view the VLM responses on this output stream.
Ex: You can view this RTSP stream with VLC.

D. Delete the stream
To shut down the example you can first remove the stream using a DELETE method on the live-stream endpoint. Note the stream ID is added to the URL path for this.

curl --location --request DELETE 'http://192.168.100.45:5010/api/v1/live-stream/f16ffb43-95a4-44c5-bc5f-00cad33ddaf2'

This request will return “Stream removed successfully”.

  1. Using VLM Service from Android phone
    A. Download and install NVIDIA Jetson Services on Android phone or tablet

Follow the link below; the Android app, available on Google Play as AI NVR, allows you to track events and create areas of interest to monitor.
https://play.google.com/store/apps/details?id=com.nvidia.ainvr

For JetPack 6.0, please download and install this version manually instead of the Play Store version.
https://apkpure.com/cn/jetson-platform-services/com.nvidia.ainvr/downloading/1.0.2024060601

B. Execute NVIDIA Jetson Services on Android device
Enter the JETSON_IP address, then press “Submit”.

Press the message button.

Now you can talk to the VLM service. For example, ask the VLM what it sees on the screen.

Reference
Visual Language Models (VLM) with Jetson Platform Services
https://docs.nvidia.com/jetson/jps/inference-services/vlm.html
