Blog - terrytaylorbonn/auxdrone GitHub Wiki

24.0918

TODO:

CURRENT FOCUS IS ON APIs 24.0917

After getting back from Ukraine, my focus for now has shifted to APIs. See Part 63 APIs.

24.0812 Studying Python

I've been studying (doing hands-on demos of) various Python (including AI) topics, and will continue for the next few weeks/months. I'm taking a course this fall at a local uni and preparing... I need certification. I will post my own strategy for covering the topics of interest to an AI drone builder at some point in the future. I'm not posting my working docs right now because they contain personal account notes, etc. Still in Kyiv for the next month.

Also, I will try to figure out the basics of creating CC programs for drone autonomous target homing (part 13c CC autonomy algorithms (Kalman track)).
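
I haven't studied any of this yet, but my rough understanding is that the core building block is a Kalman filter that smooths the noisy pixel coordinates of the detected target. Below is a minimal 1-D, constant-velocity sketch just to fix the idea; the time step and noise values are arbitrary assumptions, not taken from any real tracker.

```python
# Minimal constant-velocity Kalman filter sketch for smoothing a target's
# pixel x-coordinate (assumption: detections arrive at a fixed dt).
import numpy as np

dt = 0.1                                   # time between detections (s), assumed
F = np.array([[1, dt], [0, 1]])            # state transition: [position, velocity]
H = np.array([[1, 0]])                     # we only measure position (pixel x)
Q = np.diag([1e-2, 1e-1])                  # process noise (guess)
R = np.array([[25.0]])                     # measurement noise (guess, px^2)

x = np.array([[0.0], [0.0]])               # initial state [x_px, vx_px_per_s]
P = np.eye(2) * 500.0                      # initial uncertainty

def kalman_step(z):
    """One predict+update cycle; z is the measured pixel x of the target."""
    global x, P
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = np.array([[z]]) - H @ x            # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x[0, 0]                          # smoothed pixel x

# Example: noisy detections of a target drifting across the frame
for z in [100, 104, 97, 110, 115, 121]:
    print(round(kalman_step(z), 1))
```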

image

24.0807 the ziptie'd ai drone race is on (ai autonomous attack drones)....

from Russia with love...

https://www.youtube.com/watch?v=CfUT2fJa6O4

image

24.0807 aws computer vision demo

Computer vision engineer

image

24.0806 Great blog article... talks about the future phase of the war, and about using AI for long-term warfare.

image

FPV Drones with Machine Vision: The Future of Defense (FPV-Дроны с Машинным Зрением: Будущее Обороны)

https://www.youtube.com/watch?v=1KKlGFHGG0U

Кофе с Любарским (Coffee with Lubarsky)

24.0806 Reorg epics 3-5

I really focus on the big picture (especially since I find the details challenging :) ), so I'm constantly thinking about how to better organize things. I think the following org makes more sense for epics 3-5:

  • EPIC 3 Adding AI to the Pixhawk drone (force the drone to do something really simple based on a recognized external object)
    • (7) First get AI running on the CC that uses the camera output
    • (8) Use the AI to control the copter (very simple use of the MAVLink API, just land, etc.)
    • (9a) Use SIH (simulation in hardware) to simulate. Basic sim (need this to do anything more advanced).
  • EPIC 4 Making the copter autonomous. There are basically 3 things needed for this:
    • (13) CC AI algorithms (using various AI APIs... advanced versions of what was done in (7))
    • (11) Using the FC MAVLink "server" API by sending commands from the CC/GCS using the client-side MAVLink APIs. This controls the copter. An advanced version of what was done in (8). (See the sketch after this list.)
    • (13c) NEW CC autonomy algorithms, like Kalman filter tracking (I know nothing about this stuff). How to fine-tune control of the copter so that it smoothly performs the mission. I got to thinking about this 24.0805 when I saw a Russian-language blogger discuss it. See 15.1 AI Kamikadze.
  • EPIC 5 Platforms. Focuses on creating mission-ready platforms.
    • (14) Firmware dev. Maybe the FW needs to be modified for certain missions (not sure about this).
    • (15) Mission platforms. This is the real final goal. Right now what interests me the most is the autonomous kamikadze mission.
    • (16) Special projects, such as stealth comms, etc.
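
To make (8)/(11) concrete, here is roughly what I mean by the client-side MAVLink API. This is a pymavlink sketch, not the actual project code; the UDP connection string and port are assumptions that depend on how the FC link is configured.

```python
# Sketch of the client/server idea in (11): the FC is the MAVLink "server", the
# CC (or GCS) is the client. The client sends one command -- the simple "land"
# from (8) -- and reads the FC's acknowledgement.
from pymavlink import mavutil

fc = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # or a serial port, e.g. '/dev/ttyTHS1'
fc.wait_heartbeat()                                     # wait until the FC is talking to us

# Client -> server: ask the FC to land right now.
fc.mav.command_long_send(
    fc.target_system, fc.target_component,
    mavutil.mavlink.MAV_CMD_NAV_LAND,   # standard MAVLink land command
    0,                                  # confirmation
    0, 0, 0, 0, 0, 0, 0)                # params unused for a basic land

# Server -> client: the FC acknowledges (or rejects) the command.
ack = fc.recv_match(type='COMMAND_ACK', blocking=True, timeout=5)
print(ack)
```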

image

24.0730-0803 Simplified version of Nvidia (Dusty) AI fundamentals course

For details see Document 13.4-1, which covers the Nvidia tutorials (by Nvidia expert Dusty).

image

This is a much easier-to-follow version of the GitHub/YouTube content from Nvidia:

  • Shows reorganization of Nvidia content (took me well over a week of effort; I added numbered subchapter headings).

  • Shows tests on my Nano (I verified all steps).

    image

24.0730-0731 Dusty demos S3E2 – IMAGE CLASSIFICATION Inference works on Nano in Kyiv

For details see 13.4-1_ai-nvidia_v17_24.0731.docx // search for "S3E2 (TERRY)"

Amazing... After 3 weeks of travel, bus/train rides, Airbnbs, etc., I finally fired up the Nano today, tried the Dusty object inference demo, and... it worked :) I am not using Docker, but rather a local build (that I made months ago; Docker was too slow). So now I'm ready to continue with the rest of the Dusty demos.
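
For reference, the classification demo I'm running boils down to roughly the Hello AI World pattern below. It's only a sketch; the network name and camera URI are assumptions, so swap in whatever your local build and camera actually use.

```python
# Rough shape of the jetson-inference image classification demo (Hello AI World).
from jetson_inference import imageNet
from jetson_utils import videoSource, videoOutput

net = imageNet("googlenet")              # pretrained classification network (assumed)
camera = videoSource("csi://0")          # CSI camera (use "/dev/video0" for USB)
display = videoOutput("display://0")     # render to the attached monitor

while display.IsStreaming():
    img = camera.Capture()
    if img is None:                      # timeout / no frame yet
        continue
    class_id, confidence = net.Classify(img)
    label = net.GetClassDesc(class_id)
    display.Render(img)
    display.SetStatus(f"{label}  {confidence * 100:.1f}%")
```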

The slow (cable) internet, the small table, the mismatched electrical plugs, only one external monitor, etc., etc., are making this less than pleasant... I miss my home workshop back in Indiana! Add to that what is probably a case of the latest strain of Covid (Mr. Fauci + Chicomm bioweapons labs, the gift that keeps giving).

setup in my Kyiv Airbnb (I did not lose anything during the trip!)

image

Nano processing of images

image

Nano processing of video.

image

24.0705 Travel configuration works

Will start traveling tomorrow. I tested my travel setup, and everything seems to be working :)

Powered up, ran the object recognition test, and the copter (just the FC) went into land mode. :)

  • Red: FC with ETH adapter (the Nano sends commands to the FC via Ethernet)
  • Blue: Nano with camera.

image

Started QGC (on Nano) to check status. Note "Land" in yellow box.

image

The stand for the monitor is too heavy to travel with, so I use the travel fan instead. :)

image

Not enough USB ports on the Nano, but the extension works. Might need to use USB sticks while traveling.

image

24.0704 Yet another reorg :)

image

I am really big on organization and concepts. I reorg'd the last epics 4/5 into epics 4-6 (see diagram below). Time to redo (yet again) the Wiki organization (5 epics) and the YouTube playlist. :)

The change was motivated by a better understanding (I think it's better) of what HITL, SITL, Matlab, and ROS are (now epic 6). For ZiptieAI drone purposes, these are basically PITS ("pie in the sky"). PITS means:

  • Too complex. You need an entire dev team to integrate it, especially with the AI routines (AI is a core focus of ZiptieAI). Since I am learning this stuff on my own, a big focus is avoiding rabbit holes and PITS.
  • What's the point? Sure, if you are building customized proprietary drones with a big dev team, then fine. But that's not the ZiptieAI focus (ziptie'ing components together is a core focus of ZiptieAI).

image

24.0703 SIH + object detection + mission interrupt working !! (big milestone)

The doc 9.0_SIH_nano_px4_ETH_v07_ shows how I (somehow, magically) got SIH + object detection + mission interrupt working (simulation test):

  • SIH (simulation in hardware)
  • Nano-PX4 connection via ETH (Ethernet on TELEM3)
  • Object recognition interrupts the mission and lands the copter when the Nano sees a human

This is an important milestone (coming just a few days before my trip to Lviv/Kyiv). HITL demanded too many resources, and I saw no value in using jMAVSim. SIH seems to be a practical way to test the logic of the Python programs on the CC (Nano) that are meant to make the copter do (potentially complex) maneuvers in response to the Nano AI detecting something. This logic needs to be simulation-tested before trying it in the field.
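
The logic under test is basically the loop below. This is a simplified sketch of my setup, not a copy of the actual script: the detectNet model, camera URI, and UDP port are assumptions, and the land command (the same pymavlink pattern sketched under the 24.0806 reorg entry above) stands in for whatever mission-interrupt behavior you want.

```python
# Simplified sketch of the SIH test logic: watch the camera, and if a person is
# detected, interrupt the mission by commanding the FC to land (via pymavlink).
from jetson_inference import detectNet
from jetson_utils import videoSource
from pymavlink import mavutil

net = detectNet("ssd-mobilenet-v2", threshold=0.5)      # COCO-trained detector (assumed)
camera = videoSource("csi://0")                         # camera URI is an assumption

fc = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # ETH/TELEM3 link, assumed port
fc.wait_heartbeat()

def force_land():
    """Ask the FC to land, overriding the running mission."""
    fc.mav.command_long_send(
        fc.target_system, fc.target_component,
        mavutil.mavlink.MAV_CMD_NAV_LAND,
        0,                      # confirmation
        0, 0, 0, 0, 0, 0, 0)    # unused params

landed = False
while not landed:
    img = camera.Capture()
    if img is None:
        continue
    for det in net.Detect(img):
        if net.GetClassDesc(det.ClassID) == "person":
            print("person detected -> landing")
            force_land()
            landed = True
            break
```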

Note: I have not seen any other demo like this one that includes the actual setup details. The details for this demo are still a little rough. I wanted to get this working (and posted on the wiki) before July 4th and before my impending summer (working) vacation in Lithuania/Ukraine that starts July 6th. I never thought I would actually get a demo of this all working before July 6th!

PS: New wiki page: 9.0 SIH NANO/PX4.

Demo summary

Person recognized (my fingers) when I opened the camera shutter (the copter is already flying the SIH mission)

image

Nano switches (via pymavlink) the copter's mission mode to land // the copter lands.

image

test setup (everything running on the Pixhawk 6C and the Nano)

image

24.0701 HITL (partially worked) / SIH (promising) tests

9.0_HITL_SIH_px4_nano_ETH_v04_24.0701.docx is a very sparse description (little text, many pics) of the HITL/SIH tests.

image

image

image

image

image

24.0701 successful test of telem3/ETH connection nano<>6c (kitchen test of "land" command to fc)

I need this so that I can run HITL (the USB port is needed for HITL, so the Nano<>6C link runs via TELEM2 or TELEM3/ETH).

See 8a.1 Connect/code
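
A minimal pymavlink check of that link looks roughly like this (the UDP port is an assumption and has to match the FC's TELEM3/ETH MAVLink configuration):

```python
# Quick check that the TELEM3/ETH link to the 6C is alive (frees up USB for HITL).
from pymavlink import mavutil

fc = mavutil.mavlink_connection('udpin:0.0.0.0:14550')  # assumed port for the ETH link
fc.wait_heartbeat()
print(f"Link up: system {fc.target_system}, component {fc.target_component}")

# Print a few incoming messages to confirm telemetry is flowing over Ethernet.
for _ in range(5):
    msg = fc.recv_match(type=['ATTITUDE', 'SYS_STATUS'], blocking=True)
    print(msg.get_type(), msg.to_dict())
```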

image image

24.0630 reorg'd EPIC 3 "add AI to pixhawk drone" docs

I think the org below makes sense:

  • part 7: set up the CC (Nano or Pi 5/4) and cam; also: QGC/startup scripts
  • part 8: set up connect/code, then kitchen/field test, for one of 4 combinations:
    • NANO <> PX4
    • NANO <> AP
    • PI <> PX4
    • PI <> AP

image

24.0629 xponential 24 holybro pixhawk with jetson ORIN nano baseboard ($400)

They packaged together the Jetson Orin Nano ($500)... that's what I am trying to do manually with the Jetson Nano ($150) :).

https://youtu.be/BLXh2JWJ3kc?t=251

image image

$1500 with the Orin, all preconfigured... nice, if it works and the quality is there. If it gets good reviews, I might get one, stop working on the low-level grunt stuff, and just focus on the AI :).

image

PS: The following video is about the Ukrainian "Lucky Strike" drone. It looks quite similar to the X-650 (from Holybro, discussed earlier in the video above). Lucky Strike uses ArduPilot!

https://youtu.be/nQBN8oW8so0?t=74

image

24.0626 Created a Youtube playlist for ZiptieAI

This is the first time I have created YouTube videos, so I'm still learning the basics.

image

24.0623-26 cleaning up wiki pages, gdrive folder names, file names, etc

Added an updated shopping list.

image

24.0619 evening !!! first successful AI object recognition (person) / CC forces land / OUTDOOR test !!!

image

7.2.2 Flight build test in outdoors

In this video (note the loud motor noise from the struggle to keep all that extra weight in the air) I put the copter into loiter (hold) mode and then walked around in front of the camera... the Nano AI recognized me as a human and put the copter into land mode (overriding loiter mode). Just a 10-second video, but a big deal for me. :)

This was my first flight test of AI object recognition with this Pixhawk PX4 copter, and it worked. I am having great luck with this copter.

Note: I did not use a laptop. I just brought the copter to the field, powered it on, and tested. This was the process:

  1. Stand behind the camera.
  2. Open the camera lid.
  3. Turn on the TX-12 RC.
  4. Connect the Pixhawk 6C FC to the 4S battery.
  5. Connect the Nano to its phone-charger battery.
  6. Use the RC to take off in stabilize mode.
  7. Flick the switch to go into loiter mode.
  8. (Where the video starts) Walk in front of the camera (which was pointing down slightly).
  9. (The Nano sends a command to the copter to go into land mode.)

The next test will be with a mission flight. But tonight I preferred to fly the copter in a slightly different location, so flying the mission saved in EEPROM was not possible (I don't need a mission to verify the functionality anyway).

Note: I put the Nano+battery at the bottom because I was afraid that the copter leg design was not strong enough to hold the weight on landing. The problem is that it's not easy to fly with the weight at the bottom. But it worked well enough.

image

24.0619 !!! first successful AI object recognition (person) / CC forces land / kitchen test !!!

Video.

For details

image