Training
1. TCP/IP
Level 1
- 1.1.1 Watch Dr. Pingel's Lecture on TCP/IP for Remote Sensing.
- 1.1.2 Read the TCP/IP for Remote Sensing wiki page.
- 1.1.3 Locate your own IP address.
- 1.1.4 Ping a computer to test whether you can talk to it (a Python sketch follows this list).
- 1.1.5 Locate the IP address of the computer or sensor you need to talk to.
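A minimal sketch for 1.1.3 and 1.1.4, assuming Python 3 on the training machine; the target addresses are placeholders for your own sensor or gateway:

```python
import platform
import socket
import subprocess

def local_ip():
    # Open a UDP socket toward a public address to learn which local
    # interface/IP the OS would route through; no packets are sent.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    finally:
        s.close()

def ping(host, count=4):
    # Shell out to the system ping; the count flag is -n on Windows, -c elsewhere.
    flag = "-n" if platform.system() == "Windows" else "-c"
    return subprocess.run(["ping", flag, str(count), host]).returncode == 0

print("My IP:", local_ip())
print("Reachable:", ping("192.168.1.1"))  # placeholder: your sensor or gateway
```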
Level 2
- 1.2.1 SSH into a remote computer and perform some task (see the paramiko sketch after this list).
- 1.2.2 SFTP (or FTP) into a server to download some data.
- 1.2.3 Talk to a sensor via its web interface or other dedicated program.
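For 1.2.1 and 1.2.2, a hedged sketch using the third-party paramiko package (pip install paramiko); the host, credentials, and file paths are placeholders:

```python
import paramiko

client = paramiko.SSHClient()
# Auto-accepting unknown host keys is fine on a lab network; verify keys otherwise.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("192.168.1.50", username="user", password="secret")  # placeholders

# 1.2.1: run a task on the remote machine
stdin, stdout, stderr = client.exec_command("ls -l /data")
print(stdout.read().decode())

# 1.2.2: download data over SFTP
sftp = client.open_sftp()
sftp.get("/data/scan_001.las", "scan_001.las")  # placeholder paths
sftp.close()
client.close()
```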
Level 3
- 1.3.1 Configure a static IP address on a private network, including its subnet mask, gateway, and potentially DNS servers. Demonstrate that with this configuration you're able to successfully navigate the internet / ping a remote server using that connection. Be sure to document your connection settings. This will work most easily with a home Wi-Fi system.
- 1.3.2 Use Wireshark or a similar tool to log data from a specific sensor (a minimal Python alternative follows this list).
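Wireshark is the right tool for 1.3.2, but if your sensor streams UDP you can also log it directly; a minimal sketch, assuming the sensor transmits datagrams to port 2368 (adjust for your device):

```python
import socket
import time

PORT = 2368  # placeholder: the port your sensor actually transmits on

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))

with open("sensor_log.bin", "ab") as log:
    while True:
        data, addr = sock.recvfrom(65535)
        # Prefix each packet with an arrival timestamp so it can be replayed later.
        log.write(f"{time.time():.6f}|".encode() + data + b"\n")
        print(f"{len(data)} bytes from {addr[0]}")
```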
2. RTK/PPK GNSS
Level 1
- 2.1.1 Listen to Dr. Pingel's lecture on RTK / PPK GNSS.
- 2.1.2 Watch videos on GPS / GNSS, RTK GPS, and PPK GPS.
- 2.1.3 Read the wiki page on RTK and PPK GNSS.
- 2.1.4 Our RTK GPS systems are made by Emlid, which provides a quickstart guide for you to read and several YouTube videos. E38 Survey Solutions sells Emlid units and also has a great set of videos - watch at least one of these.
- 2.1.5 Run some sample data provided by Dr. Pingel. [video walkthrough]
Level 2
- 2.2.1 With the RTK GPS Collection and Processing / First Time Setup as your guide, configure an Emlid Reach RS unit to use your phone's hotspot and to receive RTK corrections from NY State.
- 2.2.2 With the RTK GPS Collection and Processing / RTK for GCPs as your guide, collect your first points using RTK. Take the GPS and survey pole out to capture the location of several service covers around the Old Johnson area. Make sure you are receiving RTK corrections and are collecting in FIX mode. Dr. Pingel or someone already trained will take you for your first collection. Save your points as a shapefile and/or CSV and email them to Dr. Pingel for review ([email protected]). Make sure your shapefile has been repaired (Define Projection) to EPSG 6319 (NAD83(2011)); an arcpy sketch follows this list. Go over the results with Dr. Pingel.
- 2.2.3 Reprocess an RTK collection with PPK and compare the results to your RTK collection.
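The Define Projection repair in 2.2.2 can also be scripted; a minimal arcpy sketch, with the shapefile name a placeholder for your own export:

```python
import arcpy

# Define Projection rewrites the .prj to EPSG 6319 (NAD83(2011)); it does not
# reproject coordinates, which is why it is the right fix for a missing or wrong CRS tag.
arcpy.management.DefineProjection("rtk_points.shp", arcpy.SpatialReference(6319))
```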
Level 3
- 2.3.1 Collect a PPK sample of service covers with NTRIP turned off. Process the data and compare to a previous RTK sample. Remember that it will take much longer to get a Fix, and you won't know its state in the field. You should sample each location for two minutes, but you don't need a large number of samples (3-5 is recommended).
- 2.3.2 Use the device to tag locations of GCPs for a drone scan. Repeat for a total of five flights.
Level 4
- 2.4.1 Set up a tripod over a service cover with an Emlid set to record for at least 15 minutes. While doing so, collect a sample of 10 service covers in RTK mode using a second "rover" Emlid. When done, pull the logs from the Emlids, and send the "base" log to OPUS for processing. Use the OPUS response to get the position of the "base" service cover, and then use that coordinate with PPK to re-estimate the positions of your 10 service covers. Compare the OPUS/PPK results against your RTK results (a comparison sketch follows this list).
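For the 2.4.1 comparison, a hedged pandas sketch, assuming both collections were exported as CSVs with name/easting/northing/elevation columns (match the column names to your actual Emlid export):

```python
import pandas as pd

rtk = pd.read_csv("rtk_points.csv")  # placeholder filenames
ppk = pd.read_csv("ppk_points.csv")

# Join the two collections on point name and difference the coordinates.
m = rtk.merge(ppk, on="name", suffixes=("_rtk", "_ppk"))
m["dE"] = m["easting_ppk"] - m["easting_rtk"]
m["dN"] = m["northing_ppk"] - m["northing_rtk"]
m["dZ"] = m["elevation_ppk"] - m["elevation_rtk"]
m["horizontal"] = (m["dE"] ** 2 + m["dN"] ** 2) ** 0.5

print(m[["name", "dE", "dN", "dZ", "horizontal"]].describe())
```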
3. Structure from Motion
Level 1
- 3.1.1 Listen to Dr. Pingel's lectures on Structure from Motion photogrammetry.
- 3.1.2 Run the images for Pix4D's Quarry dataset in Drone2Map using 3D settings. [Walkthrough] Be sure to set options to export the 3D model as SLPK and OBJ. Load the orthophotos and 3D models into ArcGIS Pro.
- 3.1.3 Run the images for Pix4D's Quarry dataset in WebODM using 3D Model settings. Inspect using the native interface.
- 3.1.4 Run the images for Pix4D's Quarry dataset in Pix4DMapper using 3D Model settings. Inspect using the native interface. A no-nonsense 6-minute walkthrough is here (no audio). Narrated walkthrough [1] [2] [3].
- 3.1.5 Watch all 18 training videos on this playlist from Pix4D.
Level 2
- 3.2.1 Inspect the output of SfM software (point clouds AND models) using CloudCompare.
- 3.2.2 Complete Web Course on ArcGIS Drone2Map Basics.
- 3.2.3 Download three more sample NEIL datasets and reconstruct them. Some of these are difficult and will not reconstruct easily. This is normal! These datasets will push you to work through common problems in reconstruction. Show each reconstruction to Dr. Pingel before proceeding to the next.
- 3.2.4 Reconstruct the Old Johnson (small) dataset in a software of your choice using GCPs that you collect with the RTK GPS.
Level 3
- 3.3.1 Collect SfM imagery using the Autel Evo II V3 or DJI Mavic 3T, including at least 5 GCPs, and successfully complete a SfM reconstruction in Pix4DMapper, WebODM, or Drone2Map.
- 3.3.2 Explore the Pix4D GSD Calculator, Mapping Calculator, AND basic geometry/trig GSD calculations (in Excel: GSD = (2 * h * TAN(RADIANS(fov/2))) / pixels; a Python version follows this list).
- 3.3.3 With supervision, collect and run 2 more SfM datasets yourself using the Autel Evo II V3. GCPs are not required.
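The same geometry as the Excel formula in 3.3.2, as a small Python function; the example altitude, FOV, and image width are illustrative only:

```python
import math

def gsd(h, fov_deg, pixels):
    # Ground footprint covered by the image, divided by the pixel count across it;
    # the result is in the same units as h, per pixel.
    footprint = 2 * h * math.tan(math.radians(fov_deg / 2))
    return footprint / pixels

# Illustrative values: 100 m altitude, 73.7-degree horizontal FOV, 5472-pixel-wide image
print(f"{gsd(100, 73.7, 5472) * 100:.2f} cm/pixel")
```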
Level 4
- 3.4.1 Fly 5 missions to collect and process your own 3D models. The 3D Scanning Protocol is here.
- 3.4.2 Fly 5 missions to collect and process your own 3D models, using RTK or PPK GPS to collect ground control points (GCPs). The 3D Scanning Protocol is here.
4. Terrestrial and Pole Photogrammetry
Level 1
- 4.1.1 Complete Level 1 of Structure from Motion
- 4.1.2 Watch Dr. Pingel's Lecture on Terrestrial and Pole Photogrammetry
- 4.1.3 Use two of RoomPlan, Scaniverse, SiteScape, 3D Scanner, Polycam, RealityScan, or Pix4Dcatch to capture a scene.
- 4.1.4 Run the sample pole photogrammetry dataset for Owens Park using Pix4DMapper or other SfM software. Pix4DMapper strongly recommended.
- 4.1.5 Use a GoPro on a pole to capture a few hundred images of a small place of interest using pole photogrammetry techniques (camera oriented down, but don't get your feet in the frame). Run these through Drone2Map, WebODM, or Pix4DMapper/Matic; Pix4DMapper is strongly recommended.
5. UAV Piloting / Drone Flying
Level 1
- 5.1.1 Practice flying the DJI Avata or the DJI Neo INDOORS and complete the Basic Test of Piloting Skill (box flight including heading change with two intermediate landings and take-offs).
- 5.1.2 Complete the free FAA Recreational UAS Safety Test (TRUST) exam that enables you to fly drones recreationally or for research purposes (direct link). You receive credit for this if you have already done it in another class or hold your Part 107.
- 5.1.3 Receive an individual flight and safety briefing for outdoor flying from Dr. Pingel.
- 5.1.4 Fly a supervised manual flight with the Autel Evo II Pro V3 or similar drone.
- 5.1.5 With supervision, program and fly your first autonomous Structure from Motion mission.
Level 2
- 5.2.1 Fly two autonomous SfM missions with supervision.
- 5.2.2 Fly five autonomous SfM missions on your own.
Level 3
- 5.3.1 Fly an additional 5 autonomous SfM missions on your own.
- 5.3.2 Complete the training course at mzeroa.com.
- 5.3.2.1 Regulations
- 5.3.2.2 Airspace
- 5.3.2.3 Weather
- 5.3.2.4 Loading and Performance
- 5.3.2.5 Crew Resource Management
- 5.3.2.6 Airport / Field Ops
- 5.3.2.7 Radio Frequencies
- 5.3.2.8 Emergency Procedures
- 5.3.2.9 Real World Operations
- 5.3.2.10 Boot Camp
- 5.3.2.11 The Part 107 Exam
- 5.3.2.12 Review
- 5.3.2.13 Final Exam
- 5.3.3 Obtain your Part 107 Remote Pilot certificate.
6. Lidar
Level 1
Follow along with the videos below, and complete each exercise for an area other than the Binghamton University campus.
- 6.1.1 Watch the lecture on Introduction to Lidar - Getting and Visualizing Lidar Data
- 6.1.2 Downloading lidar data from NOAA and visualizing it in CloudCompare: https://youtu.be/S7XRm3GERfA
- 6.1.3 Bringing LAS files into ArcGIS Pro Maps and Scenes: https://youtu.be/T-hQN8V8cko
- 6.1.4 Downloading Lidar Data from the National Map and Trimming Out Noise in CloudCompare: https://youtu.be/PkOyVAV_ZQk
- 6.1.5 Converting from LAZ to LAS to load in ArcGIS Pro & Defining a Projection to Fix Spatial Referencing Error: https://youtu.be/mEXixJTusGg (see the laspy sketch after this list)
- 6.1.6 Download data from the National Map Lidar Explorer: https://youtu.be/TJvA-u_yeS0
- 6.1.7 Watch videos on the Interagency Elevation Inventory and the Eptium / COPC Viewer.
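For the LAZ-to-LAS conversion step in 6.1.5, a minimal sketch assuming laspy with a LAZ backend (pip install "laspy[lazrs]"); filenames are placeholders:

```python
import laspy

las = laspy.read("tile.laz")   # decompressed via the lazrs backend
las.write("tile.las")          # plain LAS that ArcGIS Pro loads directly
print(f"Converted {len(las.points)} points")
```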
Level 2
- 6.2.1 Download lidar-derived DEMs (1 m) from the National Map and demonstrate visualization of them in ArcGIS Pro using hillshade, multidirectional hillshade, vertical exaggeration, and slope mapping (0 to 45 degrees).
- 6.2.2 Download a lidar tile (BU central tile here). Use CloudCompare to clean the noise and ArcGIS Pro to define the projection using the metadata as a guide. Use ArcGIS to generate a DSM.
- 6.2.3 Visualize the DSM using hillshade, multidirectional hillshade, and both types of slope shading (VE-based at about 2.3x, and regular slope visualized from 0 to 45 degrees).
- 6.2.4 Create a DSM using CloudCompare.
- 6.2.5 Create a DTM in ArcGIS Pro by applying a filter for just ground points (class 2). You will need to run Classify LAS Ground first; default options are fine for now.
- 6.2.6 Create a canopy height model (CHM) by using Raster Calculator to subtract the DTM from the DSM.
- 6.2.7 Locate building footprint data for the area (Microsoft, Google, and NY State all have versions), and use Zonal Statistics as Table with the CHM data to create extruded building footprints in an ArcGIS Pro Scene (an arcpy sketch covering 6.2.6-6.2.7 follows this list).
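A minimal arcpy sketch covering 6.2.6 and 6.2.7, assuming a Spatial Analyst license; the raster and footprint filenames are placeholders for your own data:

```python
import arcpy
from arcpy.sa import Raster, ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")

# 6.2.6: canopy height model = surface minus terrain
chm = Raster("dsm.tif") - Raster("dtm.tif")
chm.save("chm.tif")

# 6.2.7: mean CHM height per building footprint, joinable back for extrusion in a Scene
ZonalStatisticsAsTable("footprints.shp", "FID", "chm.tif",
                       "bldg_heights.dbf", "DATA", "MEAN")
```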
Level 3 - CloudCompare
- 6.3.1 Follow along with the Training exercises in CloudCompare and complete them with your own data. The support videos for each item will help you.
- 6.3.2 Watch all of the What Else Can I Do? CloudCompare videos, providing some notes and comments on each.
- 6.3.3 Read about and comment on listed items from the CloudCompare Wiki.
Level 4 - Overlay and Classification
- 6.4.1 Demonstrate a correctly overlaid lidar + SfM reconstruction in CloudCompare using manual translation and the NGS online calculator API to determine the correct value (example API call; a Python sketch follows this list). Only vertical placement needs to be adjusted (no need to try to correct doming or tilting, if present).
- 6.4.2 Run and compare the efficacy of several methods of lidar ground classification using ArcGIS (both legacy and recent) and CloudCompare's Cloth Simulation Filter (CSF).
- 6.4.3.1 Use CloudCompare's segment tool not to segment but to repair classifications in a lidar data set.
- 6.4.3.2 Compare this to ArcGIS methods in the "Interactive Editing" cluster (Classification tab -> Select, etc.) to reassign classifications. [walkthrough]
- 6.4.4 Run a SfM project using Drone2Map, carefully demonstrating the use of correct horizontal and vertical datums (you'll want to install the Advanced Coordinate System Data Package). Set the coordinate system of your input data correctly (for the Autel, WGS84 both horizontal and vertical; for the DJI receiving RTK, NAD83(2011) with ellipsoidal height). For the output/project coordinate system, use NAD83(2011) UTM 18N and NAVD88. You must project your GCPs into the output coordinate system before use (see the lecture notes).
- 6.4.5 Examine the Old Johnson UAV Lidar Dataset in CloudCompare, and comment on its quality in comparison to a more conventional lidar dataset sourced from an aircraft (e.g., from NOAA or the National Map you've previously worked with).
- 6.4.6 Use the "Classify LAS Building" tool in ArcGIS Pro to classify the buildings on a lidar tile of campus and qualitatively analyze the result.
- 6.4.7 Use the "Set LAS Class Codes Using Features" tool with the building footprint data you previously sourced to classify buildings on campus.
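For the 6.4.1 vertical offset, a hedged sketch of the NGS lookup, assuming the geoid height endpoint at geodesy.noaa.gov/api/geoid/ght and a geoidHeight field in the response (verify both against the current NGS web services documentation):

```python
import requests

lat, lon = 42.0893, -75.9695  # placeholder coordinates near campus

resp = requests.get("https://geodesy.noaa.gov/api/geoid/ght",
                    params={"lat": lat, "lon": lon})
resp.raise_for_status()
geoid_height = resp.json()["geoidHeight"]  # meters; field name is an assumption

# Translate the SfM cloud's Z in CloudCompare by this amount (watch the sign:
# orthometric height = ellipsoidal height - geoid height).
print(f"Geoid height: {geoid_height:.3f} m")
```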
7. Deep Learning
Level 1 - Using Existing Models
- Note: You can operate on imagery basemap layers in ArcGIS Pro; just remember to set the processing extent (in the tool's Environments settings) to run only on the current display extent (an arcpy sketch follows this list). Good layers to use include Bing imagery, Esri's imagery layers, and Google Satellite data (which you can add via Add Data / From Path at the linked URL). You can also use this orthoimagery layer of Binghamton University campus (also added to ArcGIS Pro via Add Data / From Path).
- 7.1.1 Read the article on Deep Learning models in arcgis.learn
- 7.1.2 Install the ArcGIS Deep Learning Framework
- 7.1.3.1 Download, deploy, and evaluate a pre-trained model from the Living Atlas (not SAM)
- 7.1.3.2 Download, deploy, and evaluate a second pre-trained model from the Living Atlas (not SAM)
- 7.1.3.3 Download, deploy, and evaluate a third pre-trained model from the Living Atlas (not SAM)
- 7.1.4 Download, deploy, and evaluate the Segment Anything Model (SAM)
- 7.1.5 Download, deploy, and evaluate the Text SAM model
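A minimal arcpy sketch of the extent trick in the note above, assuming the Image Analyst extension and a .dlpk downloaded in 7.1.3.x; all paths and the extent values are placeholders:

```python
import arcpy

arcpy.CheckOutExtension("ImageAnalyst")

# Restrict processing to a small extent (in the map's CRS) instead of the whole basemap.
arcpy.env.extent = arcpy.Extent(-8474000, 5175000, -8472000, 5177000)  # placeholder

arcpy.ia.DetectObjectsUsingDeepLearning(
    "campus_ortho.tif",    # placeholder imagery layer
    "detected_objects",    # output feature class
    "model.dlpk")          # placeholder pre-trained model from the Living Atlas
```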
Level 2
- 7.2.1 Review the basic steps to train a deep learning model in ArcGIS Pro
- 7.2.2 Complete the tutorial on training a deep learning model in ArcGIS Pro
- 7.2.3 Download, deploy, and evaluate a pre-trained model from the Living Atlas for point cloud classification
- 7.2.4 Train your own deep learning model to detect objects in imagery.
- 7.2.5 Train your own model to classify pixels in imagery.
Level 3
- 7.3.1 Use Pixel Editor to clean a raster classified from a deep learning run.
- 7.3.2 Code a complete deep learning training and classification run with ArcPy (a sketch follows this list).
- 7.3.3 Install and demonstrate the USGS Doodler package.
- 7.3.4 Set up a complete training and classification run using only core Python (not ArcGIS Pro).
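A hedged outline of the 7.3.2 pipeline with arcpy's Image Analyst tools; every path, field, and hyperparameter below is a placeholder, and the exact parameter lists should be checked against the current tool documentation:

```python
import arcpy

arcpy.CheckOutExtension("ImageAnalyst")
arcpy.env.workspace = r"C:\dl_project"  # placeholder

# 1. Export labeled image chips from training polygons.
arcpy.ia.ExportTrainingDataForDeepLearning(
    "ortho.tif", "chips", "labels.shp", "TIFF",
    256, 256, 128, 128, "ONLY_TILES_WITH_FEATURES",
    "Classified_Tiles", 0, "ClassValue")

# 2. Train a pixel classifier (U-Net here) on the exported chips.
arcpy.ia.TrainDeepLearningModel("chips", "model", 20, "UNET", 8)

# 3. Classify new imagery with the trained model and save the result.
classified = arcpy.ia.ClassifyPixelsUsingDeepLearning(
    "ortho.tif", r"model\model.dlpk")
classified.save("classified.tif")
```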
Miscellaneous
- 99.1 SAGA Tutorial