Image processing notes - quasics/quasics-frc-sw-2015 GitHub Wiki
- FIRST docs on image processing, including distance computation
- Discussion of identifying the likely target, and performing distance computation from 2017 game
- GRIP project home
- Intro to GRIP from FIRST
- Generating code from GRIP
- Some advice on using code produced by GRIP on your robot
- For the 2019 game:
- For the 2018 game:
- Vision Tracking in FRC — What I’ve Learned this Year
- Another team's sample code, including distance computation using the images
- GarnetSquadron4901/rpi-vision-processing
- Tower Tracker 1.0
- "What did you use?" thread on CD
- Notes from GoS on creating a GRIP pipeline
- Vedu's Blog - FRC Vision
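Several of the links above cover computing distance to a target from its apparent size in the image. A minimal sketch of the standard pinhole-camera approach, not taken from any of the linked projects; the FOV, image width, and target size below are made-up example numbers:

```python
# Estimate distance to a vision target from its apparent height in pixels.
# Generic pinhole-camera math; all parameters here are illustrative examples.
import math

def focal_length_px(image_width_px, horizontal_fov_deg):
    """Approximate focal length in pixels from the camera's horizontal FOV.

    For square pixels, the focal length in pixels is the same for the
    vertical axis, so it can be reused for height-based estimates.
    """
    return (image_width_px / 2.0) / math.tan(math.radians(horizontal_fov_deg) / 2.0)

def distance_to_target(target_height_m, target_height_px, focal_px):
    """distance = (real height * focal length in px) / (height in px)."""
    return target_height_m * focal_px / target_height_px

# Example: 640px-wide image, 60-degree horizontal FOV,
# a 0.5m-tall target that appears 100px tall in the frame.
f = focal_length_px(640, 60)         # roughly 554 px
d = distance_to_target(0.5, 100, f)  # roughly 2.77 m
```

Real implementations also need to account for camera tilt and lens distortion; this is just the core geometry.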
LED rings
- You'll generally want something really bright, tuned to a color that's unlikely to show up elsewhere on the field (so red and blue are bad choices, but green is pretty popular). You'll also want a ring sized to fit well around the camera/lens (60mm is a common diameter).
- The default URL for a USB camera plugged into the Rio is http://&lt;Rio address&gt;:1181/stream.mjpg. (For example, in the lab using radio C, that's http://10.26.56.31:1181/stream.mjpg.)
- Reading network table data on the robot (e.g., data from the DS)
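The camera URL above can be built from the team number, since FRC robot networks use the 10.TE.AM.x addressing scheme (team number split into two pairs). A small sketch; the default host octet of 2 assumes the roboRIO's usual static address, and the .31 in the lab example above is just what that radio handed out:

```python
# Build the mjpg stream URL for a USB camera plugged into the roboRIO.
# FRC networks use 10.TE.AM.x addresses: e.g. team 2656 -> 10.26.56.x,
# team 254 -> 10.2.54.x. host_octet=2 assumes the Rio's typical static IP.
def rio_stream_url(team_number, host_octet=2, port=1181):
    te, am = divmod(team_number, 100)
    return f"http://10.{te}.{am}.{host_octet}:{port}/stream.mjpg"

# Matches the lab example above:
# rio_stream_url(2656, 31) -> "http://10.26.56.31:1181/stream.mjpg"
```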
Specs for data when published by GRIP over HTTP
- Sample URL for data (assuming the table name is "quasicsContoursReport" and you're browsing from the same machine): http://localhost:2084/GRIP/data?quasicsContoursReport
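A sketch of fetching and picking through that report. The JSON layout assumed here (parallel arrays like `area`, `centerX`, `centerY` keyed per contour) mirrors GRIP's NetworkTables contours report, but open the URL in a browser to confirm what your GRIP version actually serves:

```python
# Fetch a GRIP contours report over HTTP and find the largest contour.
# The parallel-array JSON shape is an assumption based on GRIP's
# NetworkTables report format -- verify against your actual output.
import json
from urllib.request import urlopen

URL = "http://localhost:2084/GRIP/data?quasicsContoursReport"

def fetch_report(url=URL):
    """Fetch the report JSON from GRIP's HTTP publishing server."""
    with urlopen(url, timeout=1.0) as resp:
        return json.loads(resp.read())

def largest_contour_center(report):
    """Return (centerX, centerY) of the largest-area contour, or None."""
    areas = report.get("area", [])
    if not areas:
        return None
    i = max(range(len(areas)), key=lambda k: areas[k])
    return report["centerX"][i], report["centerY"][i]

# Offline example with a hand-made payload in the assumed shape:
sample = json.loads(
    '{"area": [120.0, 340.5], "centerX": [80.0, 200.0], "centerY": [60.0, 110.0]}'
)
# largest_contour_center(sample) -> (200.0, 110.0)
```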
Accessing NetworkTables data on the robot