Client Meeting November 1st

Controlling a robotic actuator from visual camera data is out of scope for this project, since it would involve Master's-level complexity; our demo will therefore focus on flying to a location rather than flying and grabbing objects. The CDR has been moved to two weeks after reading week. During reading week and the first week back, everyone will be making schematics and block diagrams in preparation for the CDR. Jaden and Adam can provide slides on the artificial intelligence programs that will be put on the quadcopter after our involvement ends.

Regarding sensors, we require a gyroscopic stability sensor, an FSR, ultrasound proximity sensors, and an accelerometer; are there any other sensors required by the lab? Do we care about acoustic noise from propeller motion affecting the ultrasound sensors? Yes, a little, but many of those concerns are handled in hardware. Ultrasound is fine for proximity sensing: with IR sensors we could run into the issue of the IR signal passing through glass, giving the quadcopter a false picture of its environment and making it think it can pilot through a pane of glass. The choice of ultrasound sensors is fine for our purposes.

Mike raised concerns about setting up the kernel and getting the BeagleBone, running Linux, to display on a monitor via an adapter. So far this has been tried on two monitors without success.
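
Since the proximity sensing will be ultrasound-based, below is a minimal sketch of how a single distance reading might be taken on the BeagleBone under Linux. The sensor model (an HC-SR04-style trigger/echo module), the Adafruit_BBIO library, and the pin assignments P9_12/P9_15 are illustrative assumptions, not decisions made in this meeting.

```python
# Minimal sketch (assumed hardware): reading an HC-SR04-style ultrasonic
# proximity sensor on a BeagleBone with the Adafruit_BBIO library.
# Pin choices P9_12 / P9_15 are placeholders, not the final wiring.
# Note: BeagleBone GPIO is 3.3 V, so a 5 V echo line needs level shifting.
import time
import Adafruit_BBIO.GPIO as GPIO

TRIG = "P9_12"  # drives the sensor's trigger line (assumed pin)
ECHO = "P9_15"  # reads the sensor's echo line (assumed pin)

GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm():
    """Send a ~10 us trigger pulse and time the width of the echo pulse."""
    GPIO.output(TRIG, GPIO.LOW)
    time.sleep(0.000002)
    GPIO.output(TRIG, GPIO.HIGH)
    time.sleep(0.00001)           # ~10 us trigger pulse
    GPIO.output(TRIG, GPIO.LOW)

    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO) == 0:  # wait for echo pulse to start
        pulse_start = time.time()
    while GPIO.input(ECHO) == 1:  # wait for echo pulse to end
        pulse_end = time.time()

    # Speed of sound ~343 m/s (34300 cm/s); halve for the out-and-back trip.
    return (pulse_end - pulse_start) * 34300 / 2

if __name__ == "__main__":
    print("Distance: %.1f cm" % read_distance_cm())
```

This polling approach is sensitive to scheduler jitter under Linux, so readings would be approximate; it should be enough for a bench test, with tighter timing left to hardware or a dedicated driver later.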

Regarding budget, we are approaching the $500 price point rather quickly and may need additional funding.