Progress log - ImmersiveSystems/net-robo GitHub Wiki
Here we will keep track of our progress and issues we run into:
Nov 25
- Set up one Raspberry Pi
Nov 26
- Couldn't connect the RPi to the Internet via a Wi-Fi dongle, because the OS installed on that RPi didn't have the Wi-Fi Config software used to connect to wireless networks. We used Tim's RPi, which had a more up-to-date version of Raspbian, and connected it to the SFUNET Wi-Fi network.
- We are also able to connect the Raspberry Pi to a Wi-Fi network set up on George's router through the command line; however, this router is not connected to the SFU Internet yet. The Raspberry Pi has not yet been connected to the SFUNET-Secure or eduroam Wi-Fi networks, as they are secured with WPA2 Enterprise, which is much more complicated to connect to from a Unix-based OS.
- Set up George's router and enabled remote access to the RPi via SSH through the router
- Tried to program the Dagu motor controller. We faced two issues: the bootloader was either corrupted or absent, and the Arduino environment didn't support the ATmega168P microcontroller installed on the controller.
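Relevant to the WPA2 Enterprise issue above: on Raspbian these networks are usually joined by hand-writing a wpa_supplicant network block rather than using the Wi-Fi Config GUI. A minimal sketch, assuming PEAP/MSCHAPv2 (the EAP method, identity, and password here are placeholders, not SFU's actual settings):

```
network={
    ssid="eduroam"
    key_mgmt=WPA-EAP
    eap=PEAP
    phase2="auth=MSCHAPV2"
    identity="username@sfu.ca"
    password="REPLACE_ME"
}
```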
Nov 27
- Fixed the controller issues we faced on the 26th. See http://letsmakerobots.com/node/32096 and http://arduino.cc/en/Tutorial/ArduinoISP for more information.
- Connected the motor controller to the chassis
- Figured out battery was damaged
- Programmed the robot to spin its wheels over a connection from the Arduino IDE on a laptop to the motor controller on the chassis
- Installed Arduino IDE on the computers in the lab
Dec 3
- Purchased a router and a powered USB hub
- Managed to stream video from the RPi that can be viewed in a browser using this tutorial (the streaming is far from perfect)
Dec 4
- Set up new router and access to RPi via SSH
- Put wheels on robot and wired the controller with the chassis
- Programmed the controller to listen to input on serial port and spin wheels depending on the input
- Figured out that the controller, when powered from a wall plug without a battery, doesn't get enough power to drive the wheels
Dec 6
- Set up sails.js server
- Figured out how to charge the battery
Dec 13
(Paraphrased from George's Dec 13 status update)
- Mounted all the parts on the robot in a more visually appealing and stable manner.
- Set up a Node.js app which acts as the server so we can now effectively control the robot from any computer on the network through our website.
- We are working on streaming the camera feed, improving the robot controls, and creating an actual web platform.
- Since we are now able to control the robot through the local network from any computer, we would like to be able to connect our router to the internet. This will not only make development easier but it will also allow us to test the speed of our server to robot connection from the web.
Dec 14
- (George) Looked into streaming technologies and video encoding on the Pi
Dec 15
- (George) Set up an RTMP streaming server but was getting a 15-second delay in the video
Dec 16
- (George) Decided to use libav instead of FFmpeg to encode video, as FFmpeg takes 6 hours to compile on the Pi
- (George) Reduced the delay to under a second by using Flowplayer to view RTMP streams; however, the delay grows over time while using Flowplayer
Dec 17
- (George) Strobe Player doesn't have the delay-over-time problem when viewing the RTMP stream, but it requires an Internet connection (actually, Flowplayer does as well, though it has a workaround for when it's not connected)
- (George) Tried to migrate the Node.js app to the Sails.js server, but there were problems opening the socket connection with a Python client. No solution was found in the community, so we ultimately decided not to use Sails.js.
Dec 18
- (George) Investigated and played around with Meteor.js, Express, and MEAN as alternative server technologies. Ultimately decided to use a MEAN stack.
Dec 19
- (George) Set up a MEAN-stack server and confirmed sockets work with the Python client
Dec 20
- (George and Soheil) Investigated the best way of controlling the robot, in particular whether we can run it at a lower PWM by first ramping the PWM up until the wheels start moving and then rapidly dropping it to the desired level. This is doable. We also looked into why the left wheels receive voltage sooner than the right ones and found that this is an inherent problem with this motor controller (http://letsmakerobots.com/node/35666). To address it we would have to introduce a hack such as reducing the PWM on the left side or adding a voltage delay.
- (George) Re-organized robot layout
Dec 21
- (George) Add Bootstrap 3 and Socket.IO support to web app
Dec 22
- (George) View and UI changes for web app and update web app with changes from the main MEAN repository
Dec 23
- (George) View and UI changes for web app and update web app with changes from the main MEAN repository
Dec 27
- (George) GitHub repo housekeeping
- (George) View and UI changes for web app
Dec 28
- (George, Soheil and Tim) Improved robot controls by adding spot turning and shift turning. Robot is now able to drive laps around the lab using the stream and web controls.
Jan 7
- (Soheil and Tim) Tested new firmware
- (Tim) Implemented controls for the exploration mode
Jan 8
- (George) Set up the Raspberry Pi camera with ffmpeg. There's still roughly a 0.6-0.8 second delay.
Jan 9
- (George) Set up the Raspberry Pi camera with GStreamer. The delay is still there.
Jan 10
- (George and Tim) Investigated why the robot was acting weird. It turns out continually recharging and discharging the battery brought it back to normal operating levels. Need to adjust the code.
- (George) Reorganize robot
- (George) Tried to get the Logitech C920 camera working with the Raspberry Pi in hopes that the delay would be smaller. No luck.
Jan 11
- (George) The C920 camera works with ffmpeg with barely any delay. However, there are tons of artefacts in the stream. Set up a cross-compiler to build a newer ffmpeg; the new version does not fix the artefacts. Also, there is no way to adjust the camera's bitrate from ffmpeg.
Jan 12
- (George) Compiled GStreamer to see if it can remove the artefacts. GStreamer has a module that allows us to set the bitrate, but we did not get it working.
Jan 13
- (George) Finally got GStreamer working, but there are still tons of artefacts. It turns out the new firmware broke the H.264 parser on the Pi. Reverted to the old firmware and the artefacts are mostly gone. However, there are problems playing the stream in Strobe Player.
Jan 14
- (George) Recompiled and adjusted settings in the nginx RTMP module; the delay seems to be fixed.
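For the record, the nginx-rtmp-module settings that matter for latency look roughly like this (a sketch; the directive values here are illustrative, not the exact ones used):

```
rtmp {
    server {
        listen 1935;
        chunk_size 4096;       # smaller chunks are flushed to viewers sooner
        application live {
            live on;
            wait_key on;       # start each viewer on a keyframe
            interleave on;     # interleave audio/video in one chunk stream
        }
    }
}
```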
Jan 16
- (George) Run tests on 7.2 V battery and find optimal charging settings for motor controller
Jan 17
- (George) Organize new parts and attempt to put together new pan and tilt module
- (George) Start working on driving mode
Jan 18
- (George) Update website from forked repo on Github
- (George) Redesign website UI
Jan 19
- (George) Redesign website UI
Jan 20
- (George) Run tests on 12 V battery
- (George) Remount robot
Jan 21
- (George) Fried a Raspberry Pi. Make sure the battery switch is off when connecting the battery, as the negative lead might be connected first in the plug.
- (George) The robot is working well, the controls are smooth, and the stream is good. Stability is an issue, as the Wi-Fi and the Pi sometimes stop responding.
Jan 22
- (George) Fried the motor controller while testing the robot's stability; tried to fix it to no avail. The motor controller was fried because the heatsink insulator was misaligned, making the heatsink part of the circuit; an exposed USB connector then touched the heatsink and fried the controller.
Jan 23
- (George) Put together pan and tilt module
- (George) Try to fix broken motor controller
Jan 28
- (Tim) Wired the servos, encoders, and motors to the Arduino and the motor controller
- (Tim) Changed connector for the battery
Jan 29
- (Tim) Wired and tested pan and tilt platform and laser
- (Everyone) Reorganized components inside the robot