Quickstart Guide (Read me first!) - GeorgiaTech-DDI/makerspace_iot GitHub Wiki
To quickly set up an IoT box, read this page. To learn more about the project or previous work from the project, view the in-depth documentation.
The previous iteration of this project used a BeagleBone Black, but due to network connectivity issues and unreliable connections to the Teensy, it was replaced with a Raspberry Pi in the most recent iteration; however, much of the GitHub documentation may still reference the BBB.
💡Everything you want to know about the project!
Enable data collection for various machine parameters in Flower Invention Studio.
After going through various research papers, we realized there was no paper that provided the holistic mechanical, electrical, electronics, and computer-science knowledge needed to set up predictive maintenance in a makerspace environment. The predictive maintenance techniques used in industry are complex, proprietary, and expensive to implement.
As a PI, I want a way to know when the Cold Cut Saw is broken and to be aware of it as soon as possible. As a PI, I want a way to monitor the Cold Cut Saw in real time so I know when it needs maintenance.
We have built an IoT infrastructure that uses sensors to noninvasively gather data on machine parameters such as current, vibration, and temperature. These parameters are continuously measured and stored in a cloud data warehouse.
- Microcontroller: Teensy 4.0
- Microprocessor: Raspberry Pi
- Vibration sensor: MPU 6050
- Temperature sensor: MLX 90614
- Setting up the AWS pipeline
- Setup Circuit & Teensy
- Setup Raspberry Pi to connect to the Teensy
- Setup Raspberry Pi to connect to AWS
- Enclose everything
Within AWS there are 3 main services being used to establish the base pipeline.
- IoT Core
- Lambda
- DynamoDB
IoT Core: IoT Core acts as the 'middle man' between the Raspberry Pi and AWS: it receives the data from the Raspberry Pi and routes it to various services within AWS. Here is the step-by-step process for establishing the pipeline in IoT Core (the following was written in 2024, so some wording and button locations may have changed over the years).
- In IoT Core on AWS, in the left-hand menu select the drop-down under All devices (which is under Manage) and click on Things
- Select Create Thing
- Click Create Single thing, and give it a name (I did not change any of the default settings on the thing)
- When prompted for a device certificate, select Auto-generate a new certificate
- When prompted to create a policy, create one with the following policy document (note: allowing every action on every resource is wide open; it is fine for getting started, but consider scoping it down later):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "*",
      "Resource": "*"
    }
  ]
}
```

- Once you are done with that, select Create thing.
- After creating your thing, you will be prompted to download the certificates, public keys, and private keys for the thing. DOWNLOAD ALL OF THEM. This is an extremely important step: you will not have the chance to download them again, and they are vital for accessing IoT Core from the Raspberry Pi.
After downloading the certificates, the thing has now been created! Now we need to establish a way to send the data to a Lambda function.
- Still in IoT Core, select the drop-down under Message routing and select Rules. Hit Create rule.
- Give your rule a descriptive name and hit next.
- Using SQL version 2016-03-23, paste the following in the SQL statement box: `SELECT * FROM '/bbb_test'`. Note: `/bbb_test` can be changed to anything; we are just using the `/bbb_test` topic for this example.
- Under rule actions, select the Lambda option to send a message to a Lambda function, then select the Lambda function you would like to send the data to. (Note: you might have to create the Lambda function now.)
- Review the rule and hit create! Now you have created a way for data to be sent to IoT core, and a way for the data to be sent to a lambda function.
If you want to test and ensure data is being received by IoT Core, go to the MQTT test client in the left-hand menu and subscribe to the `/bbb_test` topic filter. Any data received on that topic will show up within the test client page.
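For testing, it helps to know what a message on the topic might look like. The helper below builds a sample payload; the field names (`assetID`, `time_stamp`, and the sensor readings) are assumptions for illustration, so match whatever fields your Teensy sketch actually emits.

```python
import json

def make_test_payload(asset_id: str) -> str:
    """Build a JSON payload shaped like the sensor readings.

    Field names are assumptions for illustration, not the project's exact schema.
    """
    reading = {
        "assetID": asset_id,
        "time_stamp": "2024-01-01 12:00:00",
        "temperature": 23.5,  # MLX90614 reading, degrees C (example value)
        "vibration": {"x": 0.01, "y": 0.02, "z": 0.98},  # MPU-6050 accel, g (example)
    }
    return json.dumps(reading)

if __name__ == "__main__":
    # Paste this output into the MQTT test client's publish tab on the
    # '/bbb_test' topic and it should appear under your subscription.
    print(make_test_payload("cold_cut_saw"))
```

Publishing a payload like this by hand lets you verify the IoT Core side of the pipeline before any hardware is wired up.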
Lambda: The overall functionality of the lambda function is simple. It takes in the data and then formats it and sends it to the DynamoDB table. Here's how you create your own Lambda function:
- Select Create function, then select Author from scratch. Give your function a name, and change the runtime to Python 3.12.
- I did not change any other settings of the lambda function. Now hit create function.
- After creating the function, all that is left is to put the code into it. Here is the link to the code on GitHub: Lambda Function Code. Simply copy the code into the Code source portion of the Lambda function.
- Make sure that after any change you make in the code, you save it and hit Deploy so the changes take effect.
After putting the code in the function, your Lambda function is now properly configured and ready to go!
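The linked GitHub code is the source of truth; purely as a hedged sketch of what a handler like this does, the version below formats the incoming MQTT event into an item whose keys match the DynamoDB table described next. The sensor field names beyond `time_stamp` and `assetID` are assumptions.

```python
import json

TABLE_NAME = "IoT_Box_Data"  # must match your DynamoDB table name

def format_item(event: dict) -> dict:
    """Map an incoming IoT Core event to a DynamoDB item.

    `time_stamp` and `assetID` mirror the table's partition and sort keys;
    the sensor fields are assumptions about the Teensy payload.
    """
    return {
        "time_stamp": str(event["time_stamp"]),
        "assetID": str(event["assetID"]),
        "temperature": str(event.get("temperature", "")),
        "vibration": json.dumps(event.get("vibration", {})),
    }

def lambda_handler(event, context):
    # boto3 is preinstalled in the Lambda Python runtime; imported here so
    # the formatting helper above stays testable without AWS dependencies.
    import boto3
    boto3.resource("dynamodb").Table(TABLE_NAME).put_item(Item=format_item(event))
    return {"statusCode": 200}
```

The IoT Core rule action delivers the MQTT payload as the `event` dict, so no extra parsing is needed before writing to the table.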
DynamoDB: The process of creating the DynamoDB table is pretty straightforward. Here is a step-by-step guide on how to create and set it up.
- On the DynamoDB page, hit create table. Give your table a name (in the lambda function code, we have the DynamoDB table named IoT_Box_Data, but you can name it anything as long as you change this value too in the lambda function)
- When creating the table, set the partition key to `time_stamp` (String) and the sort key to `assetID` (String). Use the default settings and hit Create table.
- Congrats! You have now established the entire AWS pipeline.
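Once data is flowing, you can spot-check the table from any machine with AWS credentials configured. The sketch below assumes the table and key names above; a scan with a filter is used because `assetID` is the sort key rather than the partition key.

```python
def build_scan_kwargs(table_name: str, asset_id: str) -> dict:
    """Assemble low-level DynamoDB scan arguments filtering on assetID."""
    return {
        "TableName": table_name,
        "FilterExpression": "assetID = :a",
        "ExpressionAttributeValues": {":a": {"S": asset_id}},
    }

def main():
    # Call main() on a machine with AWS credentials (e.g. after `aws configure`).
    import boto3
    client = boto3.client("dynamodb")
    resp = client.scan(**build_scan_kwargs("IoT_Box_Data", "cold_cut_saw"))
    for item in resp["Items"]:
        print(item["time_stamp"]["S"], item.get("temperature"))
```

Note that scans read the whole table; for regular querying, the API described later in this guide is the intended path.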
- Assemble all relevant parts:
- MPU 6050
- MLX 90614
- Teensy 4.0
- Raspberry Pi 4 Model B
- Place the parts into a circuit together, ensuring SCL is on Pin 19 and SDA is on Pin 18 if using a Teensy 4.0. Align all the SCL, SDA, GND, and VCC connections (make sure to use the 3.3V connection). There is a circuit diagram for reference, although it is slightly inaccurate in its wiring, which is why it has text annotations. ENSURE ALL CONNECTIONS ARE SOLDERED PROPERLY. There is also a PCB file available (made in Eagle) here.
- Flash the Teensy with this code. Be sure to change the assetID to the machine you want and change the delay to how often you want to collect data points (both are at the end of the script). You will need to install the appropriate libraries:
- Adafruit_MLX90614.h
- ArduinoJson.h
- Wire.h
- Install Raspbian or your operating system of choice (preferably Linux-based).
- After creating the Teensy circuit, connect the Teensy to the Raspberry Pi by plugging it in with a USB connection.
- To test the Teensy connection, make sure to create a Python virtual environment and install minicom (shown in the next step); then you can run this script. Be sure to change the serial port to `/dev/ttyACM0` and the baud rate to 57600.
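As a rough sketch of such a serial test, the code below reads a few lines from the port named above. It assumes the Teensy prints one JSON object per line, which is how an ArduinoJson-based sketch would typically emit readings; adjust if your sketch prints a different format.

```python
import json

def parse_line(raw: bytes):
    """Decode one serial line from the Teensy.

    Returns a dict, or None if the line is not valid JSON
    (e.g. a partial first read after opening the port).
    """
    try:
        return json.loads(raw.decode("utf-8").strip())
    except (UnicodeDecodeError, ValueError):
        return None

def main():
    # Call main() on the Pi itself; requires `pip install pyserial`.
    import serial
    with serial.Serial("/dev/ttyACM0", baudrate=57600, timeout=5) as port:
        for _ in range(10):  # read a handful of lines as a sanity check
            reading = parse_line(port.readline())
            if reading is not None:
                print(reading)
```

If nothing prints, check the port name with `ls /dev/ttyACM*` and confirm the baud rate matches the Teensy sketch.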
- Install dependencies in a Python environment: `python3 -m venv IoT`
- Activate the Python virtual environment: `source ~/IoT/bin/activate` (this depends on the path of your virtual environment, so don't copy and paste it verbatim)
- Install the Python dependencies now that the environment is active: `sudo apt-get install minicom`, then `pip install AWSIoTPythonSDK` and `pip install pyserial`
Now you can run this script to connect to AWS after entering the token values for your AWS connection.
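The linked script is the authoritative version; below is only a hedged sketch of the connection it performs, using the AWSIoTPythonSDK installed above. The endpoint, certificate filenames, and payload field names are placeholders and assumptions you must replace with your own values.

```python
import json
import time

TOPIC = "/bbb_test"  # must match the topic in your IoT Core rule's SQL statement

def make_message(asset_id: str, temperature: float) -> str:
    """Serialize one reading as JSON (field names are assumptions)."""
    return json.dumps({
        "assetID": asset_id,
        "time_stamp": time.strftime("%Y-%m-%d %H:%M:%S"),
        "temperature": temperature,
    })

def main():
    # Call main() once the placeholder endpoint and the certificate/key files
    # downloaded during thing creation have been filled in.
    from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
    client = AWSIoTMQTTClient("iot_box")  # client ID is arbitrary
    client.configureEndpoint("YOUR-ENDPOINT-ats.iot.us-east-1.amazonaws.com", 8883)
    client.configureCredentials("root-CA.crt", "private.pem.key", "certificate.pem.crt")
    client.connect()
    client.publish(TOPIC, make_message("cold_cut_saw", 23.5), 1)  # QoS 1
    client.disconnect()
```

Your IoT endpoint is shown under Settings in the IoT Core console; after running, the message should appear in the MQTT test client if you are subscribed to the topic.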
- To ensure that the script keeps running, create a tmux session and a shell script that starts the tmux session on reboot:
  - Install tmux: `sudo apt install tmux`
  - Create the shell script: `nano ~/start_tmux_session.sh`
  - In the shell script, first activate your Python virtual environment: `. /home/VIP/IoT/bin/activate`
  - On a new line, start the AWS Python script in a detached tmux session (the path depends on where your script lives): `tmux new-session -d -s myscript 'python3 /home/VIP/Desktop/serialToAWS.py'`
  - Exit and save, then make the script executable: `chmod +x ~/start_tmux_session.sh`
  - To create the cron job, type `crontab -e` and add this line at the bottom: `@reboot /bin/bash -c "sleep 10 && /home/VIP/start_tmux_session.sh"`. Exit and save; this cron job will execute the shell script to start the tmux session on reboot.
- Test by rebooting: `sudo reboot`. Some useful commands:
  - `tmux ls`: shows all tmux sessions that are running
  - `tmux attach -t myscript`: attaches to the tmux session and shows the data stream
  - `Ctrl + b`, then `d`: detaches from a tmux session
- Connect via SSH: to connect, just type something like `ssh [email protected]`, and to exit the SSH session, type `exit`.
For a full in-depth guide on how to use the API, reference the following page.
Brief API use guide:
Here is a brief guide on how to utilize the API. The API currently has many functions you can call to query the database in different ways. For a simple call to get values with a certain AssetID, use the following:
Simply edit `asset='...'` with whatever value you want, and change the date to the date you would like. The part represents the part of the day you would like to get data from: part 1 is midnight (12 AM) to noon (12 PM), and part 2 is noon (12 PM) to midnight of the next day.
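Putting that together, a call from Python might look like the sketch below. The invoke URL is a placeholder (the following note explains how to find the real one in API Gateway), and the parameter names `asset`, `date`, and `part` are taken from the description above.

```python
def build_query(asset: str, date: str, part: int) -> dict:
    """Assemble the query parameters: part 1 = midnight to noon, part 2 = noon to midnight."""
    if part not in (1, 2):
        raise ValueError("part must be 1 or 2")
    return {"asset": asset, "date": date, "part": part}

def main():
    # Call main() with the real invoke URL; requires `pip install requests`.
    import requests
    url = "https://EXAMPLE.execute-api.us-east-1.amazonaws.com/prod"  # placeholder
    resp = requests.get(url, params=build_query("cold_cut_saw", "2024-01-01", 1))
    print(resp.json())
```
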
If the above link is not working, the link to call the API may have changed over time. To find the updated link, go to AWS API Gateway in the AWS console. Hit stages in the left menu, and look at the invoke URL. This is the URL used to call the API.
- Know everything about vibration sensor MPU-6050: https://docs.google.com/document/d/1r2neOi8uGhan92e9DjsKKtxxqDJiMkUF84l4lC0W1Pw/edit?usp=sharing
- Know everything about temperature sensor MLX90614: https://docs.google.com/document/d/1phhB3WFZrKpIGyd0-lIQMWgj1KDxEgaFxqxmANvgIL0/edit?usp=sharing
| Analytics | Cloud/AWS | Sensor |
|---|---|---|
| Goal 1: To analyze the gathered data and work towards predictive maintenance. | Goal 1: Handle process from IoT Core to Dashboard. | Goal 1: Provide a sensor suite that can record machine data. |
| Goal 2: Scale monitoring to multiple machines. | Goal 2: Create and improve firmware for data transfer to cloud. | |