Google Form Response Intake
Regarding email addresses: currently the only way to link Google Forms responses to a particular user is via manually entered email addresses. The only way around this (AFAIK) is to either use a different form provider or use the Apps Script form-submit trigger, which would have to store the response somewhere (i.e. spin up the database). That's probably the correct way to do things if/when I actually have users, but for now I want to keep the database off most of the time, so manual email addresses it is.
Currently the project is set up to store Google Form responses in a Google Sheets spreadsheet as they come in. A more robust option in the future is to use a Google Apps Script trigger to send form responses directly to the database when they're submitted, but that would require keeping the database instance running, which costs money. For the time being, a script that grabs everything new from the spreadsheet and stores it in the database is going to be the solution.
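
Roughly, the kind of script I have in mind looks like the sketch below. This is just a sketch, not the project's actual code: the sheet name, column headers, database URL, and `FormResponse` model are all made-up placeholders, and it assumes gspread for the Sheets side and SQLAlchemy for the database side.

```python
"""Rough sketch of the manual sync script (placeholder names throughout)."""
from datetime import datetime

import gspread
from sqlalchemy import Column, DateTime, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()


class FormResponse(Base):
    """Placeholder model for one Google Form submission."""
    __tablename__ = "form_responses"

    id = Column(Integer, primary_key=True)
    submitted_at = Column(DateTime, unique=True)  # form timestamp, used as the de-dup key
    email = Column(String)                        # manually entered email, links the response to a user
    mood = Column(String)                         # stand-in for whatever the form actually asks


def sync_sheet_to_db(sheet_name="Form Responses", db_url="postgresql://localhost/life_tracker"):
    engine = create_engine(db_url)
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()

    # Pull every row from the sheet; for a small personal form this is fine.
    worksheet = gspread.service_account(filename="creds.json").open(sheet_name).sheet1
    rows = worksheet.get_all_records()  # list of dicts keyed by the header row

    # Only insert rows newer than what's already in the database.
    latest = session.query(FormResponse.submitted_at).order_by(
        FormResponse.submitted_at.desc()
    ).first()
    latest = latest[0] if latest else datetime.min

    for row in rows:
        # Assumes the default "M/D/YYYY HH:MM:SS" timestamp format Forms writes to Sheets.
        submitted_at = datetime.strptime(row["Timestamp"], "%m/%d/%Y %H:%M:%S")
        if submitted_at <= latest:
            continue
        session.add(FormResponse(
            submitted_at=submitted_at,
            email=row["Email Address"].strip().lower(),
            mood=row["Mood"],
        ))

    session.commit()


if __name__ == "__main__":
    sync_sheet_to_db()
```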
UPDATE: Cloud SQL instances can be started/stopped programmatically as detailed here (a rough sketch of what that looks like follows the list below). For the moment I'm still going to use a Python script, run locally and manually, to grab everything from the Google Sheet(s) and do the ETL, for two reasons:
- It's simpler, and I don't really need it to be automated. The data in the database only needs to be current while I'm actually looking at it, so until there's a web interface and/or additional users, any time I need to see the data I can just run the script real quick first to update the database.
- Doing this with Apps Script would mean doing the ETL in JavaScript inside the Apps Script project, which would also require keeping a copy of the database models there, i.e. having the models defined in two places at once, which is obviously not a good solution. I'd rather have everything in Python and in the project repo on GitHub. So far the only exceptions are where I'm using GAS as a job scheduler that lives in the cloud instead of on my machine, and I'd like to keep it that way.
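
For reference, the programmatic start/stop mentioned in the update above boils down to patching the instance's activation policy through the Cloud SQL Admin API. A minimal sketch, assuming Application Default Credentials are configured and the google-api-python-client library is installed; the project and instance names are placeholders:

```python
# Start or stop a Cloud SQL instance by patching its activation policy
# via the Cloud SQL Admin API. Project/instance names are placeholders.
from googleapiclient import discovery


def set_instance_state(running, project="my-project", instance="life-tracker-db"):
    service = discovery.build("sqladmin", "v1beta4")
    body = {"settings": {"activationPolicy": "ALWAYS" if running else "NEVER"}}
    # patch() only changes the fields supplied in `body`; everything else is left alone.
    return service.instances().patch(project=project, instance=instance, body=body).execute()


# Example: spin the instance up before a sync, shut it down afterward.
# set_instance_state(True)
# ...run the sync script...
# set_instance_state(False)
```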
If and when the project turns into a real app with a website, I may move everything onto Google App Engine or something similar, at which point automating data intake from start to finish will make more sense.