# GCP Cloud Composer
CI/CD
- https://medium.com/@amarachi.ogu/implementing-ci-cd-in-cloud-composer-using-cloud-build-and-github-part-2-a721e4ed53da
- https://engineering.adwerx.com/sync-a-github-repo-to-your-gcp-composer-airflow-dags-folder-2b87eb065915
- https://cloud.google.com/composer/docs/dag-cicd-integration-guide
DBT
- https://medium.com/@montadhar/how-to-setup-cloud-composer-with-dbt-f9f657eb392a
- https://discourse.getdbt.com/t/how-to-run-dbt-on-cloud-composer-and-authenticate-the-service-account/4646
- https://mastheadata.com/how-to-run-dbt-airflow-on-google-cloud/
- https://discourse.getdbt.com/t/running-dbt-in-composer-using-a-kubernetespodoperator/2590
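One approach covered in the links above is to install dbt directly into the Composer environment as a PyPI package (another is to run it inside a KubernetesPodOperator). A minimal sketch of the PyPI route, with the adapter package `dbt-bigquery` as an assumption and the environment name/location reused from the commands later on this page:

```bash
# Install dbt into the Composer environment as a PyPI package.
# dbt-bigquery is an assumed adapter choice; pin a version in practice.
gcloud composer environments update composer-advanced-lab \
  --location europe-west3 \
  --update-pypi-package dbt-bigquery
```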
Cloud Composer uses Cloud Storage to store Apache Airflow DAGs (workflows). Each environment has an associated Cloud Storage bucket, and Cloud Composer schedules only the DAGs placed in that bucket's `dags/` folder.
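For example, a DAG file can be uploaded into that folder with gcloud; the file name below is just a placeholder, the environment name and location are the ones used in the commands that follow:

```bash
# Upload a DAG file into the environment's dags/ folder; Composer picks it up
# automatically on the next scheduler scan.
gcloud composer environments storage dags import \
  --environment composer-advanced-lab \
  --location europe-west3 \
  --source my_dag.py
```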
Create the source and destination Cloud Storage buckets:

```bash
# Create source and destination buckets (120-minute retention, Standard class).
export PROJECT_ID=$DEVSHELL_PROJECT_ID
export DEST_BUCKET_NAME="$PROJECT_ID-destination"
export SRC_BUCKET_NAME="$PROJECT_ID-source"

echo "Making bucket: gs://$DEST_BUCKET_NAME"
gsutil mb --retention 120m -c standard -l europe-west3 -p $PROJECT_ID gs://$DEST_BUCKET_NAME

echo "Making bucket: gs://$SRC_BUCKET_NAME"
gsutil mb --retention 120m -c standard -l us-central1 -p $PROJECT_ID gs://$SRC_BUCKET_NAME
```
Set the Airflow variables the DAG reads (the table list file path and the source/destination buckets), then verify one of them. Use the location of your Composer environment:

```bash
# Set the Airflow variables used by the copy DAG.
gcloud composer environments run composer-advanced-lab \
  --location europe-west3 variables -- \
  --set table_list_file_path /home/airflow/gcs/dags/bq_copy_eu_to_us_sample.csv

gcloud composer environments run composer-advanced-lab \
  --location europe-west3 variables -- \
  --set gcs_source_bucket $SRC_BUCKET_NAME

gcloud composer environments run composer-advanced-lab \
  --location europe-west3 variables -- \
  --set gcs_dest_bucket $DEST_BUCKET_NAME

# Verify a variable value.
gcloud composer environments run composer-advanced-lab \
  --location europe-west3 variables -- \
  --get gcs_source_bucket
```
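The `table_list_file_path` above points at `/home/airflow/gcs/dags/`, which is how the environment bucket's `dags/` folder is mounted inside the Airflow workers, so the CSV has to be uploaded to that folder. A minimal sketch, assuming the CSV file named in the variable above sits in the current directory and using `dagGcsPrefix` to look up the environment's bucket:

```bash
# Find the environment's DAG prefix (gs://<environment-bucket>/dags).
DAGS_PREFIX=$(gcloud composer environments describe composer-advanced-lab \
  --location europe-west3 \
  --format="get(config.dagGcsPrefix)")

# Upload the table list CSV next to the DAGs so that
# /home/airflow/gcs/dags/bq_copy_eu_to_us_sample.csv resolves inside Airflow.
gsutil cp bq_copy_eu_to_us_sample.csv "$DAGS_PREFIX/"
```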