GCP Labs
Simplify VM setup in Google Compute Engine with a few options
a. Startup Scripts - Search for VM Instances, create an instance, and paste the startup script below under Management > Automation > Startup script:
```
#!/bin/bash
apt update
apt -y install apache2
echo "Hello world from $(hostname) $(hostname -I)" > /var/www/html/index.html
```
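The same VM can also be created non-interactively; a minimal gcloud sketch, assuming the startup script above is saved locally as startup.sh (instance name, zone and image are hypothetical):
```
gcloud compute instances create web-vm-1 \
  --zone=us-central1-a \
  --machine-type=e2-micro \
  --image-family=debian-11 --image-project=debian-cloud \
  --metadata-from-file=startup-script=startup.sh
```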
b. Instance Template - Creating VM instances this way is much easier than repeating startup scripts: you define the machine type, image, labels, startup script and other properties once, and use the template to create VM instances and Managed Instance Groups.
- Search "Instance templates" in the GCP Console, click Create Instance Template, fill in the name and other details, and create the instance template.
- Once the instance template is created, click Actions to create a VM or a Managed Instance Group from it (a gcloud sketch follows below).
- Once the VM is created you can check it under VM Instances.
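For reference, a minimal gcloud sketch of the same flow (template and MIG names are hypothetical; startup.sh is the script above):
```
gcloud compute instance-templates create web-template \
  --machine-type=e2-micro \
  --image-family=debian-11 --image-project=debian-cloud \
  --metadata-from-file=startup-script=startup.sh

gcloud compute instance-groups managed create web-mig \
  --template=web-template --size=2 --zone=us-central1-a
```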
c. Custom Image - If you install a lot of software via a startup script and create VMs from the console, each VM takes a long time to come up. A custom image reduces this time: create an instance, preinstall the software, and then create a custom image from it.
Create a custom image and create a template out of it.
A custom image can be created from a VM instance, a persistent disk, a snapshot, another image, or a file in Cloud Storage.
- Create a VM instance, go to its disk, then select Actions > Create Image. Review the config and change it if required; you need to stop the VM before you can create an image.
- Check the created image under Images and use Actions > Create Instance.
- You can also create an instance template from the custom image created in the steps above (see the gcloud sketch below).
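A minimal gcloud sketch of the same idea, assuming the VM web-vm-1 from the earlier sketch is stopped (image, template and project names are hypothetical):
```
gcloud compute images create web-custom-image \
  --source-disk=web-vm-1 --source-disk-zone=us-central1-a

gcloud compute instance-templates create web-template-v2 \
  --image=web-custom-image --image-project=my-project-id \
  --machine-type=e2-micro
```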
Build and launch a Spring Boot Java app from Cloud Shell
Creating a Managed Instance Group and using it as a load-balancing backend service.
An instance template is mandatory for creating a Managed Instance Group.
Configure autoscaling to automatically adjust the number of instances based on load:
- Minimum number of instances
- Maximum number of instances
- Autoscaling metrics: CPU utilization target, load balancer utilization target, or any other metric from Cloud Monitoring (Stackdriver)
- Cool-down period: how long to wait before looking at the autoscaling metrics again. Also scale-in controls, autohealing, and a health check with its criteria (a gcloud sketch follows below).
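A minimal gcloud sketch of such an autoscaling policy (MIG name and target values are hypothetical):
```
gcloud compute instance-groups managed set-autoscaling web-mig \
  --zone=us-central1-a \
  --min-num-replicas=2 --max-num-replicas=6 \
  --target-cpu-utilization=0.60 \
  --cool-down-period=120
```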
Updating Managed Instance Group (MIG)
By updating the MIG we can set up canary testing or a gradual rollout of the instances to a new version of the software via a new instance template.
Create a Load Balancer and load balance across all instances of the Managed Instance Group.
NOTE: Use the Premium network tier for a global LB where the 3 microservice MIGs are in different regions; the Standard tier is for an LB where the 3 microservice MIGs are in the same region.
For a microservice architecture you create multiple MIGs, each running a specific microservice, and then load balance across them by creating multiple host and path rules.
You configure 3 things as part of the LB: the backend service, the frontend (which IP you hit for the LB), and the host/path rules.
Here we create a backend service and attach the Managed Instance Group we created earlier. If you have multiple microservices you would create multiple backend services, each with its MIG attached, and also set up different host/path rules for each microservice.
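A minimal gcloud sketch of the three LB pieces wired together for a single MIG (all names are hypothetical; host/path rules for further microservices would be added to the URL map):
```
gcloud compute backend-services create web-backend \
  --protocol=HTTP --health-checks=health-check --global
gcloud compute backend-services add-backend web-backend \
  --instance-group=web-mig --instance-group-zone=us-central1-a --global
gcloud compute url-maps create web-lb --default-service=web-backend
gcloud compute target-http-proxies create web-proxy --url-map=web-lb
gcloud compute forwarding-rules create web-frontend \
  --global --target-http-proxy=web-proxy --ports=80
```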
Create a Rolling Update - a gradual update of the instances in an instance group to a new instance template.
HA for Compute (Learn with Mahesh : GCP Professional cloud architect certification)
**Compute Engine** - The workload is to be deployed in 2 different zones so that if one goes down the other can still serve.
Solution: Create a Managed Instance Group and select the multi-zone option so that it deploys across multiple zones. Also enable autoscaling and use a minimum of 3 zones.
Achieving HA in Managed Instance Group
We can set up an LB to distribute load evenly and autoscaling to cater for constantly rising traffic; for high availability a Managed Instance Group also provides autohealing and auto-updating configurations.
**Autohealing** - observes instances and replaces unhealthy instances with healthy ones. **Auto updating** - updates instance software or patches without downtime.
To configure a MIG for HA we need to set up a health check, which enables autohealing: the MIG probes the instances, observes failures such as HTTP 500 responses, and replaces unhealthy instances with healthy ones.
Create a health check named "health-check" in Compute Engine with the health criteria shown below:
Name: health-check
Check interval: 10 sec; Timeout: 5 sec (wait 5 seconds for a response to the probe)
Healthy threshold: 2 consecutive successes
Unhealthy threshold: 3 consecutive failures
Now go to Instance Group > Edit > Autohealing, set a 90-second initial delay, and select the "health-check" health check you created above.
Now test by going to a VM instance and simulating a failure.
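For reference, a minimal gcloud sketch of the same health check and autohealing setup (the MIG name is hypothetical):
```
gcloud compute health-checks create http health-check \
  --port=80 --check-interval=10s --timeout=5s \
  --healthy-threshold=2 --unhealthy-threshold=3

gcloud compute instance-groups managed update web-mig \
  --zone=us-central1-a \
  --health-check=health-check --initial-delay=90s
```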
For auto updating (applying updates to instances without downtime), the MIG lets you control the speed and scope of the rollout so that the service stays up. Partial rollouts are also possible via canary testing.
Instance Group > Edit > select Rolling update (meaning a gradual update) and provide 20%, so that share of the group is moved to newly created instances first. The update mode can be proactive (can be disruptive, since the MIG proactively deletes and recreates instances) or opportunistic (updates happen only when you manually restart a server or when a new server is automatically started by the MIG).
Maximum surge - the maximum number of temporary instances to add while updating.
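A minimal gcloud sketch of a canary rolling update, assuming the hypothetical templates web-template (current) and web-template-v2 (new) from the earlier sketches:
```
gcloud compute instance-groups managed rolling-action start-update web-mig \
  --zone=us-central1-a \
  --version=template=web-template \
  --canary-version=template=web-template-v2,target-size=20% \
  --max-surge=1 --max-unavailable=0
```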
Google App Engine
Search for App Engine and create an App Engine application, specifying a region and language. NOTE: A project can have only one App Engine application.
Also search for the App Engine Admin API and enable it.
Now open Cloud Shell in a new window and upload the Python sample by dragging and dropping D:\GCP\In28minutes_GCP_Architect_course\downloads\downloads\app-engine.
cd into /default-service and run > gcloud app deploy
(If you get a permission issue, take the email id and the resource for which permission is denied from the error, go to IAM, select that email id, add another role, and grant it access to that resource.)
NOTE: Whenever we create an App Engine application a new service account is created. When we deploy, the package is stored in Google Cloud Storage and App Engine tries to access it via that service account, so if App Engine fails to pick up the deployed package we need to grant the service account permission to fetch it. That is why we go to IAM and give it the additional role.
To deploy another version
cd into /default-service, modify main.py, and then deploy again:
gcloud app deploy --version=v2
gcloud app versions list (lists the 2 versions)
```
SERVICE  VERSION.ID       TRAFFIC_SPLIT  LAST_DEPLOYED              SERVING_STATUS
default  20210825t110417  0.00           2021-08-25T11:05:58+00:00  SERVING
default  v2               1.00           2021-08-25T11:37:41+00:00  SERVING
```
Now find the URL for the instance 20210825t110417:
```
duvarakan1950@cloudshell:~/default-service (labvidyan)$ gcloud app browse --version="20210825t110417"
Did not detect your browser. Go to this link to view your app: https://20210825t110417-dot-labvidyan.et.r.appspot.com
```
If we have 2 versions of a service running, how do we split the traffic between these 2 versions?
Make changes to the code and deploy the new version, but without sending traffic to it:
gcloud app deploy --version=v3 --no-promote
List the instances: gcloud app instances list
To test v3: gcloud app browse --version v3, get the URL and test it: https://v3-dot-labvidyan.et.r.appspot.com
Now split the traffic between v2 and v3; by default the split is by IP address:
gcloud app services set-traffic --splits=v3=.5,v2=.5
Take the URL of v2 and check it in Cloud Shell: watch curl https://v2-dot-labvidyan.et.r.appspot.com/
Split traffic randomly: gcloud app services set-traffic --splits=v3=.5,v2=.5 --split-by=random
```
gcloud app deploy
gcloud app services list
gcloud app instances list
gcloud app deploy --version=v2
```
Todo Labs
- Create different Compute Engine MIGs and set up a load balancer to serve the different microservices.
- Create different microservice instances using Google App Engine and set up a load balancer to serve the different microservices.
You may try using the code below.
https://www.youtube.com/watch?v=UFZCnhJtYd8 (Learn with Mahesh youtube channel)
https://github.com/wwwtyro/Astray
```
apt update
apt install -y git
apt install -y apache2
cd /var/www/html
git clone https://github.com/learngcpwithmahesh/MountkirkGames.git
```
Cloud Pub/Sub
Created a topic, set up a pull subscription, published a message, and was able to see the messages in the subscription.
Reference : Udemy (Renga course) - GCP Professional Cloud Architect: Google Cloud Certification, Session : 223 step 06
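The same lab can be reproduced from the command line; a minimal gcloud sketch (topic and subscription names are hypothetical):
```
gcloud pubsub topics create demo-topic
gcloud pubsub subscriptions create demo-sub --topic=demo-topic
gcloud pubsub topics publish demo-topic --message="hello from the lab"
gcloud pubsub subscriptions pull demo-sub --auto-ack --limit=5
```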
Cloud DataFlow
I tried the 2 templates below successfully.
Pub/Sub to Text Files on Cloud Storage
Created a Pub/Sub topic and a Cloud Storage bucket, and set up a Dataflow job to read from the topic and write the resulting files to Cloud Storage (using Dataflow > Create job from template).
Streaming pipeline. Reads records from Pub/Sub and writes them to Cloud Storage, creating a text file for each five-minute window. Note that this pipeline assumes no newlines in the body of the Pub/Sub message, so each message becomes a single line in the output file.
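A minimal gcloud sketch of running this template from the command line (job, project, topic and bucket names are hypothetical; the template path assumes the public dataflow-templates bucket):
```
gcloud dataflow jobs run pubsub-to-gcs \
  --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_GCS_Text \
  --region=us-central1 \
  --parameters=inputTopic=projects/my-project-id/topics/demo-topic,outputDirectory=gs://my-demo-bucket/output/,outputFilenamePrefix=out-
```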
Text Files on Cloud Storage to Pub/Sub
Created a Pub/Sub topic and a Cloud Storage bucket, and set up a Dataflow job to read the text file from Cloud Storage and post it to the Pub/Sub topic (using Dataflow > Create job from template).
A pipeline that polls every 10 seconds for new text files stored in Cloud Storage and outputs each line to a Pub/Sub topic.
BigQuery
- Created a BigQuery schema and a table, and imported a .csv file into the table (created the table with field names that match the field names in the .csv file); see the bq sketch after this list.
- Set up a "Data Transfer Service" job in BigQuery to load data from Datastore and insert it into a BigQuery table, also publishing a message to Pub/Sub and sending a mail to the admin. It is easy, but it failed due to a permission issue while trying to query BigQuery.
Service Account
- We create a service account with only object create and view permissions.
- Add this service account while creating a VM and attach it.
- SSH in and try creating a bucket; you will not be able to, because you only have object create and view permissions.
https://www.youtube.com/watch?v=qXuw--126Bk
Primitive roles (Owner, Editor, Viewer) - we generally don't use these.
Predefined roles (we can assign these to a service account).
Custom roles (if we don't find the permissions we need in the predefined roles, we create these; they give much more granular permissions than predefined roles). A gcloud sketch follows below.
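A minimal gcloud sketch of the service-account lab above (project, account and VM names are hypothetical):
```
gcloud iam service-accounts create storage-writer-sa
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:storage-writer-sa@my-project-id.iam.gserviceaccount.com" \
  --role="roles/storage.objectCreator"
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:storage-writer-sa@my-project-id.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"
gcloud compute instances create sa-test-vm --zone=us-central1-a \
  --service-account=storage-writer-sa@my-project-id.iam.gserviceaccount.com \
  --scopes=cloud-platform
```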
Todo
Install a WordPress app in GCP and also automate it using Jenkins.
Deploy Spring Boot App in AppEngine
You may follow javatechie or, for simplicity, just follow https://codelabs.developers.google.com/codelabs/cloud-app-engine-springboot/#0. Reference: https://www.youtube.com/watch?v=0zJUK-SwPqQ
Deployed a Hello World microservice in App Engine. The steps followed were:
- Install the Cloud SDK installer from https://dl.google.com/dl/cloudsdk/channels/rapid/GoogleCloudSDKInstaller.exe (see https://cloud.google.com/sdk/docs/install for the steps).
- Log in to GCP > App Engine and create an App Engine application, or do it from the command line: gcloud app create --region us-central
- Create a sample project and import it into STS. You will need to build it (clean install), run the app locally (spring-boot:run), and deploy it to App Engine (appengine:deploy). If the STS build fails, generate the project on the command line and deploy from there:
```
curl https://start.spring.io/starter.tgz \
  -d bootVersion=2.3.0.RELEASE \
  -d dependencies=web \
  -d baseDir=gae-standard-example | tar -xzvf -
```
- Once built from the command line, run gcloud init and then gcloud app deploy.
Deploy Spring boot app in GKE
Reference : https://codelabs.developers.google.com/codelabs/cloud-springboot-kubernetes#0
Created a Spring Boot app, imported it into STS, containerized it using Jib, pushed it to Google Container Registry, created a Kubernetes cluster, deployed the app, exposed it through a load balancer, and then modified it and redeployed the new version.
Except for the code changes in STS, all other steps were done through the command line.
NOTE: Had to install kubectl refer : https://kubernetes.io/docs/tasks/tools/install-kubectl-windows/
Follow : https://codelabs.developers.google.com/codelabs/cloud-springboot-kubernetes#1
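A sketch of the CLI steps the codelab walks through (cluster name and zone are hypothetical; the image is the one built with Jib as described below):
```
gcloud container clusters create hello-java-cluster --zone=us-central1-c --num-nodes=2
kubectl create deployment hello-java --image=gcr.io/play-with-gke-vidyan/hello-java:v1
kubectl expose deployment hello-java --type=LoadBalancer --port=8080
kubectl get service hello-java   # wait for an EXTERNAL-IP to appear
```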
Code was imported into STS and changes made there, but the steps were followed using the CLI.
Hit these issues:
- $ ./mvnw -DskipTests com.google.cloud.tools:jib-maven-plugin:build -Dimage=gcr.io/$GOOGLE_CLOUD_PROJECT/hello-java:v1
  $GOOGLE_CLOUD_PROJECT was not set, so I hardcoded it to play-with-gke-vidyan, like:
  mvnw -DskipTests com.google.cloud.tools:jib-maven-plugin:build -Dimage=gcr.io/play-with-gke-vidyan/hello-java:v1
- The build threw an error: [ERROR] {"errors":[{"code":"UNAUTHORIZED","message":"You don't have the needed permissions to perform this operation, and you may have invalid credentials. To authenticate your request, follow the steps in: https://cloud.google.com/container-registry/docs/advanced-authentication"}]}
  Fix: ran gcloud auth configure-docker. Reference TROUBLESHOOTING GUIDE: https://cloud.google.com/container-registry/docs/advanced-authentication
  Then the build worked: C:\Users\USER\git\gs-spring-boot\complete>mvnw -DskipTests com.google.cloud.tools:jib-maven-plugin:build -Dimage=gcr.io/play-with-gke-vidyan/hello-java:v1
TODO: Try above using https://github.com/vidyasekaran/aws-eks-kubernetes-masterclass
Starting point : https://github.com/stacksimplify/kubernetes-fundamentals/tree/master/09-Deployments-with-YAML
Deploy, scale, and update your website with Google Kubernetes Engine
https://codelabs.developers.google.com/codelabs/cloud-deploy-website-on-gke#0 https://kubernetes.io/docs/concepts/workloads/
Continuous deployment to Google Kubernetes Engine (GKE) with Cloud Build (TODO)
https://codelabs.developers.google.com/codelabs/cloud-builder-gke-continuous-deploy#0
Create a Continuous Integration Pipeline with Jenkins and Google Kubernetes Engine (TODO)
https://docs.bitnami.com/tutorials/create-ci-cd-pipeline-jenkins-gke/
How to Push Docker Image to Google Container Registry (GCR) through a Jenkins Job (TODO)
Access files in Cloud Storage with the Spring Resource abstraction
Tried the tutorial below and was able to access files in Cloud Storage.
Access files in Cloud Storage with the Spring Resource abstraction
Source Code :
Spring Boot Maven Plugin Documentation
https://docs.spring.io/spring-boot/docs/2.5.4/maven-plugin/reference/htmlsingle/#build-image
Complete GCP Code
https://github.com/GoogleCloudPlatform/spring-cloud-gcp
Deploy a website with Cloud Run (CodeLabs)
I was able to build the site and deploy it to Cloud Run.
https://codelabs.developers.google.com/codelabs/cloud-run-deploy#4
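A minimal sketch of the build-and-deploy step (service and image names are hypothetical; the codelab's own commands may differ):
```
gcloud builds submit --tag gcr.io/my-project-id/monolith:1.0.0 .
gcloud run deploy monolith --image=gcr.io/my-project-id/monolith:1.0.0 \
  --region=us-central1 --platform=managed --allow-unauthenticated
```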
Think and design series with Mahesh
Problem 1
You want to connect to BigQuery from a Compute Engine VM which runs Python code to read data from a specific table. You will run this query on an ad hoc basis. You need to build a cost-effective solution that follows Google's recommendations with respect to Cloud IAM.
Solution: A preemptible VM with a custom service account that has the BigQuery Job User role.
- Create the service account: IAM > Service Accounts > Create Service Account > myserviceaccount > Continue > grant the BigQuery role > Continue > Done.
- Create a Compute Engine instance: enable the API, select the machine type, under "Identity and API access" select "my service account", and under Management switch preemptibility on so the VM becomes preemptible. To disable the external IP (for cost effectiveness), edit the VM, go to Networking, and select None for External IP.
- SSH into the box and run the BigQuery client-library quickstart (see the NOTE and the gcloud sketch below).
NOTE: Quickstart: Using client libraries
https://www.youtube.com/watch?v=Wpek6iluY4I
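A minimal gcloud sketch of the solution (project, account and VM names are hypothetical; note that reading the table also needs a data role such as BigQuery Data Viewer on the dataset):
```
gcloud iam service-accounts create bq-reader-sa
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:bq-reader-sa@my-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"
gcloud compute instances create bq-adhoc-vm --zone=us-central1-a \
  --machine-type=e2-small --preemptible --no-address \
  --service-account=bq-reader-sa@my-project-id.iam.gserviceaccount.com \
  --scopes=cloud-platform
```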
Problem 2
Employees are using Compute and Storage in GCP on weekends; how do you stop this?
Solution: Go to IAM, select the user, edit, and add conditions such as day of week after Sunday and before Saturday, and provide a time range.
SpringBoot Apps development
Cache data from a Spring Boot app with Memorystore
Connect a Spring Boot app to Cloud SQL
Retrieving Credentials/Secrets from Secret Manager with Spring Boot https://codelabs.developers.google.com/codelabs/cloud-spring-cloud-gcp-secret-manager?hl=en&continue=https%3A%2F%2Fcodelabs.developers.google.com%2Fspring%3Fhl%3Den#4
Labs Completed
Lab : Create Kubernetes Cluster thru GCP Console
Lab : Create Kubernetes Cluster using gcloud and kubectl commands
Lab : Deploy an nginx server to the Kubernetes cluster and expose it with a load balancer.
Lab : Deploy hello-world-rest-api and expose it via a LoadBalancer using kubectl
Lab : Write a cloud function triggered via HTTP
Lab : Write a cloud function triggered by Cloud Storage file creation
Lab : Create a Cloud Storage bucket and set a notification so that file creation sends a message to a Pub/Sub topic.
Lab : Create a Pub/Sub topic and a cloud function that is invoked when a message is published to the topic.
Lab : Create a Cloud Scheduler job which writes a message to a Pub/Sub topic every minute.
Lab : Create a Cloud Scheduler which invokes a cloud function every minute.
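A minimal gcloud sketch covering the Cloud Functions and Cloud Scheduler labs above (function, topic and runtime choices are hypothetical; the function source is assumed to be in the current directory):
```
gcloud functions deploy hello-http --runtime=python39 --trigger-http --allow-unauthenticated
gcloud functions deploy on-message --runtime=python39 --trigger-topic=demo-topic
gcloud scheduler jobs create pubsub every-minute \
  --schedule="* * * * *" --topic=demo-topic --message-body="tick"
```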
Encryption using KMS
Lab: Attaching a boot disk encrypted with CMEK to a new VM
Lab: Attaching a CMEK to a new Cloud Storage bucket
Cloud Storage
Lab: Create a Cloud Storage bucket and store objects.
Lab: Add a rule to delete objects after 30 days.
Lab : Add a data retention policy and lock it.
Lab : Transfer data online using gsutil to Cloud Storage.
Lab : Allow a user limited-time access to your object using a Signed URL.
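A minimal gsutil sketch of the lifecycle and signed-URL labs (bucket, object and key-file names are hypothetical; signurl needs a service-account key and pyopenssl):
```
cat > lifecycle.json <<'EOF'
{"rule": [{"action": {"type": "Delete"}, "condition": {"age": 30}}]}
EOF
gsutil lifecycle set lifecycle.json gs://my-demo-bucket

gsutil signurl -d 10m sa-key.json gs://my-demo-bucket/report.pdf
```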
Pub/Sub
Lab: Create a topic, a publisher and 2 subscribers; publish a message and receive it in both subscribers.
Cloud DataFlow
Lab: Create a data-processing pipeline using Cloud Dataflow with the help of the WordCount example.
GKE
- Lab : Create a cluster and deploy a microservice
- Lab : Create an IAM user and assign predefined roles to them
- Lab : Create a custom role and assign it to an IAM member
- Lab : Create a service account in GCP with predefined roles
- Lab : How to control and view audit logs in GCP
- Lab : Create a Compute Engine instance, install a webserver via the startup script, and SSH into it
- Lab : Simplify VM creation with instance templates
- Lab : Create a VM - reduce launch time with a custom image
- Lab : Create a VM using an instance template with a custom image
- Lab : Create a Managed Instance Group and an HTTP load balancer in front of the MIG
Lab : Create a cloud bucket and write a cloud function triggered on file creation that logs some details. Upload some files and check the logs.
Lab : Use Cloud Logging - Logs Explorer - filter the logs by searching for entries based on the log information written by the Cloud Function.
1. You can filter the logs based on Resource, Log Name and Severity.
2. From a log entry, click the 3 dots to filter on matching entries or a matching substring.
3. Click on each log entry to see details about which resource logged it, its severity, the textPayload, the timestamp, etc.
4. Check Logs Storage (required, default) and the Logs Router, which routes logs based on filters.
5. You can use the Logs Router to add sinks and build inclusion filters to send logs to a destination, for example a Cloud Storage bucket or Pub/Sub.
Lab : Explore Cloud Monitoring - Dashboard, Alerting, Uptime check, Metrics explorer
Lab :Deploy a sample application to a Google Kubernetes Engine (GKE) cluster. Create a trace by sending an HTTP request to the sample application. Use the Cloud Trace interface to view the latency information of the trace you created.
https://cloud.google.com/trace/docs/quickstart https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/trace/cloud-trace-demo-app/app/app.py
Lab : Create a Managed Instance Group and perform canary testing by using a rolling update to deploy a new version of the software.
Lab : Deploy a VM instance using Deployment Manager.
Lab : Use BigQuery Public Dataset and view data from cloud console. https://www.freecodecamp.org/news/google-bigquery-beginners-guide/
Lab : Create your own dataset, create a table schema, and query data in BigQuery tables from the Cloud Console. Use the bq utility to select data:
bq query --nouse_legacy_sql 'SELECT firstname,lastname FROM vidya_practise_bq_dataset1.address_table'
Lab : Configure firewall rules to allow SSH into a virtual machine.
Lab : Configure a firewall rule to allow only a bastion host to SSH into a private machine in a subnet (see the sketch after this list).
Lab : Create a custom VPC with 2 subnets (subnet-a, subnet-b); create 2 VMs, one private in subnet-b and a bastion host in subnet-a; set up a firewall rule to allow SSH from the bastion host to the private machine.
Lab : Configure a NAT gateway to allow the private VM to connect to the Internet.
Lab : Set up a Cloud SQL MySQL DB and connect to it from Cloud Shell.
Lab : Clone a Spring Boot JdbcTemplate app to connect to the Cloud SQL MySQL user DB and perform a select operation.
Lab : Clone a Spring Boot JPA app to connect to the Cloud SQL MySQL myuserdb and perform insert and select on the house table.
Lab : Store the database password in Secret Manager and use it from the Java service.
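A minimal gcloud sketch of the bastion-host firewall rules (VPC, tag and range names are hypothetical):
```
gcloud compute firewall-rules create allow-ssh-bastion \
  --network=my-vpc --allow=tcp:22 \
  --source-ranges=0.0.0.0/0 --target-tags=bastion
gcloud compute firewall-rules create allow-ssh-from-bastion \
  --network=my-vpc --allow=tcp:22 \
  --source-tags=bastion --target-tags=private-vm
```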
Lab : Create an App Engine application and deploy a service in it.
Lab : Clone a Java Spring Boot app and run it via App Engine flexible.
Lab : Create a Cloud Storage bucket and set a notification on a topic so that objects created in Cloud Storage are published to a Pub/Sub topic.
Lab : How to subscribe to Pub/Sub messages and send them to another topic in a Spring Boot app.
Lab : Clone a Java web API and run it in Cloud Shell.
Lab : Convert the Java web API into a Docker application and run it in Cloud Shell.
Lab : Run a Java web service app in Cloud Run.
Lab : Using Cloud Datastore - create an entity and select data via the Cloud Console.
Lab : Create a Kind in Cloud Datastore and perform CRUD operations via the Java API.
11/2
- Lab : Bigtable instance creation
- Lab : Create a Bigtable application profile.
- Lab : Use cbt commands to create a cluster, instance and table, and play with those commands https://cloud.google.com/bigtable/docs/cbt-reference
- Lab : Use a Dataflow job to process and import a CSV file into a Cloud Bigtable table.
```
cbt createtable vidya-practise-table1 "families=csv"   # create the table
cbt ls vidya-practise-table1                           # verify the table was created
```
Ref : https://cloud.google.com/community/tutorials/cbt-import-csv
```
git clone https://github.com/GoogleCloudPlatform/cloud-bigtable-examples.git
cd cloud-bigtable-examples/java/dataflow-connector-examples/
```
Start the Dataflow job to process the CSV concurrently and perform writes at a large scale to the table:
```
mvn package exec:exec -DCsvImport -Dbigtable.projectID=cloud-dna-mas \
  -Dbigtable.instanceID=vidya-bt-practise -DinputFile="gs://vidya-practise4/data.csv" \
  -Dbigtable.table="vidya-practise-table1" -Dheaders="rowkey,a,b"

cbt read vidya-practise-table1 count=5   # verify the data got inserted
```
Ref: https://github.com/GoogleCloudPlatform/cloud-bigtable-examples
- Lab : Creating and working on a Datalab instance using a Jupyter notebook, and playing with %gcs commands to interact with Cloud Storage.
```
%gcs list                                                                # display all bucket names
%gcs view -n 10 -o gs://bucket-name/file-name.json
%gcs read --object gs://bucket-name/file-name.json --variable babynames  # read the file into the variable babynames
print(babynames)
```
```
sudo apt-get install google-cloud-sdk-datalab
datalab create training-data-analyst
# connect to the previously created instance
datalab connect training-data-analyst
```
- Lab : Importing data from Cloud Storage into a BigQuery table using a Jupyter notebook, pandas and Python code.
Perform the operations below and open Datalab on port 8081:
```
sudo apt-get install google-cloud-sdk-datalab
datalab create training-data-analyst
# connect to the previously created instance
datalab connect training-data-analyst
```
Importing Data from Cloud Storage
```python
from google.datalab import Context
import google.datalab.bigquery as bq
import google.datalab.storage as storage
import pandas as pd
try:
    from StringIO import StringIO
except ImportError:
    from io import BytesIO as StringIO
```
%gcs read --object gs://cloud-datalab-samples/cars.csv --variable cars
Create the schema, conveniently using a DataFrame example.
```python
df = pd.read_csv(StringIO(cars))
schema = bq.Schema.from_data(df)

# Create the dataset
bq.Dataset('importingsample').create()

# Create the table
sample_table = bq.Table('importingsample.cars').create(schema=schema, overwrite=True)
sample_table.load('gs://cloud-datalab-samples/cars.csv', mode='append',
                  source_format='csv',
                  csv_options=bq.CSVOptions(skip_leading_rows=1))
```
Check the data is present in the BQ table importingsample.cars:
```
%%bq query -n importingSample
SELECT * FROM importingsample.cars
```
```
%%bq execute -q importingSample
```
Importing Data from a DataFrame
```python
cars2 = storage.Object('cloud-datalab-samples', 'cars2.csv').read_stream()
df2 = pd.read_csv(StringIO(cars2))
df2.fillna(value='', inplace=True)
sample_table.insert(df2)
sample_table.to_dataframe()
```
- Lab : Exporting data from a BigQuery table to Cloud Storage using a Jupyter notebook, pandas and Python code.
Perform the operations below and open Datalab on port 8081:
```
sudo apt-get install google-cloud-sdk-datalab
datalab create training-data-analyst
# connect to the previously created instance
datalab connect training-data-analyst
```
Exporting Data to Cloud Storage
```python
from google.datalab import Context
import google.datalab.bigquery as bq
import google.datalab.storage as storage
import pandas as pd
try:
    from StringIO import StringIO
except ImportError:
    from io import BytesIO as StringIO

project = Context.default().project_id
sample_bucket_name = project + '-datalab-samples'
sample_bucket_path = 'gs://' + sample_bucket_name
sample_bucket_object = sample_bucket_path + '/tmp/cars.csv'
print('Bucket: ' + sample_bucket_name)
print('Object: ' + sample_bucket_object)

sample_bucket = storage.Bucket(sample_bucket_name)
sample_bucket.create()
sample_bucket.exists()

table = bq.Table('importingsample.cars')
table.extract(destination=sample_bucket_object)
```
%gcs list --objects gs://$sample_bucket_name/*
```python
bucket = storage.Bucket(sample_bucket_name)
obj = list(bucket.objects())[0]
data = obj.read_stream()
```
Exporting Data to a Local File
```python
table.to_file('/tmp/cars.csv')
lines = None
with open('/tmp/cars.csv') as datafile:
    lines = datafile.readlines()
print(''.join(lines))
```
Clean up
```python
sample_bucket.object('tmp/cars.csv').delete()
sample_bucket.delete()
bq.Dataset('importingsample').delete(delete_contents=True)
```
- Lab : Create a service account, generate a key file, and use that service account to put objects in a Cloud Storage bucket.
Create a service account with the Storage Admin role, generate a key, store it locally, and use it in the commands below:
```
gcloud auth activate-service-account --key-file creds.json
gcloud init   # (1, new service account,
```
Ref: Install gcloud - https://snapcraft.io/install/google-cloud-sdk/debian
Download a file:
curl http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/ClusterSetup.html > setup.html
cp setup.html setup2.html
cp setup.html setup3.html
Copy this file to the bucket and check its ACL:
gsutil cp setup.html gs://iaptest-328809
gsutil acl get gs://iaptest-328809/setup.html > acl.txt
cat acl.txt
Let's restrict it:
gsutil acl set private gs://iaptest-328809/setup.html
Check access:
gsutil acl get gs://iaptest-328809/setup.html > acl2.txt
cat acl2.txt - we now have only the owner role
Let's loosen up and provide read access to all users:
gsutil acl ch -u AllUsers:R gs://iaptest-328809/setup.html
Check access now:
gsutil acl get gs://iaptest-328809/setup.html > acl3.txt
cat acl3.txt - now all users have read permission.
Download it:
gsutil cp gs://iaptest-328809/setup.html setup.html
Now go to IAM > Service Accounts, disable the service account, and try to copy a file; you will not be able to do so and will get an error:
```
gsutil cp setup.html gs://iaptest-328809
Copying file://setup2.html [Content-Type=text/html]...
Your credentials are invalid. Please run
$ gcloud auth login
```