Week13 - Selesfia/ComputerNetwork GitHub Wiki

Cloud Run Function

Create Cloud Run Function

  1. Create function -> Environment "Cloud Run function (1st gen)" -> Give a name "function-1" -> Region "asia-east1" -> Trigger type "HTTP" -> Authentication "Allow unauthenticated invocations" -> Save -> Next
  2. Runtime "python 3.12" -> Deploy
  3. Copy CLI test command -> Open Cloud Shell -> Paste the command
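
For reference, the default code deployed in step 2 is a simple HTTP handler along these lines (a rough sketch of the console boilerplate; the generated code may differ slightly):

import functions_framework

@functions_framework.http
def hello_http(request):
    # Read an optional "name" from the JSON body or the query string
    request_json = request.get_json(silent=True)
    request_args = request.args
    if request_json and "name" in request_json:
        name = request_json["name"]
    elif request_args and "name" in request_args:
        name = request_args["name"]
    else:
        name = "World"
    return f"Hello {name}!"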

  4. Still in Cloud Shell, create a working directory
mkdir test_iris
cd test_iris/
  5. Go to the gcp_serverless_ml GitHub repository, copy the URL, and clone it

git clone https://github.com/saedhussain/gcp_serverless_ml.git
cd gcp_serverless_ml/
cd Iris_model/
cd data
cat Iris_data.csv #to view the csv data
cd ..
  6. Click Open Editor -> Go to create_iris_model.py and change the model name to "iris_model.pkl" -> Save file

  7. Activate Cloud Shell and run the training script
cd test_iris/
cd gcp_serverless_ml/
cd Iris_model/
cat create_iris_model.py
python create_iris_model.py

If you get import errors, install the missing packages: pip install numpy, pip install pandas, and pip install scikit-learn.
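
For context, create_iris_model.py roughly trains a classifier on the CSV and pickles it; a simplified sketch (not the repo's exact code; the column layout and model choice are assumptions):

import pickle
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Load the Iris data shipped with the repo (path assumed)
df = pd.read_csv("data/Iris_data.csv")

# First four columns are the measurements, last column is the species label (assumed layout)
X = df.iloc[:, :4]
y = df.iloc[:, -1]

# Train a simple classifier and save it under the filename used in the later steps
model = DecisionTreeClassifier()
model.fit(X, y)

with open("iris_model.pkl", "wb") as f:
    pickle.dump(model, f)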

  8. Download the trained model: type pwd to get the directory path, click the three dots -> Download -> paste the path (add iris_model.pkl at the end of the path)

  9. Put the file on your desktop
  10. Go to Cloud Storage -> Bucket -> Create -> Give a name "mybkt-iris" -> Region "asia-east1" -> Create -> Upload iris_model.pkl
  11. Go to Cloud Functions -> Create function -> Environment "Cloud Run function (1st gen)" -> Give a name "iris_prediction" -> Region "asia-east1" -> Trigger type "HTTP" -> Authentication "Allow unauthenticated invocations" -> Save -> Next
  12. Runtime "python 3.12" -> Go back to the GitHub webpage -> Iris_http_cloud_func -> main.py -> Copy raw file -> Go back to the Cloud Run main.py and paste it -> Go back to the GitHub webpage -> Iris_http_cloud_func -> requirements.txt -> Copy -> Go back to the Cloud Run requirements.txt and paste it
  13. Go back to main.py -> Change the model bucket details (below) -> Change Entry point to "iris_predict"
BUCKET_NAME        = "mybkt-iris"
PROJECT_ID         = "trim-mix-436602-e4"
GCS_MODEL_FILE     = "iris_model.pkl"
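
For orientation, the main.py copied from the repo uses those constants to pull the model from the bucket and serve predictions; a simplified sketch of the idea (not the repo's exact code):

import pickle
from google.cloud import storage

BUCKET_NAME    = "mybkt-iris"
PROJECT_ID     = "trim-mix-436602-e4"
GCS_MODEL_FILE = "iris_model.pkl"

# Download the pickled model from the bucket once, at cold start
client = storage.Client(project=PROJECT_ID)
blob = client.bucket(BUCKET_NAME).blob(GCS_MODEL_FILE)
blob.download_to_filename("/tmp/iris_model.pkl")
with open("/tmp/iris_model.pkl", "rb") as f:
    model = pickle.load(f)

def iris_predict(request):
    # Expects a JSON body like {"features": [2, 3, 4, 5]}
    payload = request.get_json(silent=True)
    prediction = model.predict([payload["features"]])
    return {"prediction": str(prediction[0])}
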
  14. Click Deploy -> Go to Testing -> Copy CLI test command -> Activate Cloud Shell and paste it -> Change the message to "features": [2,3,4,5]
curl -m 70 -X POST https://asia-east1-trim-mix-436602-e4.cloudfunctions.net/iris_prediction \
-H "Authorization: bearer $(gcloud auth print-identity-token)" \
-H "Content-Type: application/json" \
-d '{
    "features": [2,3,4,5]
}'

Input: [2,3,4,5] -> Output: Iris-virginica
Input: [5.1,3.5,1.4,0.2] -> Output: Iris-setosa
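The same test can also be run from Python instead of curl (a sketch; the URL is the one from the deploy step, and the identity token is minted with gcloud just like in the curl example):

import subprocess
import requests

URL = "https://asia-east1-trim-mix-436602-e4.cloudfunctions.net/iris_prediction"

# Mint an identity token the same way the curl command does
token = subprocess.check_output(
    ["gcloud", "auth", "print-identity-token"], text=True
).strip()

resp = requests.post(
    URL,
    headers={"Authorization": f"bearer {token}"},
    json={"features": [5.1, 3.5, 1.4, 0.2]},
    timeout=70,
)
print(resp.status_code, resp.text)  # expected output for this input: Iris-setosa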

Move Large Files from GCS Bucket using Cloud Function

  1. Go to storage -> Bucket -> Create -> Give a name "bkt-ori" -> Region "asia-east1" -> Create
  2. Go to storage -> Bucket -> Create -> Give a name "bkt-large" -> Region "asia-east1" -> Create
  3. Create function -> Environment "Cloud Run function (1st gen)" -> Give a name "move-files" -> Region "asia-east1" -> Trigger type "Cloud Storage" -> Event Type "On (finalizing/creating)" -> Bucket "bkt-ori" -> Save -> Next
  4. Runtime "python 3.12" -> Paste the following text into requirements.txt
functions-framework==3.*
google-cloud-storage
google-cloud
  5. Paste the following text into main.py
import functions_framework
from google.cloud import storage
from google.cloud.storage import Blob


# Triggered by a change in a storage bucket
@functions_framework.cloud_event
def hello_gcs(cloud_event):

    data = cloud_event.data

    event_id = cloud_event["id"]
    event_type = cloud_event["type"]

    bucket = data["bucket"]
    name = data["name"]
    metageneration = data["metageneration"]
    timeCreated = data["timeCreated"]
    updated = data["updated"]
    
    print("="*30)
    print(f"Event ID: {event_id}")
    print(f"Event type: {event_type}")
    print(f"Bucket: {bucket}")
    print(f"File: {name}")
    print(f"Metageneration: {metageneration}")
    print(f"Created: {timeCreated}")
    print(f"Updated: {updated}")
    print(f"Processing file: {name}.")
    storage_client = storage.Client(project='mygcp-436602')
    source_bucket=storage_client.get_bucket('mybkt-src')
    destination_bucket=storage_client.get_bucket('mybkt-dst') 
    blobs=list(source_bucket.list_blobs(prefix=''))
    print(blobs)
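    # Move the newly created object to the destination bucket if it is larger than 1 MB (1000000 bytes)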
    for blob in blobs:
        if blob.size > 1000000 and blob.name == name:
            source_blob = source_bucket.blob(blob.name)
            new_blob = source_bucket.copy_blob(source_blob, destination_bucket, blob.name)
            blob.delete(if_generation_match=None)
            print(f'File moved from {source_blob} to {new_blob}')
        else:
            print("File size is below 1MB")
  6. Set the entry point to "hello_gcs" -> Change the project ID and bucket names to your own:
storage_client = storage.Client(project='trim-mix-436602-e4')
source_bucket=storage_client.get_bucket('bkt-ori')
destination_bucket=storage_client.get_bucket('bkt-large')
  7. Deploy
  8. Go back to Bucket -> Go to bkt-ori -> Upload iris_model.pkl
  9. Go back to the Cloud Run function -> Refresh
  10. To test the function, upload a random file to bkt-ori; if the file is larger than 1 MB (the 1000000-byte threshold in the code), it will be moved to bkt-large, otherwise it stays in bkt-ori
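
One way to generate and upload a test file larger than 1 MB directly from Cloud Shell, using Python (a sketch; the project ID and bucket names are the ones created above):

import os
from google.cloud import storage

# Write a ~2 MB dummy file locally
with open("big_test_file.bin", "wb") as f:
    f.write(os.urandom(2 * 1024 * 1024))

# Upload it to the source bucket; the Cloud Storage trigger should then move it to bkt-large
client = storage.Client(project="trim-mix-436602-e4")
client.bucket("bkt-ori").blob("big_test_file.bin").upload_from_filename("big_test_file.bin")
print("Uploaded; check bkt-large and the function logs after a few seconds.")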

Connecting to Cloud SQL with Cloud Functions Using CLI

  1. On your desktop, create a notepad file -> Paste the following text
gcloud sql instances create mydb \
--database-version=MYSQL_5_7 \
--cpu=2 --memory=4GB --root-password=admin1234 \
--assign-ip \
--zone=us-central1-a \
--availability-type=zonal \
--no-backup
  2. Activate Google Cloud Shell -> Paste the text
  3. Go to SQL to check whether your database instance has been created
  4. Back in Cloud Shell, create a database on the instance: gcloud sql databases create demo-db --instance=mydb (or -i mydb for short)
  5. gcloud sql connect mydb --user=root -> Enter the root password "admin1234"
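
To go one step further and query the database from a Cloud Function, one common approach is the Cloud SQL Python Connector; a minimal sketch, assuming the instance connection name is trim-mix-436602-e4:us-central1:mydb and requirements.txt includes functions-framework and cloud-sql-python-connector[pymysql]:

import functions_framework
from google.cloud.sql.connector import Connector

connector = Connector()

@functions_framework.http
def query_db(request):
    # Instance connection name format is PROJECT:REGION:INSTANCE (values assumed from the steps above)
    conn = connector.connect(
        "trim-mix-436602-e4:us-central1:mydb",
        "pymysql",
        user="root",
        password="admin1234",
        db="demo-db",
    )
    with conn.cursor() as cursor:
        cursor.execute("SELECT NOW()")
        (now,) = cursor.fetchone()
    conn.close()
    return f"Database time: {now}"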

PS: Remember to delete the resources that you created if you don't use them anymore.

03/12/2024