1.7 Using Cloud Memorystore (Redis)

To be able to follow this chapter you must already have completed the database setup from the previous one. If you haven't, do that first.

Cloud Memorystore

Reading data from a database must be 95% of all software development and, more or less, the remaining 5% is about caching it so that you don’t have to read it that often!

Jokes aside, caching can be a very elegant solution that gives you fantastic performance and also allows you to spend less money on high database performance.

Anyone who has implemented caching solutions in the past knows that this can be a cumbersome thing.

GCP comes with a built-in Redis service called Memorystore, which is very easy to configure with respect to resources and high availability.

Continue from the previous step

Start by cleaning away the deployments of the previous step BUT keep the database. We're going to continue using it in this part.
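
If you deployed the previous step with kustomize, one way to clean it up from the command line could look roughly like the sketch below. The folder name is just a placeholder for whatever you deployed from, and note that this only removes the Kubernetes objects; the managed Cloud SQL instance is not part of the cluster and is left untouched.

```sh
# Sketch only: remove the Kubernetes resources created in the previous step.
# <previous-step-folder> is a placeholder for the folder you deployed from.
cd <previous-step-folder>/k8s

# Delete everything that the kustomize config created (deployment, service, ...).
kustomize build . | kubectl delete -f -
```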

Create a Redis cache

The Redis product goes under the alias Memorystore in GCP and you'll find that section in the web console menu not too far away from SQL.

If you haven't used Memorystore before, the console might ask you to Enable the API. If so, do it!
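
If you prefer the command line, enabling the API should also be possible with gcloud (assuming the SDK is installed and pointed at your project):

```sh
# Enable the Memorystore for Redis API for the current project.
gcloud services enable redis.googleapis.com
```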

Once the API is enabled you'll see an overview screen much like the one for SQL, and at the top of the screen you can Create instance. Let's do it.

There are far fewer options for Redis caches than for SQL instances, but let's walk through them. A gcloud equivalent of the complete setup is sketched after the last setting below.

Instance ID and Display name

Well it's up to you but why not pingpong-cache?

Tier

Make sure to use Basic. As with the databases, Standard means high availability; it is recommended in some (not all) cases for production use and comes with a greater price tag.

Region and Zone

Same as with the database, europe-west3 and europe-west3-a respectively.

Capacity and version

The smallest available capacity (1 GB) is good enough for us, and version 4.0 is great.

Authorised network

Here you select pingpong-site1-net, just as with the database. This attaches the cache to your services' network, making it easy to access from there, just like the database.
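
For reference, the instance configured above could also be created from the command line. The sketch below simply translates the console settings into gcloud flags; double-check the values against your own choices before running it.

```sh
# Create the cache with the settings chosen in the console: Basic tier, 1 GB,
# Redis 4.0, europe-west3/europe-west3-a, attached to pingpong-site1-net.
gcloud redis instances create pingpong-cache \
  --display-name="pingpong-cache" \
  --tier=basic \
  --size=1 \
  --region=europe-west3 \
  --zone=europe-west3-a \
  --redis-version=redis_4_0 \
  --network=pingpong-site1-net

# Print the private IP of the cache; you will need it further down.
gcloud redis instances describe pingpong-cache \
  --region=europe-west3 \
  --format='value(host)'
```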

Get connected

As in the previous chapter we are going to abandon the previous version of pingpong-service and move on to a new one. Navigate into the pingpong-service-postgres-redis folder of the project; that is where we want to be from now on.

As you may have guessed, this version is pretty much like the previous one, but rather than fetching a new meme from the database on every request the service has a new request parameter, cache. When it is set to true, a meme is cached once it has been fetched and kept in the cache for 10 seconds, so any subsequent requests within those 10 seconds get the same meme. After that a new meme is fetched, and so on...

Configure the database and cache IPs in the k8s config

The k8s folder of the project now contains three folders (plus the kustomization and namespace files):

```
-rw-r--r--  1 user  group  161 Feb 23 21:54 kustomization.yaml
drwxr-xr-x  5 user  group  160 Feb 23 22:38 managed-postgres
drwxr-xr-x  5 user  group  160 Feb 23 22:38 managed-redis
-rw-r--r--  1 user  group   57 Feb 15 16:36 namespace.yaml
drwxr-xr-x  5 user  group  160 Feb 23 22:38 single-exposed-deployment
```

Both the managed-postgres and the managed-redis folders contain an endpoints.yaml. You need to do the same IP replacement in both, just as you did for the database in the previous chapter.

managed-postgres/endpoints.yaml:

```yaml
kind: Endpoints
apiVersion: v1
metadata:
  name: pingpong-database-service
subsets:
  - addresses:
      - ip: <YOUR-DB-IP>
    ports:
      - port: 5432
```

managed-redis/endpoints.yaml:

```yaml
kind: Endpoints
apiVersion: v1
metadata:
  name: pingpong-cache-service
subsets:
  - addresses:
      - ip: <YOUR-CACHE-IP>
    ports:
      - port: 6379
```
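
If you don't have the two IP addresses at hand, they can be looked up with gcloud. The SQL instance name below is just a placeholder for whatever you called your database in the previous chapter, and the address index assumes the first listed address is the one you used there:

```sh
# Private IP of the Redis cache (goes into managed-redis/endpoints.yaml).
gcloud redis instances describe pingpong-cache \
  --region=europe-west3 \
  --format='value(host)'

# IP of the Cloud SQL instance (goes into managed-postgres/endpoints.yaml).
gcloud sql instances describe <your-sql-instance> \
  --format='value(ipAddresses[0].ipAddress)'
```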

Once this is done you can run kustomize again from the k8s folder. Check the external IP of your load balancer with a get services query and then navigate in your browser to:

http://<YOUR-LOADBALANCER-IP>:8080/ping?name=John&meme=true&cache=true
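
If you want the exact commands for this, a sketch (assuming you apply the kustomize output with kubectl and substitute the namespace defined in namespace.yaml) could look like this:

```sh
# From the pingpong-service-postgres-redis/k8s folder:
kustomize build . | kubectl apply -f -
# (kubectl apply -k . does the same thing on reasonably recent kubectl versions.)

# Look up the external IP of the load balancer; replace <your-namespace>
# with the namespace defined in namespace.yaml.
kubectl get services -n <your-namespace>
```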

You will see no immediate difference, but if you refresh your browser a few times you will notice that you receive the same meme over and over again until about 10 seconds have passed; then you get a new meme, and so on... Just as intended.
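
The same behaviour can be observed from the command line; a rough sketch, with the load balancer IP filled in, could be:

```sh
# Call the endpoint every 2 seconds for ~30 seconds; the meme in the response
# should stay the same for roughly 10 seconds at a time and then change.
for i in $(seq 1 15); do
  curl -s "http://<YOUR-LOADBALANCER-IP>:8080/ping?name=John&meme=true&cache=true"
  echo
  sleep 2
done
```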

Congrats! You've successfully combined a microservice with a database and a data cache. That's another real-world scenario right there!
