# gcp logging
```shell
gcloud logging logs list                     # View the available logs on the system
gcloud logging logs list --filter="compute"  # View the logs that relate to compute resources
gcloud logging read "resource.type=gce_instance" --limit 5  # Read the logs for the gce_instance resource type
gcloud logging read "resource.type=gce_instance AND labels.instance_name='gcelab2'" --limit 5  # Read the logs for a specific virtual machine
```
https://cloud.google.com/logging/docs/view/query-library-preview?hl=it#sql-filters

Example filter for SELECT statements against a Cloud SQL instance:

```
resource.type="cloudsql_database"
resource.labels.database_id="<<project-name>>:<<instance-name>>"
textPayload:"select"
```
Important: Audit logs are written temporarily to the instance's disk, consuming disk space until they are sent to Cloud Logging.

The pgaudit extension lets you selectively record and track SQL operations performed against a given database instance, providing auditing capabilities for a chosen subset of operations.

Enable the pgaudit flags (https://cloud.google.com/sql/docs/postgres/pg-audit):
```shell
INSTANCE_NAME=sql-playground
gcloud sql instances patch $INSTANCE_NAME --database-flags cloudsql.enable_pgaudit=on
gcloud sql connect $INSTANCE_NAME
```
Once inside the DB, run this:

```sql
CREATE EXTENSION pgaudit;
```
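Before enabling full statement logging, you can confirm the extension is actually installed; `pg_extension` is a standard PostgreSQL catalog table:

```sql
-- Verify that pgaudit is installed and check its version
SELECT extname, extversion FROM pg_extension WHERE extname = 'pgaudit';
```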
Then back in gcloud land:

```shell
gcloud sql instances patch $INSTANCE_NAME --database-flags \
  cloudsql.enable_pgaudit=on,pgaudit.log=all
```
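To verify that both flags took effect, you can read them back from the instance settings (the `settings.databaseFlags` path comes from the Cloud SQL Admin API resource):

```shell
# Show the database flags currently set on the instance
gcloud sql instances describe $INSTANCE_NAME \
  --format="value(settings.databaseFlags)"
```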
resource.type="cloudsql_database"
logName="projects/<your-project-name>/logs/cloudaudit.googleapis.com%2Fdata_access"
protoPayload.request.@type="type.googleapis.com/google.cloud.sql.audit.v1.PgAuditEntry"
Disabling auditing:

```shell
gcloud sql instances patch $INSTANCE_NAME --database-flags \
  cloudsql.enable_pgaudit=off
```

and, inside the database:

```sql
DROP EXTENSION pgaudit;
```
- https://cloud.google.com/bigquery/docs/reference/auditlogs
- https://cloud.google.com/logging/docs/export/configure_export_v2
Sinks control how Cloud Logging routes logs. Using sinks, you can route some or all of your logs to supported destinations.
Cloud Logging provides two predefined sinks for each Google Cloud project, billing account, folder, and organization: _Required and _Default. All logs that are generated in a resource are automatically processed through these two sinks and then are stored either in the correspondingly named _Required or _Default buckets.
Sinks act independently of each other.
Which log entries are routed by a sink is controlled by configuring the sink's inclusion filter and exclusion filters. Depending on the sink's configuration, every log entry received by Cloud Logging falls into one or more of these categories:
- Stored in Cloud Logging and not routed elsewhere.
- Stored in Cloud Logging and routed to a supported destination.
- Not stored in Cloud Logging but routed to a supported destination.
- Neither stored in Cloud Logging nor routed elsewhere.
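For example, to keep noisy entries out of the `_Default` bucket you can attach an exclusion filter to the predefined `_Default` sink; a sketch (the exclusion name is arbitrary):

```shell
# Keep DEBUG-level GCE logs out of the _Default bucket
gcloud logging sinks update _Default \
  --add-exclusion='name=exclude-gce-debug,filter=resource.type="gce_instance" AND severity<=DEBUG'
```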
You usually create sinks at the Google Cloud project level, but if you want to combine and route logs from the resources contained by a Google Cloud organization or folder, you can create aggregated sinks.
By using sinks, you can route some or all of your log entries to the following destinations:
- Cloud Logging bucket - A log bucket can store log entries that are received by multiple Google Cloud projects. The log bucket can be in the same project in which log entries originate, or in a different project. For information about viewing log entries stored in log buckets, see Query and view logs overview and View logs routed to Cloud Logging buckets.
- BigQuery dataset - The dataset can be in the same project in which log entries originate, or in a different project, and you can use big data analysis capabilities on the stored log entries. For information about viewing log entries routed to BigQuery, see View logs routed to BigQuery.
- Cloud Storage bucket - Log entries are stored as JSON files, providing inexpensive, long-term storage.
- Pub/Sub topic - Log entries are formatted into JSON and then routed to a Pub/Sub topic. For information about viewing log entries routed to Pub/Sub, see View logs routed to Pub/Sub.
- Google Cloud project - Route log entries to another Google Cloud project. In this configuration, the sinks in the destination project process the log entries.
```shell
gcloud logging sinks list
gcloud logging sinks describe SINK_NAME
gcloud logging sinks update SINK_NAME --disabled
gcloud logging sinks update SINK_NAME --no-disabled
gcloud logging sinks delete SINK_NAME
gcloud logging sinks create SINK_NAME SINK_DESTINATION
```
Destination path formats:

```
logging.googleapis.com/projects/DESTINATION_PROJECT_ID/locations/LOCATION/buckets/BUCKET_NAME
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
logging.googleapis.com/projects/DESTINATION_PROJECT_ID
storage.googleapis.com/BUCKET_NAME
pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID
```
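A minimal example using the Cloud Storage format (the bucket name is hypothetical and must already exist):

```shell
# Route warnings and above to a Cloud Storage bucket for archival
gcloud logging sinks create warnings-to-gcs \
  storage.googleapis.com/my-logs-archive-bucket \
  --log-filter='severity>=WARNING'
```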
Note: To be a valid sink destination, DATASET_ID must be write-enabled. Don't select a linked dataset; linked datasets are read-only.
Note that `gcloud logging sinks create` creates only the log sink; it does not create the destination dataset for you.
Example:

```shell
gcloud logging sinks create cloudsql_pgaudit_to_bigquery \
  bigquery.googleapis.com/projects/<PROJECT_ID>/datasets/<DATASET_NAME> \
  --log-filter='resource.type="cloudsql_database" AND
logName="projects/<your-project-name>/logs/cloudaudit.googleapis.com%2Fdata_access" AND
protoPayload.request.@type="type.googleapis.com/google.cloud.sql.audit.v1.PgAuditEntry"' \
  --description="Sink for Cloud SQL PGAudit logs to BigQuery"
```
The service account used by the sink should have minimal rights: roles/logging.logWriter and roles/bigquery.dataEditor are sufficient, provided they are bound to your log sink's service account in the same project. To show which service account a sink uses:

```shell
gcloud logging sinks describe SINK_NAME
```
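The sink writes through a generated service account (its writer identity), which must be granted access on the destination. A sketch of doing that at the project level (project ID is a placeholder):

```shell
# Look up the sink's writer identity
WRITER_IDENTITY=$(gcloud logging sinks describe cloudsql_pgaudit_to_bigquery \
  --format='value(writerIdentity)')

# Allow it to write rows into datasets in the destination project
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="$WRITER_IDENTITY" \
  --role="roles/bigquery.dataEditor"
```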
In BigQuery, logs routed from Cloud Logging arrive as nested JSON.
```sql
SELECT
  JSON_EXTRACT_SCALAR(jsonPayload, "$.statement") AS statement,
  JSON_EXTRACT_SCALAR(jsonPayload, "$.action") AS action,
  JSON_EXTRACT_SCALAR(jsonPayload, "$.user") AS user,
  timestamp
FROM
  `project_id.dataset_id.table_id`
WHERE
  JSON_EXTRACT_SCALAR(jsonPayload, "$.module") = "pgaudit"
```
To simplify analysis, build a SQL view that automatically maps the relevant fields onto columns.
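A sketch of such a view, reusing the extraction logic from the query above (dataset, table, and view names are placeholders; `TO_JSON_STRING` is used in case the sink produced `jsonPayload` as a nested RECORD rather than a JSON string):

```sql
-- Map the pgaudit fields onto named columns once, query them everywhere
CREATE OR REPLACE VIEW `project_id.dataset_id.pgaudit_events` AS
SELECT
  timestamp,
  JSON_EXTRACT_SCALAR(TO_JSON_STRING(jsonPayload), "$.statement") AS statement,
  JSON_EXTRACT_SCALAR(TO_JSON_STRING(jsonPayload), "$.action") AS action,
  JSON_EXTRACT_SCALAR(TO_JSON_STRING(jsonPayload), "$.user") AS db_user
FROM
  `project_id.dataset_id.table_id`
WHERE
  JSON_EXTRACT_SCALAR(TO_JSON_STRING(jsonPayload), "$.module") = "pgaudit";
```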
- https://cloud.google.com/logging/docs/logs-based-metrics/
- https://cloud.google.com/logging/docs/logs-based-metrics/counter-metrics
Log-based metrics derive metric data from the content of log entries. There are two kinds of log-based metrics:
- System-defined log-based metrics, provided by Cloud Logging for use by all Google Cloud projects.
- User-defined log-based metrics, created by you to track things in your Google Cloud project that are of particular interest to you. For example, you might create a log-based metric to count the number of log entries that match a given filter.
  - By default, user-defined log-based metrics are calculated from all logs received by the Logging API for the Cloud project, regardless of any inclusion filters or exclusion filters that might apply to the Cloud project.
  - Preview: You can also create user-defined log-based metrics for a specific log bucket in a Cloud project. Bucket-level log-based metrics are calculated from all logs destined for the bucket, regardless of where they originated. For more information, see Log-based metrics on log buckets.
```shell
gcloud logging metrics create METRIC_NAME \
  --description "METRIC_DESCRIPTION" \
  --log-filter "FILTER"

gcloud logging metrics create error_count \
  --description "Errors in syslog." \
  --log-filter "resource.type=gce_instance AND logName:logs/syslog AND severity>=ERROR"
```
```shell
gcloud logging metrics list
gcloud logging metrics describe METRIC_NAME
gcloud logging metrics update METRIC_NAME --description=DESCRIPTION --log-filter=FILTER
gcloud logging metrics delete METRIC_NAME
```
A metrics scope is a list of projects that are monitored by the project that hosts the metrics scope; the hosting project is called a scoping project. By default, each project hosts a metrics scope that includes only itself, so a project is a scoping project for itself. You can also create a multi-project metrics scope for the scoping project. With a multi-project metrics scope, the scoping project can see the metrics from all the projects in the metrics scope.
Log-based metric types:
- Counter: counts the number of log entries that match a specified filter.
- Distribution: also counts values, but collects the counts into ranges of values (histogram buckets); see the sketch after this list.
- Boolean: captures whether or not a log entry matches a specified filter.
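`gcloud logging metrics create` with `--log-filter` alone produces a counter; a distribution metric also needs a value extractor, which can be supplied as a LogMetric config file via `--config-from-file`. A hedged sketch (field names follow the Logging API's LogMetric resource; the `jsonPayload.latency` field is hypothetical):

```shell
# Define a distribution metric over a hypothetical jsonPayload.latency field
cat > latency_metric.yaml <<'EOF'
name: response_latency
description: Distribution of request latencies
filter: resource.type="gce_instance" AND jsonPayload.latency:*
valueExtractor: EXTRACT(jsonPayload.latency)
metricDescriptor:
  metricKind: DELTA
  valueType: DISTRIBUTION
bucketOptions:
  exponentialBuckets:
    numFiniteBuckets: 64
    growthFactor: 2
    scale: 0.01
EOF

gcloud logging metrics create response_latency --config-from-file=latency_metric.yaml
```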
- https://cloud.google.com/logging/docs/view/logging-query-language
- https://cloud.google.com/logging/docs/view/building-queries
- https://www.codingninjas.com/studio/library/logging-query-language

To write filters for sinks and log-based metrics, you use the same syntax and expressions that are used to query your logs data:
```
expression = ["NOT"] comparison { ("AND" | "OR") ["NOT"] comparison }
```
A comparison is either a single value or a Boolean expression:

```
"The cat in the hat"       # global restriction - each field of a log entry is implicitly compared to the value using the has (:) operator
resource.type = "gae_app"  # Boolean expression of the form [FIELD_NAME] [OP] [VALUE]
```
Comparisons have the form [FIELD_NAME] [OP] [VALUE].

[FIELD_NAME] is the path name of a log entry field. Examples of field names include:
- resource.type
- resource.labels.zone
- resource.labels.project_id
- insertId
- jsonPayload.httpRequest.protocol
- labels."compute.googleapis.com/resource_id"
[OP] is one of the following comparison operators:
- `=` - equal
- `!=` - not equal
- `>`, `<`, `>=`, `<=` - numeric ordering
- `:` - "has": matches any substring in the log entry field
- `=~` - regular expression search for a pattern
- `!~` - regular expression search not for a pattern
Alternatively:

```
resource.type = ("gae_app" OR "gce_instance")
resource.type = "gae_app" AND (severity = "ERROR" OR "error")
jsonPayload.cat = ("longhair" OR "shorthair")  # checks that the field cat has the value "longhair" or "shorthair"
jsonPayload.animal : ("nice" AND "pet")        # checks that the value of the field animal contains both of the words "nice" and "pet", in any order
```
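The regular-expression operators combine the same way; for example (RE2 syntax; the field values are illustrative):

```
jsonPayload.message =~ "error \d+"                        # message matches the RE2 pattern
severity >= "WARNING" AND textPayload !~ "^health-check"  # skip health-check noise
```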
See the LogEntry reference for the full list of log entry fields.