GCP Cloud Logging : How to Enable Data Access Audit For Selected Buckets

Introduction

Data Access audit logs trace and monitor the API calls that create, modify, or read user data or metadata on GCP resources. They are disabled by default (except for BigQuery Data Access audit logs) because they can cause an explosion in the volume of logs generated, which in turn can drive up usage costs.

Click here to learn how to enable data access audit logs. You will notice that the workflow lets you select the types of audit log to enable (Admin Read, Data Read, and Data Write) and also lets you specify principals to exempt from audit logging. For example, you may have a Dataproc workload that reads from a GCS bucket on a schedule. If you don't want to enable data access auditing for the service account used by Dataproc, specify it under exempted principals.
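Under the hood, these settings live in the project's IAM policy under auditConfigs. A minimal sketch of what the policy fragment could look like, assuming a project named my-project and a Dataproc service account dataproc-sa (both names are placeholders, not from the original post):

```yaml
# Fetch the current policy, edit it, then push it back:
#   gcloud projects get-iam-policy my-project > policy.yaml
#   gcloud projects set-iam-policy my-project policy.yaml
auditConfigs:
- service: storage.googleapis.com
  auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_WRITE
  - logType: DATA_READ
    exemptedMembers:
    # the Dataproc service account is exempted from Data Read auditing
    - serviceAccount:dataproc-sa@my-project.iam.gserviceaccount.com
```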

But that is all you can configure while enabling data access audit logs. What if you want to exempt certain buckets from the access audits or go a step further and prevent specific operations? In this post, we will look at how to enable data access audit for GCS buckets within a project while excluding certain buckets within the same project from being audited.

This additional layer of filtering will help keep logging costs under control.

A Quick Overview On How Cloud Logging Works

All logging activity in Google Cloud Platform (GCP) is routed through the Logging API, which delivers the logs you need reliably and on time.

Log entries are sent to the Logging API and then pass through a Log Router. Log Routers contain sinks that define which logs need to be sent where.

If the sinks are not configured properly, logging can become one of the most expensive line items on your bill. This is especially true if you have data access audit logging enabled.

Solution

  • In the GCP console, navigate to Logging > Log Router. You will see two sinks already created and defined for you: _Default and _Required
  • _Required is the sink used to route Admin Activity logs, System Event audit logs, and Access Transparency logs. This sink can neither be edited nor disabled, and you do not incur any charges for logs routed through it.
  • _Default is the sink that routes all other logs, including data access audit logs. This sink can be edited and also disabled. Click the 3-dot dropdown next to it and select "Edit Sink".
  • Look for the section titled "Choose logs to filter out of sink". This is where you define which logs you want the router to filter out and not send to the destination specified in the sink.
  • The exclusion filter (and also the inclusion filter) needs to be defined using the Logging Query Language
  • For example, the following exclusion rule filters out data access logs arising from API activity in a bucket named demo-bucket:
resource.type="gcs_bucket"
resource.labels.bucket_name="demo-bucket"
  • Detailed documentation on the Logging Query Language can be found here
  • If you prefer not to touch the pre-defined _Default sink, you can create a new sink and specify your custom rules and conditions there. Ensure that the _Default sink is disabled, otherwise logs will be routed to both destinations, resulting in an increase in costs.
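Both routes can also be scripted with the gcloud CLI instead of the console. A sketch of what that could look like, assuming a project named my-project and a custom log bucket named my-bucket (sink, exclusion, project, and bucket names are all placeholders):

```shell
# Option 1: add the exclusion filter to the pre-defined _Default sink
gcloud logging sinks update _Default \
  --add-exclusion=name=exclude-demo-bucket,filter='resource.type="gcs_bucket" AND resource.labels.bucket_name="demo-bucket"'

# Option 2: create a custom sink carrying the same exclusion,
# then disable _Default so logs are not routed twice
gcloud logging sinks create my-sink \
  logging.googleapis.com/projects/my-project/locations/global/buckets/my-bucket \
  --exclusion=name=exclude-demo-bucket,filter='resource.type="gcs_bucket" AND resource.labels.bucket_name="demo-bucket"'
gcloud logging sinks update _Default --disabled
```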

What If You Want To Exclude Only Specific Operations?

The Logging Query Language allows you to query logs not just at a resource level but also at an operation level, provided the metadata is present in the payload. For example, you can use the query below to exclude all list operations on a bucket named demo-bucket:

resource.type="gcs_bucket"
resource.labels.bucket_name="demo-bucket"
logName="projects/<project-name>/logs/cloudaudit.googleapis.com%2Fdata_access"
protoPayload.methodName="storage.objects.list"

The Logging Query Language lets you write a variety of complex queries, making your filters as generic or as granular as you need.
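Since these filters are just AND-ed comparisons stacked on separate lines, they are easy to generate from automation scripts. A small sketch of a hypothetical helper (the function names are illustrative, not part of any Google SDK) that builds the exclusion filters shown above:

```python
def bucket_exclusion(project, bucket, method=None):
    """Build a Logging Query Language exclusion filter for data access
    logs on a single GCS bucket, optionally narrowed to one operation."""
    clauses = [
        'resource.type="gcs_bucket"',
        f'resource.labels.bucket_name="{bucket}"',
        f'logName="projects/{project}/logs/'
        'cloudaudit.googleapis.com%2Fdata_access"',
    ]
    if method:  # e.g. "storage.objects.list" to exclude only list calls
        clauses.append(f'protoPayload.methodName="{method}"')
    # clauses on separate lines are implicitly AND-ed by the query language
    return "\n".join(clauses)

print(bucket_exclusion("my-project", "demo-bucket", "storage.objects.list"))
```

The resulting string can be pasted into the sink's exclusion filter box or passed to a sink-management API call.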

Things to Remember

  • Log entries excluded from a sink still consume entries.write API quota, since filtering happens after the Logging API receives the entry.
  • Exclusion filters take precedence over inclusion filters. So if a log entry matches both an exclusion and an inclusion filter, the entry is excluded regardless.
  • If no filters are specified, then all logs are routed by default.
  • When you exclude a log entry, it neither incurs ingestion charges nor storage charges.

GCP Cloud Logging : How to Enable Data Access Audit For Selected Buckets was originally published in Google Cloud - Community on Medium, where people are continuing the conversation by highlighting and responding to this story.
