Approach 2 - Mirroring CSR and the Gitlab Repository to Create Cloud Build Triggers and Automate the CI/CD Pipelines with GKE

Hello everyone 👋! In this blog, we will be creating CI/CD pipelines with Cloud Build triggers, using Gitlab as the source repository.

We will discuss how to build from the Gitlab repository by setting up mirroring between the Google Source Repository and the Gitlab repository.
As we discussed in Approach 1, by creating a webhook we can trigger Cloud Build with Gitlab as our source repository, but that approach lacks some features that we often use for our development and production environments.

Photo by Amy Hirschi on Unsplash

The Need for Mirroring the Gitlab Repository to CSR

Cloud Build triggers do not directly support Gitlab as a source repository. Also, to overcome the limitations of the webhook approach and to use substitution variables such as $SHORT_SHA, $COMMIT_SHA, and $TAG_NAME with our builds, we mirror the Gitlab repositories into Google Source Repositories.

Now all the actions happen directly with CSR as the source: all our branches, tags, and commits are copied from the Gitlab repository to the Google Cloud Source Repository.

Architectural Flow (Approach 2)

  • Whenever a developer pushes code to the Gitlab repository, the repository is first mirrored to the Google Source Repository.
  • After that, the Cloud Build trigger starts a build based on events such as push to a branch/tag, taking the Google Source Repository as the source.
  • The steps inside cloudbuild.yaml are executed to build the cached image, tag it, and push it to GCR/Artifact Registry; in the final step, the pushed image is deployed to our GKE cluster.
Architectural flow for the CI/CD process

Prerequisites:

  • Enable the required APIs using the shell script below.
  • Provision the infrastructure (VPC, subnets, service accounts, and Google Kubernetes Engine) from the repository below.

Clone the below repository:

Terraform Resources
git clone https://github.com/sanketbisne/gcp-terraform-resources.git
cd gcp-terraform-resources
git checkout main

Run the shell script below, which enables all the APIs needed for the following tasks.

./enable-api.sh
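If you prefer to enable the services by hand, the script boils down to something like the following sketch; the exact API list in enable-api.sh may differ, so treat this as an assumption.

# Rough equivalent of enable-api.sh (assumed API list, verify against the script)
gcloud services enable \
  sourcerepo.googleapis.com \
  cloudbuild.googleapis.com \
  container.googleapis.com \
  containerregistry.googleapis.com \
  artifactregistry.googleapis.com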

Step 1: Creating the repositories in Gitlab and Google Cloud Source Repositories
Create a repo in Gitlab and in CSR.
Go to the shell/console and execute the commands below:

gcloud init
gcloud source repos create REPOSITORY_NAME
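As a concrete example, using the repository name gitlab-csr-mirror (the name this blog uses later) and confirming it exists:

# Create the mirror target repository and verify it shows up in the project
gcloud source repos create gitlab-csr-mirror
gcloud source repos list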

Step 2: Authentication between the Google Source Repository and Gitlab
Go to the Cloud Source Repositories page.

Section for manually generated credentials.
  • Go to "Manually generated credentials".
  • Click on 1. "Generate and store your Git credentials".
  • This will pop up a window to authenticate with your GCP account.
Authentication step for our user

Click on Allow, and we will be redirected to configure Git.
Choose your way of authenticating: we can use Cloud Shell or our local terminal to store the username and password.

Why do we need to store a username and password?
To tell Gitlab which repository it has to copy all its contents to, we need an authentication mechanism.
That mechanism is a username and password, which are tied to our Google Account user. If that user's access is ever revoked from Cloud Identity or the Google Cloud project, mirroring between the Google Source Repository and the Gitlab repository stops, because service accounts are not supported here.
We should therefore create a user in Cloud Identity whose access will remain in the organisation permanently and grant it the Source Repository Admin/Writer role at the project level.
  • Configure Git using the generated block of code; this will auto-populate the username and password in the .gitcookies file created in the home directory (a simplified sketch follows).
Configuring Git with a cookie in your cloud shell
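The generated snippet is not reproduced here because it embeds your personal credential, but in essence it does something like the sketch below before appending your credential line for source.developers.google.com to the cookie file.

# Sketch of what the generated block does (the real block also writes your
# token for source.developers.google.com into ~/.gitcookies)
touch ~/.gitcookies
chmod 0600 ~/.gitcookies
git config --global http.cookiefile ~/.gitcookies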

Step 3: Creating our Credentials.

Copy the code from the block above into your terminal. The image below is just a reference example.

Now, we need to open our Cloud Shell / local terminal.

  mkdir Credentials && cd Credentials 
Copy this code block and run it in Cloud Shell
  • Our credentials (username and password) will be stored in the .gitcookies file located in the home directory.
cat ~/.gitcookies
  • In the last column we can see two fields: the first one is the user ID and the second one is our password.
    We will extract both the username and the password and enter them in our Gitlab repository's mirroring section.
Credentials generated in .gitcookies file

Extract the password with the command below.

grep 'source.developers.google.com' ~/.gitcookies | tail -1 | cut -d= -f2
Extract the password from .gitcookies

Store the username in an environment variable:

CSR_USER=$(grep 'source.developers.google.com' ~/.gitcookies | \
tail -1 | cut -d$'\t' -f7 | cut -d= -f1)

Store the source repository URL in CSR_REPO:

CSR_REPO=$(gcloud source repos describe gitlab-csr-mirror --format="value(url)")

Fetch the username with the command below:

echo $CSR_REPO | sed "s/:\/\//:\/\/${CSR_USER}@/"
Extract the username from .gitcookies

Copy the generated username and password above and enter them into the Gitlab repository mirroring section.

Below is a reference example of how your username and password will look:
username : https://git-<your-username>@source.developers.google.com/p/<project_id>/r/<your-google-cloud-source-repository-name>
Password: 1//0<random_id>
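To print both values in one place before switching to the Gitlab UI, you can reuse the commands above; a small convenience sketch (note that it echoes your credential to the terminal):

# Convenience recap: mirror URL (CSR_REPO with CSR_USER embedded) and password
echo "Git repository URL: $(echo $CSR_REPO | sed "s/:\/\//:\/\/${CSR_USER}@/")"
echo "Password: $(grep 'source.developers.google.com' ~/.gitcookies | tail -1 | cut -d= -f2)"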

Step 4: Adding the generated credentials to the Gitlab repository

Go to Gitlab >> Repository >> Mirroring repositories.

Enter the username and password, keep the other options at their defaults,
and click on Mirror repository.

Adding Username and Password in Gitlab Mirroring Section

We can now see the status of mirroring and whether it is in sync with the Google Source Repository.

Sync between the Google Cloud Source repository and the Gitlab repository
Gitlab syncs with CSR every 7 minutes and keeps track of every branch, commit, and change that occurs. We can force a sync by clicking on the Sync now button.

Verify that the branches, Git tags, data, folders, and commits are copied from Gitlab to the Google Source repository.
Push some changes to Gitlab and it will automatically copy all the data to CSR; a quick CLI check is shown below.
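One way to verify from the command line is to clone the CSR side of the mirror and compare the refs; a sketch assuming the repository name gitlab-csr-mirror:

# Clone the CSR mirror and confirm branches, tags and recent commits match Gitlab
gcloud source repos clone gitlab-csr-mirror
cd gitlab-csr-mirror
git branch -a
git tag
git log --oneline -5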

Git tags and Branches

We can match the latest short commit SHA on the right-hand side, i.e. 6c58e647.

The same tags are copied from the Gitlab repo to the Cloud Source repository

As we can see, the data, commits, branches, and tags are all copied from Gitlab to the Google Source Repository.

Now that we have our code copied to CSR, let's start configuring our Cloud Build triggers.

Setting up the Cloud Build triggers and automating deployments with Cloud Build

Step 1: Preparing the cloudbuild.yaml file
The cloudbuild.yaml file has 3 steps:

  • Building our code and tagging the image with latest + short_sha + git_tag.
  • Pushing the image to Google Container Registry / Artifact Registry.
  • Deploying the image into our Kubernetes cluster in a particular namespace.

    As a best practice, all the variables are substituted so the file can be reused with a new repository. A sketch of such a file is shown below.
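The author's exact cloudbuild.yaml is not shown in this post, so the block below is only a minimal sketch of the three steps described above, written here as a heredoc you can run once and then commit to the root of your repository. The image name my-app, the existing my-app deployment, and the _NAMESPACE, _ZONE and _CLUSTER substitutions are illustrative placeholders; a production build would add $TAG_NAME as an extra tag.

# Minimal, assumed cloudbuild.yaml sketch; adjust image name, cluster, zone and namespace
cat > cloudbuild.yaml <<'EOF'
steps:
  # 1. Build with the previous image as cache, tag with latest and SHORT_SHA
  - name: 'gcr.io/cloud-builders/docker'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        docker pull gcr.io/$PROJECT_ID/my-app:latest || true
        docker build \
          --cache-from gcr.io/$PROJECT_ID/my-app:latest \
          -t gcr.io/$PROJECT_ID/my-app:latest \
          -t gcr.io/$PROJECT_ID/my-app:$SHORT_SHA .
  # 2. Push both tags to Container Registry / Artifact Registry
  - name: 'gcr.io/cloud-builders/docker'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        docker push gcr.io/$PROJECT_ID/my-app:latest
        docker push gcr.io/$PROJECT_ID/my-app:$SHORT_SHA
  # 3. Deploy the new image to the GKE cluster in the target namespace
  #    (assumes a deployment called my-app already exists there)
  - name: 'gcr.io/cloud-builders/kubectl'
    args:
      - 'set'
      - 'image'
      - 'deployment/my-app'
      - 'my-app=gcr.io/$PROJECT_ID/my-app:$SHORT_SHA'
      - '-n'
      - '${_NAMESPACE}'
    env:
      - 'CLOUDSDK_COMPUTE_ZONE=${_ZONE}'
      - 'CLOUDSDK_CONTAINER_CLUSTER=${_CLUSTER}'
substitutions:
  _NAMESPACE: 'dev'
  _ZONE: 'us-central1-a'
  _CLUSTER: 'my-gke-cluster'
EOF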

Step 2: Creating a service account for Cloud Build and granting it the required permissions.

Roles granted to the service account (a CLI sketch follows this list):

  • Kubernetes Engine Developer — provides access to Kubernetes API objects inside clusters.
  • Logs Writer — provides the permissions to write log entries.
  • Source Repository Reader — provides permissions to list, clone, fetch, and browse repositories.
  • Storage Admin — gives access to push images into GCR and grants full control of objects and buckets.
Roles granted to the Cloud Build service account
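A hedged CLI sketch for creating such a service account and granting the four roles above; the account name cloud-build-deployer is a placeholder.

# Placeholder service account; PROJECT_ID is taken from the active gcloud config
PROJECT_ID=$(gcloud config get-value project)
gcloud iam service-accounts create cloud-build-deployer \
  --display-name="Cloud Build deploy SA"
SA_EMAIL="cloud-build-deployer@${PROJECT_ID}.iam.gserviceaccount.com"
for ROLE in roles/container.developer roles/logging.logWriter \
            roles/source.reader roles/storage.admin; do
  gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
    --member="serviceAccount:${SA_EMAIL}" --role="${ROLE}"
done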

Creating triggers for the development environment

  • Click on Triggers -> Create a new trigger.
    Select the event — Push to a branch.

We will use the push-to-a-branch event for our development/staging deployments and the push-new-tag event for our production deployments.

Cloud Build Trigger
  • Select the repository and branch where the Dockerfile resides, and place the cloudbuild.yaml inside the specified directory.
Adding the substitution variables.
  • Select the cloudbuild.yaml file and the service account.
  • Enable or disable the trigger as required:
Enable the trigger -> if we want to run the Cloud Build steps specified in the cloudbuild.yaml file whenever there is a code change, we set the trigger to enabled.
Disable the trigger -> if we don't want Cloud Build to run on every code change, we set the trigger to disabled. (A CLI equivalent of these console steps is sketched below.)
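Assuming the branch is main and the repository is gitlab-csr-mirror, the sketch would look like this; the service account can be attached in the console, or with --service-account on newer gcloud versions.

# Illustrative branch-based trigger for the development environment
gcloud builds triggers create cloud-source-repositories \
  --name="dev-deploy" \
  --repo="gitlab-csr-mirror" \
  --branch-pattern="^main$" \
  --build-config="cloudbuild.yaml"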

Click on the trigger and push some changes into the repository.

NOTE: Whenever we push changes into our Gitlab repository, it takes up to 7 minutes for them to sync with the Google Cloud Source Repository.

As soon as Cloud Build gets triggered, we can view the execution time of the build steps in the History section.

  • Click on the build ID to see the steps executed at runtime.
History of builds triggered automatically

We can see the build summary for our executed builds and the duration taken to execute the steps.

Build step details.
  • Go into GCR and see the image tagged with latest and the commit ID (SHORT_SHA); a CLI check is shown below the image.
  • This was the trigger for the development environment, which automatically builds, tags, pushes, and deploys our image (latest + short_sha) to the GKE cluster in a particular namespace.
Image pushed in GCR with tag [ short_sha and latest ]
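Both the build history and the image tags can also be checked from the CLI; for example (my-app is the placeholder image name from the cloudbuild.yaml sketch, and PROJECT_ID is assumed to be set as in the earlier sketch):

# Recent builds with status and duration
gcloud builds list --limit=5
# Tags attached to the pushed image
gcloud container images list-tags gcr.io/${PROJECT_ID}/my-app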

Creating triggers for the production environment based on tags (regex pattern)

  • We can select tag events and filter them so that only tags matching (PRODUCTION_RELEASE_*) trigger a build.
  • The event will be Push new tag.
  • Create a tag with the following commands and push it to the Gitlab repository:
git tag PRODUCTION_RELEASE_2022_08_17
git push origin PRODUCTION_RELEASE_2022_08_17

Only when we push a tag with the prefix PRODUCTION_RELEASE_* will our build be triggered by the Cloud Build trigger; other prefixes or tags will be ignored.

For the production environment we trigger builds based on Git tags. For building and pushing the image with a PRODUCTION_RELEASE_* tag, we can see that the tag values are automatically populated when we select our tags. (A CLI sketch for this trigger follows.)
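Assuming the same repository, the tag-based trigger could be created like this; the regex mirrors the PRODUCTION_RELEASE_* convention above.

# Illustrative tag-based trigger for the production environment
gcloud builds triggers create cloud-source-repositories \
  --name="prod-release" \
  --repo="gitlab-csr-mirror" \
  --tag-pattern="^PRODUCTION_RELEASE_.*$" \
  --build-config="cloudbuild.yaml"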

Tag-based trigger

This trigger invokes Cloud Build only to build, tag, and push the image to Google Container Registry.

Steps for deploying the image to the GKE cluster

Here, the cloudbuild.yaml file describes the steps to build the cached image, tag it, push it to GCR, and deploy it to the GKE cluster in a particular namespace whenever we push our code into the Gitlab repository.

Go to GCR and check the tags for the image; we can see the following output.
The image is now tagged with the SHORT_SHA, latest, and PRODUCTION_RELEASE_* tags.

Tagged image with latest + short_sha + PRODUCTION_RELEASE_*
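Once a build that deploys has finished, the rollout can also be checked from the cluster side; a quick sketch in which the cluster, zone, namespace and deployment names are placeholders.

# Placeholders: my-gke-cluster / us-central1-a / dev / my-app
gcloud container clusters get-credentials my-gke-cluster --zone us-central1-a
kubectl -n dev rollout status deployment/my-app
kubectl -n dev get pods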

Hence, we have set up the triggers for the development and production environments, with automatic deployments based on branches and tags.

Summary

In this blog, we overcame the limitations of the webhook approach by introducing the mirroring approach, and learnt how to use the features of Cloud Source Repositories to build our CI/CD pipelines for staging and production environments based on branches and tags.

Have any questions? If you do, I'll be happy to read them in the comments. Follow me on Medium or LinkedIn.

Thank you, and have a great day ahead 😊

