How Postman and Google Cloud Build help you ship Cloud Functions faster!

A Google Cloud Function is a Function-as-a-Service (FaaS) offering that runs with zero server management. What serverless really means is that, as a developer, you can write and deploy code without the hassle of managing the underlying infrastructure. Instead of running an over-provisioned server fleet, you're just running a piece of code. Yeah, we are really living in the future now!

In an agile development environment, small teams work autonomously and add a lot of churn to the codebase. Each developer works on different aspects of the project and commits code frequently. This is a healthy practice, but to minimize manual scrutiny and redundant communication across teams, we need to invest in automating CI/CD processes. This is exactly where Postman and Cloud Build help a team ship cloud functions faster without sacrificing quality.

Let’s start with the checklist.

Google Cloud Platform

Google Cloud Platform (GCP) is a suite of cloud computing services that runs on the same infrastructure Google uses internally for its end-user products. You can sign up for the Free Tier if you haven't already. You will get $300 in credit and access to 20+ GCP products for the first year. Pretty decent!

Postman

Postman is a collaboration platform for API development and testing. Its features are designed to help you deliver better APIs faster. In our case, we will be testing the deployed cloud functions using its command-line collection runner, called Newman. You can get the latest version of the Postman app or try the web version. Believe me, Postman's free tier is a real gem!

Google Cloud Build

Cloud Build is a service that executes your builds on Google Cloud Platform infrastructure. Once you have access to the GCP console, you get access to Cloud Build automatically. It resembles the other CI/CD tools available in the market (CircleCI, Travis, etc.). While its UI lags behind the competition, it has a definite edge on pricing: it provides 120 free build minutes per day. If the rest of your infrastructure is already on GCP, Cloud Build is probably the best option to look at.

You will also need the CLI tool called gcloud.

Now, if you have already ticked the items on the checklist, let's start the real game.

You should check out this GitHub repository, where I have designed a sample cloud function and added a Postman collection to test it. I will reference this repository heavily.

Create your first project in GCP

Google Cloud projects form the basis for creating, enabling, and using all Google Cloud services, including managing APIs, enabling billing, adding and removing collaborators, and managing permissions for Google Cloud resources. You can easily create your first project by following these steps. Once the project is created, it will appear on your dashboard.

Let's define our API

I am going to use a NodeJS environment to build my backend. Specifically, I am using ExpressJS middleware, which has access to the request object (req) and the response object (res). If you are not familiar with ExpressJS, it is a lightweight web application framework that helps organize your web application into an MVC architecture on the server side. So, this is what the application-level middleware (app.js) looks like for my sample project.

Once the middleware is defined, let's define the routes. Again, I am using an Express router as routing-level middleware here.

So, we have explicitly set a route called portfolio to get specific data. Now it's time to run it locally and check that there are no glitches. If you have already cloned the template project, you just need to run the commands below:

# install dependencies
> npm i
# run the local server to host the backend
> npm start

The server should listen on port 8081 on your localhost. So, if you navigate to http://localhost:8081/api/portfolio, you should receive the data returned by the API.

Create unit tests

Unit tests are really important for testing the service in isolation, with its dependencies mocked out. I have used chai-http to construct the mocks and call the middleware.

To execute the tests, run the command below:

npm run test:unit

Tests are passing! Yahoo! You achieved all the success in the world! Just kidding, we haven't started the fun part yet. Kick-start Postman to write some dope system-level API tests.

Create a collection using Postman

Once the service is running on your localhost, it's a good idea to develop solid API tests. These tests will come in handy for checking functionality post-deployment. Fire up the Postman client.

  • First of all, I created an Environment and stored the localhost address as url
  • I created a Collection
  • Added a GET request to the Collection and used the url param defined in the Environment
  • Defined a few tests using the intuitive pm helper provided by the Postman runtime
  • Finally, sent the request and verified the run results
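A typical test from the fourth step is a short script that runs in Postman's sandbox after the response arrives (the asserted JSON shape here is illustrative; `pm` is provided by the Postman runtime, not Node):

```javascript
pm.test("status code is 200", () => {
    pm.response.to.have.status(200);
});

pm.test("response body has portfolio data", () => {
    const body = pm.response.json();
    pm.expect(body).to.have.property("name");
});
```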

See how easy it is to write API tests, with no hassle of maintaining your own API framework! Postman is fun as always!

Once you are happy with your collection, export it along with the environment and store them in a folder in your code repository. As you probably noticed, we have a script in our package.json to test the collection locally.

So, here we are using Newman to run the collections. Notice how we are setting the url param dynamically with an environment variable called URL. Post-deployment, we can easily swap URL for the hosted URL.
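Assuming the exported files live under a postman/ folder (the file names and paths here are illustrative, not the repo's actual layout), that package.json script could look like:

```json
{
  "scripts": {
    "test:api": "newman run postman/collection.json -e postman/environment.json --env-var \"url=$URL\""
  }
}
```

Running `URL=https://your-hosted-url npm run test:api` then points the same collection at whichever host you want to test.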

Now we are ready to move to the Cloud! Give me a high-five.

Deploy the Cloud Function from the local machine

Once you are satisfied with your local development, you can use the gcloud CLI tool to deploy the cloud function. It sounds fancy and complex, but it's a matter of a single command. Hail, Google!

gcloud functions deploy myWebsiteBackend --trigger-http --runtime nodejs10

I named my function myWebsiteBackend. You need to choose a couple of options, like the region and IAM access; to keep things simple, allow anonymous access to the deployed function (for example, with the --allow-unauthenticated flag). That should be good for testing things out! If you open the Cloud Functions console in GCP, you can see the function is already deployed.

Don’t forget to grab the hosted URL of the cloud function. We need it later.

Create the Cloud Build configuration

CI/CD tools always give you the ability to configure build, test, and deploy steps, and Cloud Build is no different. Here you need to create a YAML/JSON configuration file to store the desired steps. Here is what our configuration file looks like:
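A sketch of such a cloudbuild.yaml, assuming the function name deployed earlier and a test:api npm script (the exact steps in the repo's file may differ):

```yaml
steps:
  # 1. Install dependencies
  - name: 'gcr.io/cloud-builders/npm'
    args: ['install']

  # 2. Deploy the cloud function
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - myWebsiteBackend
      - --trigger-http
      - --runtime
      - nodejs10

  # 3. Run the Postman collection with Newman against the deployed URL
  - name: 'gcr.io/cloud-builders/npm'
    args: ['run', 'test:api']
    env:
      - 'URL=${_URL}'

# _URL is supplied as a user substitution when the trigger is configured
substitutions:
  _URL: ''
```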

Basically, we are telling Cloud Build to install the dependencies first, followed by the deployment of the function. Once the deployment is completed, it fires up the Postman tests automatically. As you probably noticed, we are setting the URL env variable dynamically. We will set the substituted value when we set up the trigger, which is the most fun part.

Setup the Trigger for Cloud Build

We now have everything we need to automate the process. Cloud Build can easily detect when a new commit has been pushed to the GitHub repository or a pull request has been merged, which enables it to trigger a new build on each incoming event. Let's set up a new trigger.

  • Go to your GCP console and select Cloud Build Triggers from the side menu.
  • Create a new trigger.
  • Select the GitHub repository and explicitly select the Cloud Build configuration file.
  • Now, provide the value for the user substitution variable. Provide the hosted URL as:
https://YOUR_GCF_REGION-YOUR_GCP_PROJECT_ID.cloudfunctions.net/FUNCTION-NAME
  • Save the trigger.

You are all set now

It's time to verify everything is working. Update a file in the repository and push a commit. That should trigger the re-deployment of the cloud function and test it with the Postman collection.

Send the Build notification to Slack

Cloud Build can notify you of updates to your build status by sending notifications to your desired channels. Cloud Build publishes all build status updates, along with build metadata, to Pub/Sub on the cloud-builds topic. Cloud Build notifiers can be configured to listen to that topic, filter the messages they receive, and forward them to your service. Setting up Slack notifications is not straightforward, though; you need to follow a number of steps to get it done. I have added a sample slack.yml configuration file to help you on your journey. Once the configuration is done, you get build notifications directly in your favorite channel.
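For reference, a Slack notifier configuration in the shape Google documents looks roughly like this (the notifier name, filter, and secret resource path are placeholders you'd adapt):

```yaml
apiVersion: cloud-build-notifiers/v1
kind: SlackNotifier
metadata:
  name: build-slack-notifier
spec:
  notification:
    # Only notify on finished builds
    filter: build.status in [Build.Status.SUCCESS, Build.Status.FAILURE]
    delivery:
      webhookUrl:
        secretRef: webhook-url
  secrets:
    - name: webhook-url
      value: projects/YOUR_GCP_PROJECT_ID/secrets/slack-webhook-url/versions/latest
```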

Last-minute talk

Phew! We have come a long way. Serverless CI/CD is still on the path to maturity, but selective tooling and approaches can help us deliver robust cloud functions with quality. I hope you've enjoyed the journey.

Happy coding!

Abhinaba Ghosh · Tech Lead Manager @Postman 🚀 | Space Movie Lover 🪐 | Coder 👨‍💻 | Traveller ⛰️