In this post, we will walk through using Terraform to create a Google Cloud function that triggers when a file is uploaded and outputs the file details in the log. This process has wide-ranging applications and can be beneficial in many business environments, such as:
- Real-time data processing in an online shopping/e-commerce environment: when a customer orders a product, this method could trigger a confirmation email, tracking email, etc., to the user.
- Image processing on a social media platform where users upload images: this method could detect the photos being uploaded and trigger the cloud function to update data for the user’s profile.
- A part of a large data processing pipeline where the cloud function can be triggered to sanitize or format the data a certain way before it gets sent to the next stage of processing.
Let’s jump right in.
Tools Overview
Terraform is an infrastructure as code (IaC) tool that lets you define your cloud resources in configuration files. It works across different cloud providers and on-premises infrastructure: you describe the desired state of your setup, and Terraform figures out the steps to make it happen. This automated, configuration-based approach helps you manage your cloud infrastructure efficiently and consistently.
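To make that concrete, here is a minimal, hypothetical resource declaration (the bucket name is illustrative and not part of this tutorial). Rather than scripting API calls, you simply declare the bucket you want to exist:

```hcl
# A declarative resource: describe the end state, and on each apply Terraform
# works out whether it needs to create, update, or leave the bucket alone
resource "google_storage_bucket" "example" {
  name     = "my-example-bucket" # hypothetical; bucket names must be globally unique
  location = "US"
}
```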
As mentioned above, the cloud provider we will use today is Google Cloud Platform. It is a suite of cloud computing services offered by Google. It provides various on-demand resources like virtual machines, storage, databases, and machine learning tools. Do note that Google Cloud offers a free trial, so if you don’t have an account already, you can create one here: https://cloud.google.com/.
We will also use the Google Cloud Shell Editor (https://console.cloud.google.com/cloudshelleditor) to edit our code and run commands. It comes with all the tools we need pre-installed, so we don’t have to worry about downloading anything locally.
Creating the Service Account
Once you have logged into Google Cloud, navigate to the cloud shell editor. Open the terminal, which starts in your home directory.
Let’s create a new folder to do our Terraform work in. Type `mkdir terraform` and then `cd terraform`. You have now created a new folder called terraform and navigated into it.
Next, let’s create a new file named accounts.tf. We will use it to create a service account for our cloud function and give it the proper permissions to run. Go ahead and open the cloud editor, create that file, and type the following:
resource "google_service_account" "my_service_account" { project = "[your project id]" account_id = "terraform-alert-sa" display_name = "My Terraform Account" }
To break down the content:
- project: your project ID, which you can find on the home page or in the projects dropdown on the navigation bar
- account_id: the name for your service account; this can be anything you want
- display_name: a human-readable name for the account
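One optional, hypothetical addition here: if you would like Terraform to print the new account’s email after it is created (we will need it in the next part), you could add an output block like this sketch:

```hcl
# Optional: surface the generated service account email after `terraform apply`
output "service_account_email" {
  value = google_service_account.my_service_account.email
}
```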
After you have created this file with the above code, go ahead and switch over to the terminal view and run `terraform init` followed by `terraform validate`.
- terraform init: initializes the working directory, downloading the providers and modules the configuration needs
- terraform validate: verifies that the configuration is syntactically valid and internally consistent
If your code is valid, the terminal will output a success message for both. If you see errors, they are usually caused by misspellings and other syntax issues. Fix those errors with what Terraform suggests and rerun those commands.
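As an illustration, here is a hypothetical mistake of the kind terraform validate catches, since it checks argument names against the provider’s schema:

```hcl
# terraform validate rejects this block: "acount_id" is not a valid argument
# for google_service_account (a typo for account_id)
resource "google_service_account" "broken" {
  acount_id    = "terraform-alert-sa"
  display_name = "Broken Example"
}
```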
Now, let’s run `terraform apply`, which will actually execute the changes defined in the file, modifying resources on Google Cloud. If the apply step fails, errors will also be output in the console. These errors are a bit harder to debug, but Terraform tries to suggest fixes, such as adding the right permissions. It is also possible that some resources take a while to spin up, leading to race conditions inside the Terraform run; in that case, just run the apply command again.
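Terraform usually infers creation order from references between resources, but if you hit a persistent ordering issue, you can make a dependency explicit with depends_on. A hypothetical sketch (the binding itself is illustrative):

```hcl
# Hypothetical: force this IAM binding to wait for the service account
resource "google_project_iam_member" "example_binding" {
  project = "[your project id]"
  role    = "roles/logging.logWriter"
  member  = "serviceAccount:${google_service_account.my_service_account.email}"

  depends_on = [google_service_account.my_service_account]
}
```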
Now, if everything is successful, you should be able to go to the Google Cloud IAM management page and see the new service account that was created. Remember this service account email, as we will reference it in the next part. You will also notice a .tfstate file was created; this is a state file that Terraform uses to track and manage the infrastructure it creates or provisions. We can ignore this file.
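For a solo walkthrough like this, the local state file is fine. On team projects, state is often stored in a remote backend instead, such as a GCS bucket; a minimal sketch, assuming a pre-existing bucket:

```hcl
# Hypothetical remote backend; the state bucket must already exist
terraform {
  backend "gcs" {
    bucket = "my-terraform-state-bucket" # assumed name, not created in this tutorial
    prefix = "terraform/state"
  }
}
```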
Creating the Cloud Function Alert
The two main Google products that we will use to create this project are Google Cloud Function and Google Cloud Storage. Here is a brief overview of each:
- Google Cloud Functions: a serverless execution environment that allows you to run code in response to events. For this project, we are using Cloud Functions (2nd gen), the latest product offering with newer features, and a Node.js runtime to run the code.
- Google Cloud Storage: a service for storing objects in Google Cloud; objects are immutable pieces of data consisting of a file of any format.
Now, let’s go ahead and create a new file named main.tf in the same directory as accounts.tf. For the purposes of this post, we will put most of the code in the main.tf file for simplicity. Note that on large projects, this code can be split into individual files.
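As an illustration, a larger project might organize the same resources across files like these (a hypothetical layout; Terraform loads every .tf file in the directory, so the split is purely organizational):

```hcl
# Hypothetical layout for a larger project:
#   accounts.tf  - service accounts and IAM role bindings
#   apis.tf      - API enablement (the project-services module)
#   storage.tf   - the source and input buckets
#   function.tf  - the cloud function and its event trigger
```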
Open up the main.tf file in the cloud editor and add the following code:
```hcl
locals {
  project_id            = "[your project id]"
  region                = "us-central1"
  service_account_email = "[the service account email you created]"
  service_account_roles = [
    "roles/datastore.owner",
    "roles/logging.configWriter",
    "roles/logging.logWriter",
    "roles/serviceusage.serviceUsageAdmin",
    "roles/storage.admin",
    "roles/cloudkms.admin",
    "roles/iam.serviceAccountAdmin",
    "roles/compute.viewer",
    "roles/iam.serviceAccountKeyAdmin",
    "roles/iam.workloadIdentityPoolAdmin",
    "roles/iam.roleAdmin",
    "roles/pubsub.admin",
    "roles/cloudfunctions.admin",
    "roles/iam.serviceAccountUser",
    "roles/cloudbuild.builds.builder",
    "roles/pubsub.publisher",
    "roles/eventarc.eventReceiver",
    "roles/run.invoker"
  ]
}

resource "google_project_iam_member" "runner-sa-roles" {
  for_each = toset(local.service_account_roles)
  role     = each.value
  member   = "serviceAccount:${local.service_account_email}"
  project  = local.project_id
}

# Enable APIs
module "project-services" {
  source  = "terraform-google-modules/project-factory/google//modules/project_services"
  version = "~> 14.4"

  project_id = local.project_id
  activate_apis = [
    "logging.googleapis.com",
    "iam.googleapis.com",
    "cloudkms.googleapis.com",
    "iamcredentials.googleapis.com",
    "cloudresourcemanager.googleapis.com",
    "sts.googleapis.com",
    "monitoring.googleapis.com",
    "cloudfunctions.googleapis.com",
    "pubsub.googleapis.com",
    "eventarc.googleapis.com",
    "run.googleapis.com",
    "cloudbuild.googleapis.com",
    "storage.googleapis.com"
  ]
  disable_services_on_destroy = false
}

# Google storage bucket that contains the code for the cloud function
resource "google_storage_bucket" "cloud_function_source_bucket" {
  name                        = "cloud-function-alert-${local.project_id}"
  location                    = local.region
  force_destroy               = true
  uniform_bucket_level_access = true
}

# The input bucket that will trigger the cloud function
resource "google_storage_bucket" "input_bucket" {
  name                        = "cloud-alert-input-${local.project_id}"
  location                    = local.region
  uniform_bucket_level_access = true
}

# Zip up the source code
data "archive_file" "source" {
  type        = "zip"
  output_path = "${path.module}/src/alert_source.zip"
  source_dir  = "src/"
}

# Add the source code zip to the cloud function's bucket (cloud_function_source_bucket)
resource "google_storage_bucket_object" "zip" {
  source       = data.archive_file.source.output_path
  content_type = "application/zip"
  name         = "alert_source.zip"
  bucket       = google_storage_bucket.cloud_function_source_bucket.name
  depends_on = [
    google_storage_bucket.cloud_function_source_bucket,
    data.archive_file.source
  ]
}

# Google auto-creates a service account for Cloud Storage event triggers;
# here we look it up so we can give it the Pub/Sub role it needs to
# trigger the cloud function event
data "google_storage_project_service_account" "gcs_account" {
  project = local.project_id
}

# Grant pubsub.publisher permission to the storage project service account
resource "google_project_iam_binding" "google_storage_project_service_account_is_pubsub_publisher" {
  project = local.project_id
  role    = "roles/pubsub.publisher"
  members = [
    "serviceAccount:${data.google_storage_project_service_account.gcs_account.email_address}",
  ]
}

resource "google_cloudfunctions2_function" "function" {
  name        = "cloud-function-trigger-using-terraform-gen-2"
  location    = local.region
  project     = local.project_id
  description = "Cloud function gen2 trigger using terraform"

  build_config {
    runtime     = "nodejs16"
    entry_point = "fileStorageAlert"
    source {
      storage_source {
        bucket = google_storage_bucket.cloud_function_source_bucket.name
        object = google_storage_bucket_object.zip.name
      }
    }
  }

  service_config {
    max_instance_count = 1
    min_instance_count = 0
    available_memory   = "256M"
    timeout_seconds    = 60
    environment_variables = {
      SERVICE_CONFIG_TEST = "config_test"
    }
    ingress_settings               = "ALLOW_INTERNAL_ONLY"
    all_traffic_on_latest_revision = true
    service_account_email          = local.service_account_email
  }

  event_trigger {
    trigger_region        = local.region
    event_type            = "google.cloud.storage.object.v1.finalized"
    retry_policy          = "RETRY_POLICY_DO_NOT_RETRY"
    service_account_email = local.service_account_email
    event_filters {
      attribute = "bucket"
      value     = google_storage_bucket.input_bucket.name
    }
  }

  depends_on = [
    google_storage_bucket.cloud_function_source_bucket,
    google_storage_bucket_object.zip,
    module.project-services,
    google_project_iam_binding.google_storage_project_service_account_is_pubsub_publisher
  ]
}
```
Replace the two bracketed placeholders with your project ID and the service account email from the previous section.
Let’s go through each section:
- locals: defines local variables that we will reuse throughout the code
- resource "google_project_iam_member" "runner-sa-roles": gives the service account that we created the permissions it needs to interact with the Google APIs we use. Note: I have liberally given the service account more permissions than needed; this isn’t a good security practice, as accounts should have the least privilege possible (see the trimmed-down sketch after this list).
- module "project-services": enables the APIs for the project; access is still controlled by the roles we defined in the previous block
- resource "google_storage_bucket" "cloud_function_source_bucket": this storage bucket contains the source code for the cloud function, packaged as a zip file
- resource "google_storage_bucket" "input_bucket": this is the bucket attached to the event trigger, so the cloud function runs when an object is uploaded here
- data "archive_file" "source": this zips up our source code
- resource "google_storage_bucket_object" "zip": this uploads the zip file to the source bucket
- data "google_storage_project_service_account" "gcs_account": retrieves the Cloud Storage service account that Google auto-creates
- resource "google_project_iam_binding": gives that account permission to publish the events that trigger the function
- resource "google_cloudfunctions2_function" "function": creates the cloud function and hooks up the trigger to the input bucket
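As promised above, here is a sketch of what a more least-privilege role list might look like. The exact minimal set depends on what your function and deployment pipeline actually do, so treat this as a hypothetical starting point rather than a definitive list:

```hcl
# Hypothetical trimmed-down value for service_account_roles; verify against
# your own project before relying on it
service_account_roles = [
  "roles/logging.logWriter",      # write function logs
  "roles/eventarc.eventReceiver", # receive the storage events
  "roles/run.invoker",            # let Eventarc invoke the gen-2 function's underlying service
]
```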
Each of these sections corresponds to gcloud commands (the command-line tool for modifying Google Cloud resources), and all of these settings can also be configured manually through the Google Cloud console website. As an IaC tool, Terraform abstracts all of this away.
Next, let’s add the actual Cloud Function Node.js code.
1. Create a new folder `src` in the `terraform` folder.
2. Create two new files, index.js and package.json, with the following content:
index.js:
```js
// The Functions Framework registers our handler under the name we used as
// the Terraform entry_point; the CloudEvent payload includes details such
// as the bucket and file name, which end up in the logs
const functions = require('@google-cloud/functions-framework');

functions.cloudEvent('fileStorageAlert', (cloudevent) => {
  console.log("cloud storage event");
  console.log(cloudevent);
});
```
package.json { "name": "file-storage-cloud-alert", "version": "0.0.1", "main": "index.js", "dependencies": { "@google-cloud/functions-framework": "^2.0.0" } }
Once you have added all the files and code, go ahead and close the editor and switch over to the terminal again. Run `terraform init` and then `terraform apply` (note: you can also run `terraform validate` before applying, but that is not strictly necessary). If everything is successful, you should see two new buckets in Cloud Storage and the cloud function deployed when you navigate to their respective product pages.
Finally, let’s test out the cloud function that we created. Open up the cloud shell editor, create a new file called `test.txt`, and populate it with some text. Then switch over to the terminal and run `gsutil cp test.txt gs://[your bucket name]`, where the bucket name is the input bucket that we created in the Terraform code. This command will upload the file to the specified bucket.
Now, if that upload is successful, you can navigate to the Cloud Functions product page, open the cloud function that was created, and view its logs, which will show the details of the file that was uploaded.
Wrap Up
Thanks for taking the time to read this walkthrough of how to use Terraform to create a Google Cloud Function that triggers upon file uploads to Google Cloud Storage and logs data accordingly. This process could streamline various workflows, such as real-time data processing in e-commerce, image processing on social media platforms, and data sanitization in larger processing pipelines.
By leveraging Terraform, we can define our cloud resources as code, enabling consistent and efficient infrastructure management. Using Google Cloud Shell Editor simplifies the setup by providing an integrated environment with all necessary tools pre-installed.
Here are the key steps we covered:
- Setting Up the Environment: Using Google Cloud Shell Editor to avoid local setup complications.
- Creating a Service Account: Ensuring the proper permissions are in place for our cloud function.
- Defining Infrastructure with Terraform: Writing Terraform configuration files to manage cloud resources.
- Writing the Cloud Function: Implementing a Node.js function to handle file upload events.
- Testing the Setup: Verifying the functionality by uploading a file and checking the logs.
This walkthrough illustrates the power and flexibility of combining Terraform with Google Cloud Platform, allowing you to automate and optimize your cloud infrastructure.
To learn more about these exciting technologies and references for the code in this walkthrough, please visit https://cloud.google.com/ and https://www.terraform.io/.
Thank you for following along with this tutorial. We hope it helps you streamline your workflows and harness the full potential of cloud automation. If you have any questions or need further assistance, feel free to reach out in the comments or connect with us through our contact page.