Building Event-Driven Cloud Functions on Google Cloud Platform
Google Cloud Platform (GCP) offers developers and organizations the ability to create and deploy serverless Cloud Run functions. These functions – written in Node.js, Python, C#, or another supported language – can interact with other parts of GCP’s infrastructure. By default, a function is triggered by a basic HTTPS request, but that is not our only option.
In this blog, we’ll introduce the concept of event-driven functions—a powerful way to automatically respond to changes across GCP services with minimal code. You’ll see how these functions can connect services like Cloud Storage, Firestore, and Pub/Sub, creating efficient workflows with less manual intervention.
What Are Event-Driven Functions?
An event-driven function is exactly what it sounds like: a function that runs automatically when a specific event occurs. These events can originate from various GCP services, such as:
- A file being uploaded to Cloud Storage
- A new document being written to Firestore
- A message being published to a Pub/Sub topic
- A custom event from another GCP service via Eventarc
These automated triggers help eliminate the need for constant polling or scheduled checks and instead allow your systems to react in real time.
Real-World Use Cases
There are many kinds of events an organization might want to respond to. Here are a few scenarios where event-driven functions are especially useful:
- Receipt Processing: For instance, a process might upload a file containing a customer’s receipt data to Cloud Storage, and we want to parse that information and save it to a database.
- Inventory Updates: Alternatively, Firestore (Google’s own document database) could trigger an event-driven function, such as when an inventory record is updated and another system needs to be notified.
With GCP’s built-in integrations, setting up these workflows is straightforward—making it easier than ever to build connected, reactive systems.
Setting Up an Event-Driven Cloud Function
When creating a new Cloud Run function in the GCP console, there is a “Trigger” section where you can specify the type of trigger that will call your function:
- Pub/Sub trigger – Function is triggered by a message published to a Pub/Sub topic (see the minimal sketch after this list).
- Cloud Storage trigger – Function is triggered when a storage object in a bucket is finalized (finishes uploading), archived, deleted, or has its metadata updated.
- Firestore trigger – Function is triggered when documents are created, updated, or deleted in GCP’s document-based Firestore database.
- Other Eventarc trigger – Function is triggered by another GCP service, a Google API method, or an event provided by a third party.
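As a quick illustration, here is a minimal sketch of a Pub/Sub-triggered function in Node.js. The entry point name helloPubSub is our own choice; Pub/Sub delivers the message payload base64-encoded in cloudEvent.data.message.data:

const functions = require('@google-cloud/functions-framework');

// Minimal sketch of a Pub/Sub-triggered function. The entry point
// name "helloPubSub" is illustrative. Pub/Sub delivers the message
// payload base64-encoded in cloudEvent.data.message.data.
functions.cloudEvent('helloPubSub', (cloudEvent) => {
  const message = cloudEvent.data.message;
  const payload = message.data
    ? Buffer.from(message.data, 'base64').toString('utf-8')
    : 'No message payload';
  console.log(`Received Pub/Sub message: ${payload}`);
});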
Permissions and API Access
Keep in mind that the function will require a service account with IAM permissions that correspond to the event that triggers it and the resources it calls. This can be configured in a menu that appears when the trigger type is selected; note that the default service account may not have the necessary permissions. For example, an Eventarc-based trigger generally requires the trigger’s service account to hold the Eventarc Event Receiver role, and the function will need additional roles for any resources it reads or writes (such as Cloud Storage or Firestore).
The Cloud Functions API will also have to be enabled for your project: https://cloud.google.com/functions/docs/reference/rest
Working With cloudEvent
Within the Cloud Run function itself, a cloudEvent parameter is available with attributes that reflect the event’s data and metadata. This parameter is the function’s primary input. For example:
- In Cloud Storage, cloudEvent.data contains information about the uploaded file.
- In Firestore, it includes details of the document before and after the change.
Developers may find cloudEvent’s metadata useful as well, such as the time a file was created/updated in a bucket. The structure of this parameter and the availability of other parameters (such as a StorageObjectData instance for a .NET function) depends on the language of the implementation.
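As a minimal sketch in Node.js (assuming a Cloud Storage trigger; the entry point name inspectEvent is our own choice), the standard CloudEvent attributes can be read directly from the parameter:

const functions = require('@google-cloud/functions-framework');

// Minimal sketch: log standard CloudEvent attributes along with
// Cloud Storage event data. Assumes a Cloud Storage trigger; the
// entry point name "inspectEvent" is illustrative.
functions.cloudEvent('inspectEvent', (cloudEvent) => {
  console.log(`Event ID: ${cloudEvent.id}`);
  console.log(`Event type: ${cloudEvent.type}`); // e.g. google.cloud.storage.object.v1.finalized
  console.log(`Event time: ${cloudEvent.time}`);
  console.log(`Bucket: ${cloudEvent.data.bucket}`);
  console.log(`File: ${cloudEvent.data.name}`);
  console.log(`Created: ${cloudEvent.data.timeCreated}, updated: ${cloudEvent.data.updated}`);
});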
Event-Driven Functions Examples
Here are a couple of examples of event-driven functions using Node.js, with comments provided to explain the steps in each trigger’s process:
Example 1: Cloud Storage Trigger (Node.js)
This code sample references the use case of saving information from an uploaded receipt file to a database (we’ll use Firestore in this case). It also logs a message with the created/updated timestamps of the receipt file to the console. The entry point of the function is called helloGCS, and we’ll assume environment variables have been defined for the Project ID and Bucket Name.
const functions = require('@google-cloud/functions-framework');
const {Firestore} = require('@google-cloud/firestore');
const {Storage} = require('@google-cloud/storage');

const projectId = process.env.PROJECT_ID;
const firestore = new Firestore({projectId});
const storage = new Storage({projectId});

functions.cloudEvent('helloGCS', async (cloudEvent) => {
  // Retrieve and parse the receipt's JSON data from the storage bucket,
  // using the filename provided by the cloudEvent parameter.
  // download() resolves to a one-element array containing the file contents.
  const [receiptData] = await storage
    .bucket(process.env.BUCKET_NAME)
    .file(cloudEvent.data.name)
    .download();
  const receiptDataParsed = JSON.parse(receiptData.toString('utf-8'));

  // Use the cloudEvent parameter to log timing metadata.
  console.log(
    `Uploaded file for Receipt ${receiptDataParsed.docId} created at ` +
    `${cloudEvent.data.timeCreated}, updated at ${cloudEvent.data.updated}`
  );

  // Save the receipt data to a Firestore collection.
  const docRef = firestore.collection('Receipts').doc(receiptDataParsed.docId);
  await docRef.set(receiptDataParsed);
});
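In the console, this function would be attached to the bucket’s finalized event so that it runs each time an upload completes.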
Example 2: Firestore Trigger (Node.js)
This code sample references the use case of notifying an external system of a change in inventory from an update to a Firestore database. The example is adapted from the default code GCP provides for this trigger, which uses protobufjs to deserialize the data from Firestore (the data.proto file being loaded defines the Firestore document event structure being referenced). The entry point of the function is called helloFirestore, and we’ll use axios to send the inventory’s change in quantity to the system we want to notify.
const functions = require('@google-cloud/functions-framework');
const protobuf = require('protobufjs');
const axios = require('axios');

functions.cloudEvent('helloFirestore', async (cloudEvent) => {
  // Load the proto definition that describes Firestore document events.
  const root = await protobuf.load('data.proto');
  const DocumentEventData = root.lookupType(
    'google.events.cloud.firestore.v1.DocumentEventData'
  );

  // Decode the data of the inventory Firestore document that changed.
  const inventoryEvent = DocumentEventData.decode(cloudEvent.data);

  // Create an object with the old and new values of the inventory's quantity.
  // Firestore document fields are wrapped in typed Value objects, so the
  // quantity is read from fields.qty.integerValue.
  const qtyChange = {
    previous: inventoryEvent.oldValue.fields.qty.integerValue,
    current: inventoryEvent.value.fields.qty.integerValue,
  };

  // Notify another system of the inventory change. The target URL was left
  // blank in the original; supply the endpoint of the system to notify.
  try {
    const response = await axios.post('', qtyChange);
    console.log('Success:', response.data);
  } catch (error) {
    console.error('Error:', error);
  }
});
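Note that data.proto and its dependencies ship with the default sample code GCP generates for this trigger; the definitions can also be found in Google’s googleapis/google-cloudevents repository.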
Conclusion
Event-driven functions can save developers significant time when building processes that are kicked off by other GCP services. Whether we’re coding around uploads to Cloud Storage, changes to a Firestore document, or some other trigger, this offering provides a direct way of connecting Google’s building blocks into a system with a robust event architecture.
To learn more, see Google’s documentation on writing event-driven functions:
https://cloud.google.com/functions/docs/writing/write-event-driven-functions
If you’d like to learn more from us at Keyhole on related Google Cloud Platform topics, you might be interested in the following posts: