There are SDKs available for a number of languages, but not for Apps Script. Since Google Cloud Storage is neither a built-in nor an advanced Apps Script service, for these examples I’ll be using an Apps Script library that uses the GCS JSON API behind the scenes. Once set up, reading and writing objects to cloud storage is as simple as this:
// read an object back from cloud storage by its key
var value = handle.get(key);
// write an object to cloud storage under a key
handle.put(key, value);

The motivation

This is a general-purpose library for simplifying access to storage, which can be quite complex with the vanilla API, especially as it introduces some concepts a little different from the ones found in Apps Script. My motivation started with some limitations I find in the Properties and Cache services of Apps Script.
In summary, these are:
  • The size of an individual object is limited in both the Properties and Cache services.
  • The overall space available in a Properties store is limited.
  • The scope of the data in a store is limited to Document, User or Script – all within a single script. This means that sharing across scripts, or tracking anonymous users across scripts, is not possible, and you can’t see what’s been written to a user property, which can make debugging difficult.
  • Only strings can be written to the store, so anything else has to be serialized by hand (as sketched below).
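To illustrate that last point, here’s a minimal sketch of the usual workaround with the built-in Properties service: anything that isn’t a string has to go through JSON.stringify on the way in and JSON.parse on the way out, and blobs can’t sensibly be stored at all. The property key and the settings object are just examples.

// the built-in Properties service only accepts strings,
// so objects must be serialized and parsed by hand
var props = PropertiesService.getScriptProperties();

var settings = { theme: 'dark', retries: 3 };
props.setProperty('settings', JSON.stringify(settings));   // serialize on write

var restored = JSON.parse(props.getProperty('settings'));  // parse on read
Logger.log(restored.theme);                                // dark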
By using Cloud Storage as an alternative to both the Cache and Properties services (and, as a bonus, having a simple library to interact with other data in Cloud Storage), I can solve all of these issues. However, it does introduce some new ones, so it may not be suitable for every case.
  • Google Cloud Storage is not free. It is very cheap, but it’s not free – so you have to enable billing.
  • The Apps Script UrlFetch quota is only 100 MB a day. By removing the limit on the size of data you can write to the store, you introduce a different problem – that of running up against the UrlFetch daily limit. I’ve long argued that the quotas for UrlFetch have not kept up with the increasing number of APIs available and the sophistication of Apps Script use cases. If you agree, then star this issue to try to improve the situation. In the library, I’ve limited the maximum object size that can be written to 5MB, but you can soon hit your quota with too many of these.
  • The performance of the Properties and Cache services is very good – about 60-100 ms is typical – and it’s obviously very well integrated into Apps Script. Accessing Cloud Storage via the JSON API takes around 200-400 ms. I’m using the cheapest cross-regional storage, so you can of course select better and more local storage to improve on this.
  • You have to use OAuth2 and manage a project to contain your storage buckets. Luckily you can use Service Accounts, and my cGoa library makes it painless (as sketched below).
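As an illustration of that last point, here’s a minimal sketch of the kind of service-account setup cGoa supports. The package name 'gcs_service', the scope and the key-file id are placeholders, and option names can differ between cGoa versions, so treat this as an assumption to check against the cGoa documentation rather than a drop-in.

// one-off: register a cGoa package describing the service account,
// reading the downloaded JSON key file from Drive (run once)
function oneOffSetting() {
  cGoa.GoaApp.setPackage(PropertiesService.getScriptProperties(),
    cGoa.GoaApp.createServiceAccount(DriveApp, {
      packageName: 'gcs_service',                                   // placeholder name
      fileId: 'xxxx',                                               // Drive file id of the JSON key
      scopes: ['https://www.googleapis.com/auth/devstorage.read_write'],
      service: 'google_service'                                     // cGoa's service-account flavor
    }));
}

// later: pick up an access token whenever one is needed
function getAccessToken() {
  var goa = cGoa.GoaApp.createGoa('gcs_service',
      PropertiesService.getScriptProperties()).execute();
  if (!goa.hasToken()) throw new Error('no token for gcs_service package');
  return goa.getToken();
}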
We get other benefits too:
  • You can devise any scope visibility scheme you like to share across scripts, users, documents or any group you can think of.
  • The data is visible and accessible via the Google Cloud browser, so you can easily examine what’s in each store, and use all the cloud storage tools to set very granular access rules.
  • You can write any kind of data, including blobs, and the mime type is preserved in the storage, so it can be handled appropriately by the browser (for example, an image will be displayed as an image) – see the sketch after this list.
  • Because it’s in Cloud Storage in its native format, the same data can be shared with other languages and systems that know how to access GCS too.
  • I’ve built in the capability to enable automatic zipping to minimize storage occupied.
  • Because this is abstracted, I can easily extend the back end from Google Cloud Storage to other solutions such as Azure or Amazon Web Services, but still use the same code within Apps Script via the library.
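As an illustration of the blob point above, here’s a minimal sketch using only the get and put calls shown earlier, and assuming – as the list says – that put accepts a blob and preserves its mime type. The key name and image URL are just examples.

// fetch an image and store it as a blob – the mime type travels with it
var blob = UrlFetchApp.fetch('https://example.com/logo.png').getBlob();
handle.put('images/logo', blob);

// read it back later – still an image, so a browser can render it as one
var image = handle.get('images/logo');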
So that’s the background; let’s move on to how to get started. If you want to skip all of that, you can go straight to the GcsStore examples.