Another quick demo of data sharing

There are many ways to accomplish this of course, but using caching across multiple Apps Script and Node projects with a selection of back end platforms is pretty fast to set up. To test this, here’s a challenge that shares the data in a spreadsheet with Node, set up end to end in less than 5 minutes. This example is almost exactly the same as Sharing data with Upstash between Node and Apps Script – end to end example in 5 minutes, and Sharing data between Apps Script and Node using Github back end – 5 minute example, except this time, instead of using Upstash (redis) or GitHub, we’re going to use Google Cloud Storage. Cloud Storage allows a much bigger payload, so it’s a good choice (there will be less splitting across multiple records), but it does mean you have to create a Cloud project and enable billing (although the free tier is very generous).


GCS credentials

I’ll assume you’ve already set up a service account for the Apps Script version. If not, take a look at Google Cloud Storage platform cache plugin for Apps Script crusher. You can use the same service account for the Node end. You’ll also need to create a Cloud Storage bucket in the Cloud console to use for your store.
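If you prefer the command line to the console, creating the bucket is a one-liner with gsutil (the location here is just an example – use whichever region suits you):

```
gsutil mb -l us-central1 gs://bmcrusher-test-bucket-store
```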

cloud storage bucket creation

Create an Apps Script project, and find and add the necessary libraries

We can use scrviz to quickly find the library keys, which are normally a pain to find

  • find and add bmCrusher

  • find and add bmPreFiddler

Oauth2 for Apps Script

You’ll also need to add the goa library and setup the credentials. See Google Cloud Storage platform cache plugin for Apps Script crusher for how to do that.
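As a reminder of the general shape (this is just a sketch – see that post for the full details, and note the fileId below is a placeholder for your own service account .json file on Drive), registering the credentials as a goa package is a one-off operation something like this:

```
// one-off: store the service account credentials as a goa package
// 'gcs_cache' matches the package name used later by cGoa.make
const oneOffSetting = () => {
  cGoa.GoaApp.setPackage(PropertiesService.getUserProperties(),
    cGoa.GoaApp.createServiceAccount(DriveApp, {
      packageName: 'gcs_cache',
      // placeholder - the Drive file id of your downloaded service account .json
      fileId: 'xxxx-your-service-account-json-file-id',
      scopes: cGoa.GoaApp.scopesGoogleExpand(['devstorage.read_write']),
      service: 'google_service_account'
    }))
}
```

Run it once, then delete or comment it out so the credentials stay only in the property store.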

Source data

We’re going to get a sheet from this spreadsheet that contains a bunch of stuff about the main airports worldwide.

So all we need is the spreadsheetId and the name of the sheet – get the data, and write it to the Cloud Storage store.

Code for getting spreadsheet data and writing to cache

const testShareGcs = () => {

  // get the airports sheet
  const fiddler = bmPreFiddler.PreFiddler().getFiddler({
    id: '1h9IGIShgVBVUrUjjawk5MaCEQte_7t32XeEP1Z5jXKQ',
    sheetName: 'airport list',
    createIfMissing: false,
  })

  const goa = cGoa.make('gcs_cache', PropertiesService.getUserProperties())

  // we can just re-use the cache plugin service as the cloud storage library has the same methods
  const crusher = new bmCrusher.CrusherPluginGcsService().init({
    bucketName: 'bmcrusher-test-bucket-store',
    prefix: "/crusher/store",
    tokenService: () => goa.getToken(),
    uselz: true
  })

  // put the data to the store so that node can get it
  // since it's just a test we'll expire it after 300 seconds - 5 minutes
  crusher.put('airports', fiddler.getData(), 300)
}

get the data and write it to the crusher Cloud Storage store

Note that the only difference between backends is how they are initialized. And that’s all there is on the Apps Script side.

Cloud storage

Using the bucket you set up in the cloud storage console, here’s how it looks with some crusher data in it. For demonstration, I used a smaller chunksize so it would split up the object into multiple chunks, but using the plugin’s default chunksize will make it unlikely that the data will overflow into multiple chunks. In any case, that all happens behind the scenes and you don’t need to care about it.

cloud storage chunks
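To make the chunking idea concrete, here’s a simplified sketch in plain Node (not the actual plugin code – the real crusher also compresses the payload and keeps metadata pointing at the child chunks) of how a large serialized payload can be split into fixed-size pieces and reassembled:

```javascript
// split a serialized payload into fixed-size chunks
const chunkify = (str, chunkSize) =>
  Array.from({ length: Math.ceil(str.length / chunkSize) }, (_, i) =>
    str.slice(i * chunkSize, (i + 1) * chunkSize)
  );

// a payload bigger than the pretend chunk size
const payload = JSON.stringify({ airports: ["VNO", "CPT", "LHR", "JFK"] });
const chunks = chunkify(payload, 16);

// each chunk would be written as a separate object in the bucket;
// joining them back together restores the original payload
console.log(chunks.length > 1);           // true
console.log(chunks.join("") === payload); // true
```

The default chunksize is large enough that most payloads fit in a single object, so in practice this splitting rarely kicks in.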

Node client

The Node client is even briefer, but first install the bmcrusher-node module using either yarn or npm:

yarn add bmcrusher-node
npm install bmcrusher-node

add from npm or yarn


Here’s the code

I’m picking up the service account from a file. This is the same service account .json file you would have downloaded to Drive for the Apps Script end. There are a number of strategies in Node for handling Google Cloud credentials, so pick the one you are most comfortable with.

const { CrusherPluginGcsService } = require("bmcrusher-node");
const { getGcpCreds } = require("./private/secrets");

const gcsCrusher = new CrusherPluginGcsService().init({
  tokenService: () => getGcpCreds(),
  prefix: "/crusher/store",
  bucketName: "bmcrusher-test-bucket-store",
});

const testAirports = async (crusher) => {
  const airports = await crusher.get('airports')
  console.log(airports)
}

testAirports(gcsCrusher)

get data into node


And we’re done – in less than 5 minutes –

[
  {
    name: 'Vilnius International Airport',
    latitude_deg: 54.6341018676757,
    longitude_deg: 25.2858009338378,
    elevation_ft: 646,
    iso_country: 'LT',
    iso_region: 'LT-VL',
    municipality: 'Vilnius',
    iata_code: 'VNO',
    timestamp: ''
  },
  {
    name: 'Cape Town International Airport',
    latitude_deg: -33.9648017883,
    longitude_deg: 18.6016998291,
    elevation_ft: 151,
    iso_country: 'ZA',
    iso_region: 'ZA-WC',
    municipality: 'Cape Town',
    iata_code: 'CPT',
    timestamp: ''
  },
  ... 456 more items
]
data from apps script airports

Of course you could process the data in some way if you want, write it back for Apps Script to pick up, clean up the store, and so on.
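For example, a Node-side job could filter the shared data and write a derived result back under a new key for Apps Script to collect. The sketch below uses a hypothetical in-memory stand-in for the store (so it runs without GCS credentials); in real use the crusher from bmcrusher-node plays that role, and the key name 'airports-lt' is made up for this example:

```javascript
// a tiny in-memory stand-in for the crusher store, for illustration only
const makeStubCrusher = () => {
  const store = new Map();
  return {
    put: async (key, value) => { store.set(key, value); },
    get: async (key) => store.get(key),
    remove: async (key) => { store.delete(key); },
  };
};

// read the shared airport list, keep only Lithuanian airports,
// and write the result back under a new key
const processAndReturn = async (crusher) => {
  const airports = await crusher.get("airports");
  const lithuanian = airports.filter((a) => a.iso_country === "LT");
  await crusher.put("airports-lt", lithuanian);
  return lithuanian;
};

const demo = async () => {
  const crusher = makeStubCrusher();
  // seed the store as the Apps Script side would have done
  await crusher.put("airports", [
    { name: "Vilnius International Airport", iso_country: "LT" },
    { name: "Cape Town International Airport", iso_country: "ZA" },
  ]);
  const result = await processAndReturn(crusher);
  console.log(result.length);  // 1
  console.log(result[0].name); // Vilnius International Airport
};

demo();
```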


scrviz Apps Script bmCrusher:

scrviz Apps Script bmPreFiddler:

scrviz Apps Script cGoa:

github bmcrusher-node: