If you use caching in Apps Script (and you should), you might hit the 100KB limit on the size of a single cache entry. I’ve written about how to get round that by writing several cache entries and linking them to a single key, but another (better) approach is to zip (compress) the data before writing it out. It’s a bit tricky to get all the blobs, base64s and zips sorted out, but here’s how to do it.

Let’s say you have a large array of objects you want to write to the cache. You can write it like this

cache.put(cacheKey, Utilities.base64Encode(Utilities.zip([Utilities.newBlob(JSON.stringify(yourObject))]).getBytes()));

and read it back again like this

var cached = cache.get(cacheKey);
if (cached) {
  var yourObject = JSON.parse(
    Utilities.unzip(Utilities.newBlob(Utilities.base64Decode(cached), 'application/zip'))[0].getDataAsString());
}

JSON is pretty compressible, so you’re likely to get a 60-70% reduction in size.
You can of course play this trick with the Properties Service too, which has a much smaller limit on property value size (9KB) than the Cache Service’s 100KB.

Library version

Since this is a bit of a handful, it is implemented in my cUseful library.

Here’s the key for the cUseful library; it’s also on GitHub, or below.

Mcbr-v4SsYKJP7JMohttAZyz3TLx7pV4j

And the above code could be rewritten as follows using crush and uncrush.

cache.put(cacheKey, cUseful.Utils.crush(JSON.stringify(yourObject)));

var cached = cache.get(cacheKey);
if (cached) {
  var yourObject = JSON.parse(cUseful.Utils.uncrush(cached));
}

that’s all folks…

For more like this see Google Apps Scripts Snippets
Why not join our forum, follow the blog or follow me on twitter to ensure you get updates when they are available.