In Google Caching and faking jsonP, I showed how to use the Google cache to avoid multiple calls for the same JSON data. One of the things I discovered is that the key in the Google cache service has an undocumented maximum length, so the challenge is to come up with a unique key that will not break that limit, given a URL of unpredictable length.
One way to shorten the key is to use a cryptographic hash function. Google Apps Script has a variety of these built in. I selected the SHA_1 algorithm since it is reasonably collision resistant, meaning it is unlikely that two different inputs will hash to the same result.
So here is a function to shorten a key to a manageable size.
function shortKeyHash(input) {
  return Utilities.base64Encode(
      Utilities.computeDigest(Utilities.DigestAlgorithm.SHA_1, input));
}
Testing.
Here’s a quick test using the Google caching service. We’ll generate some strings of random length and contents, shorten their keys, and write them to the cache. You can use this technique to shorten any string yet keep its uniqueness, although this post focuses on its use for Google caching.
function kTest() {
  var cache = CacheService.getPublicCache();
  var loops = 1000;
  var kl = 0;
  var il = 0;
  for (var i = 0; i < loops; i++) {
    // get some random string to test
    var input = mcpher.arbritraryString(mcpher.randBetween(1, 500));
    // shorten it
    var key = mcpher.shortKeyHash(input);
    // it shouldn't already be known
    mcpher.DebugAssert(!cache.get(key));
    // put it to a 5 second cache
    cache.put(key, input, 5);
    // for reporting
    il += input.length;
    kl += key.length;
  }
  Logger.log("average data length:" + il / loops +
      " average key length:" + kl / loops);
}

The shortKeyHash() function, plus a few others used in the test, are available in the mcpher public library.
I forgot to mention: with an average input length of about 250 in this test, the shortened keys came out at a fixed 28 characters (the base64 encoding of a 20-byte SHA-1 digest is always 28 characters).