I have a Node project that looks at Google Cloud Storage and figures out the dimensions of images stored there. This is handy for an API that needs to serve up the right image for the right device. I needed the same capability in Apps Script, but the Node.js code uses some things that are not available in Apps Script. I could have spun up an HtmlService and done it using the DOM, but since the dimensions are buried in an image's encoding, I figured it must be possible to play around with an image blob instead. I came across this project from Kanshi TANAIKE, which is more extensive than I need, but it did contain the code to extract the dimensions from various image formats.
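To give a flavour of what "dimensions buried in the encoding" means, here's a minimal sketch in plain JavaScript of reading a PNG's dimensions straight from its bytes. A PNG starts with an 8-byte signature followed by the IHDR chunk, whose width and height are big-endian 32-bit integers at byte offsets 16 and 20. This is only an illustration of the technique – not the library's actual code.

```javascript
// Sketch: read the dimensions of a PNG straight from its bytes
// (e.g. the array returned by blob.getBytes() in Apps Script).
function getPngDimensions(bytes) {
  // every PNG begins with this fixed 8-byte signature
  var signature = [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a];
  if (!signature.every(function (b, i) { return bytes[i] === b; })) {
    throw new Error('not a PNG');
  }
  // big-endian 32-bit read
  var readUint32 = function (offset) {
    return ((bytes[offset] << 24) | (bytes[offset + 1] << 16) |
            (bytes[offset + 2] << 8) | bytes[offset + 3]) >>> 0;
  };
  // IHDR: 4-byte length + "IHDR" tag, then width and height
  return { width: readUint32(16), height: readUint32(20) };
}
```

Each supported format needs its own variant of this, which is exactly what the borrowed code provides.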

It’s now in the Images namespace of my cUseful library – 1EbLSESpiGkI3PYmJqWh3-rmLkYKAtCNPi1L2YCtMgo2Ut8xMThfJ41Ex

Example

cUseful.Images.getInfo(blob) will work on any blob containing a supported image, so if that’s all you need you can stop here. For a more substantial demo of a workflow, I’m going to do all of this in a few lines of code:

  • Use a service account to authenticate to Google Cloud Storage, using my goa library.
  • Get a list of objects in a cloud storage folder, and get their contents as a blob using my gcsstore library.
  • Get info about the images using the new namespace (Images) in my cUseful library.
  • Populate a sheet with info about each of the found images.

The test uses the same image saved in JPEG, BMP, GIF and PNG formats. For the full example, you’ll need these libraries:

  • cUseful – 1EbLSESpiGkI3PYmJqWh3-rmLkYKAtCNPi1L2YCtMgo2Ut8xMThfJ41Ex
  • cGoa – 1v_l4xN3ICa0lAW315NQEzAHPSoNiFdWHsMEwj2qA5t9cgZ5VWci2Qxv2
  • cGcsStore – 1w0dgijlIMA_o5p63ajzcaa_LJeUMYnrrSgfOzLKHesKZJqDCzw36qorl
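Each of those formats stores its dimensions differently. A GIF, for instance, keeps them right after the 6-byte `GIF87a`/`GIF89a` signature, as little-endian 16-bit values at byte offsets 6 and 8. Here's a sketch of that case in plain JavaScript – again an illustration of the kind of parsing getInfo does, not its actual implementation.

```javascript
// Sketch: read the logical screen width/height from a GIF's header bytes.
function getGifDimensions(bytes) {
  // first 6 bytes are the ASCII signature
  var sig = String.fromCharCode.apply(null, bytes.slice(0, 6));
  if (sig !== 'GIF87a' && sig !== 'GIF89a') {
    throw new Error('not a GIF');
  }
  // width and height are little-endian 16-bit values
  return {
    width: bytes[6] | (bytes[7] << 8),
    height: bytes[8] | (bytes[9] << 8)
  };
}
```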

Setting up a service account to access Cloud Storage

  • Create a service account in the project containing the Cloud Storage bucket you’ll use, give it the Storage Admin role, download the .json credentials file to Drive and grab its file ID.
  • Run this code once to register the service account in your script, substituting your own Drive file ID. You can delete the function after it has run.
function oneOffgcsDownload() {

  // the property store used by everything in this script
  var propertyStore = PropertiesService.getScriptProperties();

  // register the service account package for cloud storage download
  cGoa.GoaApp.setPackage(propertyStore,
    cGoa.GoaApp.createServiceAccount(DriveApp, {
      packageName: 'gcs_download',
      fileId: '15Nxxxxxxxxxx_6FFD4rKA',
      scopes: cGoa.GoaApp.scopesGoogleExpand(['devstorage.read_only']),
      service: 'google_service'
    }));
}

Create the demo.

This example does the whole thing, starting from this in Cloud Storage

and finishing with this in a sheet.

The embedded comments should explain the steps.

function testImages() {
  // set up oauth - I'm using a service account with storage admin role
  // that I've previously set up and called gcs_download
  const goa = cGoa.make('gcs_download', PropertiesService.getScriptProperties());

  // get a handle for cloud storage
  // my data is in a bucket called test-bucket
  // in a folder called testimages
  const store = new cGcsStore.GcsStore()
    .setAccessToken(goa.getToken())
    .setBucket('test-bucket')
    .setExpiryLog(false);

  // get the sheet I'm writing the result to
  const sheet = SpreadsheetApp
    .openById('1xxxxxxxxxxx0A')
    .getSheetByName('testimages');

  // get the list of files in the folder
  const files = store
    .setFolderKey('testimages')
    .getObjects();

  // now we have the fully qualified names, we can set the folder key back to root
  // filter out any non-images in the folder
  // and get the blobs and info for each one
  store.setFolderKey('');
  const info = files
    .filter(function (d) {
      return d.contentType.match(/^image\//);
    })
    .map(function (d) {
      return store.get(d.name);
    })
    .map(function (d) {
      return cUseful.Images.getInfo(d);
    });

  // now write all that to the sheet after clearing it
  // we don't need the blob back, so filter out that column
  new cUseful.Fiddler(sheet.clearContents())
    .setData(info)
    .filterColumns(function (name) {
      return name !== 'blob';
    })
    .dumpValues();
}
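The Fiddler step at the end can be pictured in plain JavaScript: given an array of info objects, drop the unwanted column and flatten the rest into the header-plus-rows 2D array that ends up on the sheet. This is just a sketch of the shape of the data – not the Fiddler API itself.

```javascript
// Sketch: turn an array of row objects into sheet-style values,
// dropping any columns named in dropColumns.
function toSheetValues(info, dropColumns) {
  // headers come from the first object's keys, minus the dropped ones
  var headers = Object.keys(info[0]).filter(function (name) {
    return dropColumns.indexOf(name) === -1;
  });
  // each object becomes a row in header order
  var rows = info.map(function (row) {
    return headers.map(function (h) { return row[h]; });
  });
  return [headers].concat(rows);
}
```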
For more like this, see Google Apps Scripts Snippets.

For help and more information, join our community, follow the blog or follow me on Twitter.