
Inserting rows with the Google Apps Script FusionTables service is pretty slow, and there are a bunch of quotas to worry about. Here's what the documentation says:

Note: You can list up to 500 INSERT statements, separated by semicolons, in one request as long as the total size of the data does not exceed 1 MB and the total number of table cells being added does not exceed 10,000 cells. If you are inserting a large number of rows, use the import method instead, which will be faster and more reliable than using many SQL INSERT statements.
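Those documented limits can be expressed as a simple check. This is a hypothetical helper of my own (the constant and function names are mine, not from the API), sketching how you might test whether a batch of insert statements stays within all three quotas:

```javascript
// Documented FusionTables batching limits
var MAX_STATEMENTS = 500;          // statements per request
var MAX_BYTES = 1024 * 1024;       // 1 MB of SQL text
var MAX_CELLS = 10000;             // table cells added per request

// Returns true if this batch of INSERT statements stays inside the quotas.
// cellsPerRow is the number of columns each insert writes.
function fitsQuota(statements, cellsPerRow) {
  // the request body is the statements joined by semicolons, plus a trailing one
  var bytes = statements.join(";").length + 1;
  return statements.length <= MAX_STATEMENTS &&
         bytes <= MAX_BYTES &&
         statements.length * cellsPerRow <= MAX_CELLS;
}
```

In practice, as described below, the real ceiling seems to sit well under these documented numbers.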

However, I can't get anywhere near those limits without hitting the 'server error' problem. For the moment, I'm still trying to find the optimum combination, but here's how to chunk up the inserts. Note that I'm also using Backing off on rate limiting to deal with rate limiting of Fusion requests. Assuming that the array q is full of insert statements, you'll need to tweak these constants to get the best batching.

var MAXCHUNKSIZE = 1000;   // max total characters of SQL per batch
var MAXINSERTS = 50;       // max insert statements per batch

// save it
var p = 0;
while (p < q.length) {
  var chunk = [];
  // this is about batching insert statements, but there are lots of quotas -
  // we always do at least one, and let fusion deliver the size news if one transaction is too big
  for (var chunkSize = 0;
       chunk.length === 0 ||
         (chunkSize < MAXCHUNKSIZE && chunk.length < MAXINSERTS && p < q.length);
       p++) {
    chunkSize += q[p].length;
    chunk.push(q[p]);
  }
  var sqlString = chunk.join(";") + ";";
  // write it, retrying with exponential backoff if we get rate limited
  var r = parentHandler.rateLimitExpBackoff(function () {
    return FusionTables.Query.sql(sqlString);
  });
}
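The rateLimitExpBackoff helper above comes from the Backing off on rate limiting post. As a rough sketch of the idea (this is my own simplified version, not the original implementation - the names expBackoff, maxAttempts and sleepFn are assumptions), it retries the callback with exponentially growing waits between attempts:

```javascript
// Simplified exponential backoff sketch. In Apps Script the sleepFn
// argument would be Utilities.sleep; it is injectable here for testing.
function expBackoff(callback, maxAttempts, sleepFn) {
  maxAttempts = maxAttempts || 5;
  sleepFn = sleepFn || function () {};
  for (var attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return callback();
    } catch (err) {
      // give up after the last attempt
      if (attempt === maxAttempts - 1) throw err;
      // wait 2^attempt seconds plus random jitter before retrying
      sleepFn(Math.pow(2, attempt) * 1000 + Math.round(Math.random() * 1000));
    }
  }
}
```

The jitter matters: if several triggers fail at the same moment, randomizing the wait stops them all retrying in lockstep and tripping the rate limit again.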
For more like this, see Google Apps Scripts Snippets.
For help and more information, join our forum, follow the blog, or follow me on Twitter.