Looking into how various operations perform as the size of the sheet increases produced some interesting data. The objective was to see whether the time to execute particular operations is directly proportional to the size of the sheet.

#### The test

A scheduled task ran every minute, reading, writing, inserting and deleting data in a sheet while gradually increasing the number of rows. It started at 1000 rows (20 columns), and each run deleted 100 rows and inserted 250 – a net increase of 150 rows per iteration. Since there is some kind of caching involved behind the scenes, the data was randomized to ensure that it actually got rewritten, and the test was repeated with and without a flush operation between each data manipulation.
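The growth arithmetic above can be sketched in plain JavaScript (a sketch for illustration only, not part of the original test code):

```javascript
// Row count after a given number of iterations: start at 1000 rows,
// then each iteration deletes 100 rows and inserts 250 - a net gain of 150.
function rowCountAfter(iterations) {
  var start = 1000, added = 250, deleted = 100;
  return start + iterations * (added - deleted);
}
// e.g. rowCountAfter(0) === 1000, rowCountAfter(10) === 2500
```

So after roughly an hour of one-minute runs the sheet has grown by about 9000 rows, which is what makes the scaling behaviour visible.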

#### The results

Here’s the first result, with no flushing.

#### Description

| operation | what it does |
| --- | --- |
| open | opens the workbook and sheet |
| read | reads all the data in one go |
| insert | inserts 250 rows |
| write | writes data to 250 rows |
| deleterows | deletes 100 rows |
| reread | reads all the data again |
| rewrite | writes all the data |

#### Findings

The rewrite operation is by far the most expensive, and its cost is not proportional to the number of records. The other operations scale reasonably well. There is a strange pattern of a spike followed by a gradual increase until the next spike. Further analysis – this time normalizing by dividing the time by the number of rows to give an operation time per row – makes this clearer: if an operation were strictly linear, its per-row time would be a flat line. Instead, it highlights spikes every 1000 rows, not just for the rewrite operation (although there it is most pronounced), but for the others too.
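The normalization step can be sketched as follows, assuming a stats object shaped like the one the test code builds (operation timings in milliseconds plus a `rows` count):

```javascript
// Divide each raw operation timing (ms) by the number of rows processed,
// giving a per-row cost; a strictly linear operation would plot flat.
function normalizeStats(stats) {
  var perRow = {};
  Object.keys(stats).forEach(function (key) {
    if (key !== 'rows' && key !== 'columns') {
      perRow[key] = stats[key] / stats.rows;
    }
  });
  return perRow;
}
// e.g. normalizeStats({rows: 2000, rewrite: 5000}).rewrite === 2.5
```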

Now with flushing – the same pattern of spikes every 1000 records appears. Clearly some kind of housekeeping is done every now and again – watch this space for more analysis.
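One way to picture the with/without-flush variants is a small timing wrapper – a hedged sketch rather than the original code; `SpreadsheetApp.flush()` is the Apps Script call that applies pending spreadsheet changes before the timer is read:

```javascript
// Time one test step, optionally forcing pending spreadsheet changes
// to be applied (SpreadsheetApp.flush() only exists inside Apps Script)
// before the elapsed time is recorded under stats[name].
function timeStep(stats, name, flush, fn) {
  var start = new Date().getTime();
  fn();
  if (flush) SpreadsheetApp.flush();
  stats[name] = new Date().getTime() - start;
  return stats;
}
```

Without the flush, timings can reflect Apps Script's deferred writes rather than the true cost of each operation, which is why the test was run both ways.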

#### The Code

Here are the tests, triggered every minute.
```javascript
/**
* generate a sheet full of random data
*/
function generateData() {

  // get some random data
  var randomData = cUseful.getRandomSheetStrings(1000, 20, 2, 20);

  // get the data sheet (placeholder ID and name - substitute your own)
  var sheet = SpreadsheetApp.openById('YOUR_SPREADSHEET_ID').getSheetByName('data');

  // clear it
  sheet.clear();
  if (sheet.getLastRow() > 1) {
    sheet.deleteRows(2, sheet.getLastRow() - 1);
  }
  // write it
  sheet.getRange(1, 1, randomData.length, randomData[0].length).setValues(randomData);

}
function doThingsFlush() {
  doThings(true, 'logflush');
}
function doThingsNoFlush() {
  doThings(false, 'lognoflush');
}

/**
* do some things on the sheet
*/
function doThings(flush, name) {

  // how many rows to add / delete each time
  var addRows = 250, delRows = 100;

  var now = new Date().getTime();
  var stats = {rows: 0};
  // get the data sheet (placeholder ID and name - substitute your own)
  var sheet = SpreadsheetApp.openById('YOUR_SPREADSHEET_ID').getSheetByName('data');
  stats.open = new Date().getTime() - now;
  now = new Date().getTime();

  var values = sheet.getDataRange().getValues();
  stats.read = new Date().getTime() - now;
  now = new Date().getTime();

  // insert some rows
  var range = sheet.getRange(Math.floor(values.length / 2) + 1, 1, addRows, values[0].length);
  sheet.insertRowsAfter(range.getRow() - 1, range.getNumRows());
  stats.insert = new Date().getTime() - now;
  now = new Date().getTime();

  var data = cUseful.getRandomSheetStrings(range.getNumRows(), range.getNumColumns(), 2, 20);
  range.setValues(data);
  stats.write = new Date().getTime() - now;
  now = new Date().getTime();

  // delete some rows
  range = sheet.getRange(Math.floor(values.length / 3) + 1, 1, delRows, values[0].length);
  sheet.deleteRows(range.getRow(), range.getNumRows());
  stats.deleterows = new Date().getTime() - now;
  now = new Date().getTime();

  data = sheet.getDataRange().getValues();
  stats.reread = new Date().getTime() - now;
  // generate different data to avoid any caching effects
  data = cUseful.getRandomSheetStrings(data.length, data[0].length, 2, 20);
  now = new Date().getTime();

  // rewrite the data
  range = sheet.getRange(1, 1, data.length, data[0].length);
  range.setValues(data);
  stats.rewrite = new Date().getTime() - now;
  stats.rows = data.length;
  stats.columns = data[0].length;

// the results.