In Stream content to Google Cloud Storage I covered how you could stream content (in this case a video file) from a hosted URL to an object on cloud storage in preparation for labelling it with the Video Intelligence API.
Let’s see how to generalize that a bit so it can be used for all reading and writing of content to and from Cloud Storage.
Using streaming abstracts away the size of the content you are trying to transfer, and also whether it lives in memory in your app or comes from a file, a request or some other piped resource. So I’ll be using streaming for all interactions with Cloud Storage.

General usage

Let’s say we have some content to write to storage. We want to be able to express it as simply as this.
const storageStream = require('./storagestream');
const result = await storageStream.streamContent({ name, content });

Storagestream

And here’s the streamContent function. Objects are serialized to JSON before streaming, and stream errors are surfaced through the promise.
/**
* convert a string or object to a stream and write it to storage
* @param {string} name the target object name
* @param {string|object} content the content to write
*/
const streamContent = async ({ name, content }) => {
  const writeStream = await createWriteStream({ name });
  return new Promise((resolve, reject) => {
    const str = typeof content === 'object' ? JSON.stringify(content) : content;
    sts(str)
      .pipe(writeStream)
      .on('error', err => reject(err))
      .on('finish', () => {
        console.log(name, 'uploaded to storage');
        resolve(name);
      });
  });
};

There are a couple of dependencies here. The first is string-to-stream, which converts a string to a readable stream. You can add it with

yarn add string-to-stream

const sts = require('string-to-stream');

Next are a couple of functions to create a writestream attached to a cloud blob. The credentials are a service account and bucket name that provide access to Cloud Storage, as discussed in Stream content to Google Cloud Storage.

/**
* create the cloud storage write stream
* the credentials/bucket name and filename are in the secrets file
* @param {string} name the filename to stream to
* @param {string} [mimeType] the content type to set on the blob
*/
const createWriteStream = async ({ name, mimeType }) => {
  const blob = await createStorageStream({ name });
  const stream = blob.createWriteStream({
    resumable: true,
    contentType: mimeType || blob.type
  });
  // this stream will be piped to
  return stream;
};
/**
* get a reference to the cloud storage blob
* the credentials/bucket name are in the secrets file
* @param {string} name the filename to stream to/from
*/
const createStorageStream = async ({ name }) => {
  console.log('creating stream', name);
  // get storage params
  const creds = secrets.getGcpCreds();
  const { credentials, bucketName } = creds;
  const { projectId } = credentials;
  const storage = new Storage({
    projectId,
    credentials
  });

  const bucket = storage.bucket(bucketName);

  // we'll actually be streaming to/from this blob
  return bucket.file(name);
};
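The secrets module itself isn’t shown in this article, but for context, getGcpCreds presumably returns something shaped like this. The property names match how the result is destructured above; the values are placeholders, and the credentials object is the content of a GCP service account key:

```javascript
// a sketch of what secrets.getGcpCreds() might return -
// all values here are placeholders, not real credentials
const getGcpCreds = () => ({
  bucketName: 'my-bucket',
  credentials: {
    projectId: 'my-project',
    client_email: 'service-account@my-project.iam.gserviceaccount.com',
    private_key: '-----BEGIN PRIVATE KEY-----\n...'
  }
});
```

Keeping this in a separate secrets file (outside version control) means the streaming code never needs to know where the credentials come from.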

And that’s all that’s needed to stream content to Cloud Storage.

So let’s say your content is being piped to your app from an HTTP request, the file system or some other process. Since the example above converts the string to a stream and then pipes it to the cloud storage writestream, it’s very much the same process. Here’s the Stream content to Google Cloud Storage example, using the same functions as above. All that’s required is the name (in this case the url of the video file to be streamed, and optionally the mimeType), from which a writestream can be constructed; the downloaded content is then piped directly to that writestream.

const writeStream = await storageStream.createWriteStream({
  name,
  mimeType: content.mimeType
});
const r = await storageStream.downloadFile({ url, stream: writeStream, mimeType: content.mimeType });
/**
 * stream video file direct from vimeo storage to google cloud storage
 * @param {string} url the file to stream
 * @param {stream} stream the stream to pipe it to
 * @param {string} [mimeType] validate expected mimeType
 */
const downloadFile = async ({ url, stream, mimeType }) => {

  return new Promise (( resolve, reject) => {
    // request the video files
    console.log('requesting download from', url);

    request.get(url)
    .on('error', err => reject(err))
    .on('response', response => {
      if (response.statusCode !== 200) {
        reject(new Error('unexpected status code:' + response.statusCode));
      }
      // if required, check the mimetype is what was expected
      if (mimeType && response.headers['content-type'] !== mimeType) {
        reject(new Error('expected:' + mimeType + ' got:' + response.headers['content-type']));
      }
    })
    .pipe(stream)
    .on('error', err => reject(err))
    .on('finish', () => resolve(url));
  });
};

Reading content

Reading content back from storage is also a one-liner
const content = await storageStream.getContent({ name });

And here’s the function. It uses a createReadStream counterpart to the createWriteStream shown earlier, built on the same blob reference.

/**
* get content from storage
* @param {string} name the file to get
*/
const getContent = async ({ name }) => {
  const readStream = await createReadStream({ name });
  return new Promise((resolve, reject) => {
    let str = '';
    readStream.on('data', buf => str += buf.toString());
    readStream.on('error', err => reject(err));
    readStream.on('end', () => {
      // if the content is JSON, parse it back to an object,
      // otherwise resolve with the raw string
      try {
        resolve(JSON.parse(str));
      } catch (err) {
        resolve(str);
      }
    });
  });
};
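That JSON fallback means streamContent and getContent round-trip objects transparently: an object goes in as JSON and comes back out parsed. The parse-or-fall-back behavior can be exercised locally without cloud storage by pulling it out into its own function:

```javascript
// mirror of getContent's parse-or-fall-back logic, separated out
// so it can be tested without any cloud storage calls
const parseContent = (str) => {
  try {
    // JSON content comes back as an object
    return JSON.parse(str);
  } catch (err) {
    // anything else comes back as the raw string
    return str;
  }
};
```

So parseContent('{"a":1}') yields the object { a: 1 }, while parseContent('plain text') just returns the string unchanged.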

Code

The storagestream code is on github. You’ll need to construct your own secrets file to provide credentials and bucket information.