GCP Cloud Storage

Processing media in Google Cloud Platform (GCP) Cloud Storage

Media files stored on Google Cloud Storage can be used in media workflows directly. If your media already resides on cloud storage, you can process it there with Dolby.io APIs by providing either pre-signed URLs or GCP Service Account Credentials.

Google Cloud Storage with Pre-Signed URLs

There are a few ways to achieve this in media workflows.

Public File

A publicly available URL is accessible with a GET request and can be used as an input parameter. You can make a file public by following Google's documentation instructions.
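
For example, a public object is served at a predictable URL and needs no signing; the bucket and object names below are placeholders:

# A public object's URL can be passed directly as an input parameter.
input_url = "https://storage.googleapis.com/your-bucket-name/filename.mp3"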

❗️

Using gs:// URLs

Unless you are using GCP Service Account Credentials, we do not currently support gs:// style URLs as an input or output parameter; you will get a validation error. Instead, generate a pre-signed URL as described below.

Pre-Signed URLs

A pre-signed URL is a convenient way to provide a URL containing temporary credentials as a basic string. The Dolby.io Media APIs make a GET request on the input parameter to read media and a PUT request on the output parameter to write media. Therefore, to make a request with pre-signed URLs, you'll need to generate a separate URL for each parameter.
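
For context, here is a minimal sketch of how the two URLs fit into a job request such as Media Enhance. The endpoint, the x-api-key header, and the DOLBY_API_KEY environment variable are assumptions for illustration; check the API reference for the exact request shape and authentication scheme for your account.

# pip install requests

import os
import requests

# Pre-signed URLs generated with the scripts below:
# a GET URL for the input and a PUT URL for the output.
input_url = "https://storage.googleapis.com/your-bucket-name/filename.mp3?X-Goog-Signature=..."
output_url = "https://storage.googleapis.com/your-bucket-name/newfilename.mp3?X-Goog-Signature=..."

response = requests.post(
    "https://api.dolby.com/media/enhance",  # assumed endpoint, for illustration
    headers={
        # Authentication shown here as an API key header; see the
        # Dolby.io authentication guide for your account's scheme.
        "x-api-key": os.environ["DOLBY_API_KEY"],
        "Content-Type": "application/json",
    },
    json={"input": input_url, "output": output_url},
)
response.raise_for_status()
print(response.json())  # includes a job_id you can poll for completion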

📘

Setting up GCP Authentication

Ensure that GOOGLE_APPLICATION_CREDENTIALS is set either at runtime or as an environment variable, as noted in the comments of the scripts below, so that the GCP API calls work properly. For more information on how to do this, read the Google Cloud Documentation.

Here are some examples of how to do this:

Downloading a File

# pip install google-cloud-storage

import datetime
import os

from google.cloud import storage

def generate_download_signed_url_v4(bucket_name, blob_name):
    """Generates a v4 signed URL for downloading a blob.

    Note that this method requires a service account key file. You cannot use
    this if you are using Application Default Credentials from Google Compute
    Engine or from the Google Cloud SDK.
    """
    # If you don't have the authentication file set to an environment variable
    # See: https://cloud.google.com/docs/authentication/getting-started for more information
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"]="<PATH_TO_GCP_KEY>.json"

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow GET requests using this URL.
        method="GET",
    )

    print("Generated GET signed URL:")
    print(url)
    print("You can use this URL with any user agent, for example:")
    print("curl '{}'".format(url))
    return url

generate_download_signed_url_v4('your-bucket-name', 'filename.mp3')

// npm install @google-cloud/storage

// The ID of your GCS bucket
const bucketName = 'your-bucket-name';

// The full path of your file inside the GCS bucket, e.g. 'yourFile.jpg' or 'folder1/folder2/yourFile.jpg'
const fileName = 'filename.mp3';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function generateV4ReadSignedUrl() {
  // If you don't have the authentication file set to an environment variable
  // See: https://cloud.google.com/docs/authentication/getting-started for more information
  // process.env.GOOGLE_APPLICATION_CREDENTIALS = "<PATH_TO_GCP_KEY>.json";

  // These options will allow temporary read access to the file
  const options = {
    version: 'v4',
    action: 'read',
    expires: Date.now() + 15 * 60 * 1000, // 15 minutes
  };

  // Get a v4 signed URL for reading the file
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl(options);

  console.log('Generated GET signed URL:');
  console.log(url);
  console.log('You can use this URL with any user agent, for example:');
  console.log(`curl '${url}'`);
}

generateV4ReadSignedUrl().catch(console.error);

Uploading a File

# pip install google-cloud-storage

import datetime
import os

from google.cloud import storage

def generate_upload_signed_url_v4(bucket_name, blob_name):
    """Generates a v4 signed URL for uploading a blob using HTTP PUT.

    Note that this method requires a service account key file. You cannot use
    this if you are using Application Default Credentials from Google Compute
    Engine or from the Google Cloud SDK.
    """
    # If you don't have the authentication file set to an environment variable
    # See: https://cloud.google.com/docs/authentication/getting-started for more information
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"]="<PATH_TO_GCP_KEY>.json"

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes
        expiration=datetime.timedelta(minutes=15),
        # Allow PUT requests using this URL.
        method="PUT",
    )

    print("Generated PUT signed URL:")
    print(url)
    print("You can use this URL with any user agent, for example:")
    print(
        "curl -X PUT -H 'Content-Type: application/octet-stream' "
        "--upload-file my-file '{}'".format(url)
    )
    return url

generate_upload_signed_url_v4('your-bucket-name', 'newfilename.mp3')

// npm install @google-cloud/storage

// The ID of your GCS bucket
const bucketName = 'your-bucket-name';

// The full path of your file inside the GCS bucket, e.g. 'yourFile.jpg' or 'folder1/folder2/yourFile.jpg'
const fileName = 'newfilename.mp3';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function generateV4UploadSignedUrl() {
  // If you don't have the authentication file set to an environment variable
  // See: https://cloud.google.com/docs/authentication/getting-started for more information
  // process.env.GOOGLE_APPLICATION_CREDENTIALS = "<PATH_TO_GCP_KEY>.json";

  // These options will allow temporary write access to the file
  const options = {
    version: 'v4',
    action: 'write',
    expires: Date.now() + 15 * 60 * 1000, // 15 minutes
  };

  // Get a v4 signed URL for uploading the file
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl(options);

  console.log('Generated PUT signed URL:');
  console.log(url);
  console.log('You can use this URL with any user agent, for example:');
  console.log(
    "curl -X PUT -H 'Content-Type: application/octet-stream' " +
      `--upload-file my-file '${url}'`
  );
}

generateV4UploadSignedUrl().catch(console.error);

For examples in other languages, see Google's documentation.

🚧

Output Expiration Limit

When setting an expiration on a pre-signed URL, you need to allow enough time for the file to complete processing. A job may wait in a queue before processing begins, and for longer media files processing itself can take up to the real-time duration of the media. Take both into consideration when setting the expiration value of the output URL.
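
There is no official formula, but as a rule of thumb you might size the output URL's expiration from the media duration plus a queueing buffer. A minimal sketch; the 30-minute buffer is an assumption to tune for your own workload:

import datetime

def output_url_expiration(media_duration_minutes, queue_buffer_minutes=30):
    """Rule-of-thumb expiration for an output pre-signed URL.

    Assumes processing can take up to the real-time duration of the
    media, plus time spent waiting in a queue.
    """
    return datetime.timedelta(minutes=media_duration_minutes + queue_buffer_minutes)

# For a 90-minute file, pass the result as the `expiration` argument
# to blob.generate_signed_url() in the upload script above.
print(output_url_expiration(90))  # 2:00:00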

🚧

Cloud Storage Triggers

When using Google Cloud Storage Triggers, it is possible to create an endless loop: a trigger detects the newly written output file and sends it back to Dolby.io for processing. Ensure your rules, path management, and/or file naming are robust enough to trigger only on unprocessed files, as in the sketch below.
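
One common safeguard is to trigger only on a dedicated input prefix and write results elsewhere. Here is a minimal sketch of a first-generation Cloud Storage-triggered function; the input/ and output/ prefixes are assumptions, so use whatever convention separates unprocessed from processed files in your bucket:

def on_file_finalized(event, context):
    """Background Cloud Function triggered by a finalized GCS object."""
    name = event["name"]

    # Skip anything outside the input/ prefix, including our own
    # output files; otherwise each result re-triggers the function
    # and creates an endless loop.
    if not name.startswith("input/"):
        print(f"Ignoring {name}: not an unprocessed input file")
        return

    # ...generate a GET URL for this file and a PUT URL under the
    # output/ prefix, then start the Dolby.io job as shown above...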

Firebase Cloud Storage

Using Firebase for file storage works similarly to Google Cloud Storage, but access must be specifically enabled for Firebase. This is because Service Accounts in Cloud Storage are independent of Service Accounts in Firebase unless that permission is specifically granted.

Ensure that the security rules for the file location in Firebase contain the read and/or write access desired for file handling, as documented in Firebase. The input parameter requires read permission and the output parameter requires write permission.

Then, generate a private key as described in the Firebase Documentation. Once you download the key file, set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of your JSON key file.
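
Once the key is in place, the same signing code works against the Firebase bucket. A minimal sketch, assuming the classic <project-id>.appspot.com default bucket name (confirm the exact name in the Firebase console) and a placeholder key path:

# pip install google-cloud-storage

import datetime
import os

from google.cloud import storage

# Point at the Firebase service account key downloaded above.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "<PATH_TO_FIREBASE_KEY>.json"

bucket = storage.Client().bucket("your-project-id.appspot.com")

url = bucket.blob("filename.mp3").generate_signed_url(
    version="v4",
    expiration=datetime.timedelta(minutes=15),
    method="GET",  # use "PUT" for an output parameter
)
print(url)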