
GCP Cloud Storage

Processing media in GCP Cloud Storage

Audio files stored on Google Cloud Storage can be used in media workflows directly. If your media already resides on Cloud Storage, you can process it there with Dolby.io APIs by providing access credentials to your data.

Google Cloud Storage

There are a few ways to achieve this in media workflows.

Public File

A publicly available URL is accessible with a GET request and can be used directly as an input parameter. You can make a file public by following the instructions in Google's documentation.

❗️

Using gs:// URLs

We do not currently support gs:// style URLs as input or output parameters; you will get a validation error. Instead, generate a pre-signed URL as described below.

Pre-Signed URLs

A pre-signed URL is a convenient way to embed temporary credentials in a plain URL string. The Dolby.io Media APIs issue a GET request on the input parameter to read media and a PUT request on the output parameter to write media, so you need to generate a separate pre-signed URL for each parameter.

📘

Setting up GCP Authentication

Ensure that GOOGLE_APPLICATION_CREDENTIALS is set either at runtime or as an environment variable, as noted in the comments of the scripts below, so that the GCP API calls work properly. For more information, read the Google Cloud Documentation.

Here are some examples of how to do this:

# pip install google-cloud-storage

import datetime
import os

from google.cloud import storage

def generate_download_signed_url_v4(bucket_name, blob_name):
    # If you don't have the authentication file set to an environment variable
    # See: https://cloud.google.com/docs/authentication/getting-started for more information
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"]="<PATH_TO_GCP_KEY>.json"

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 60 minutes
        expiration=datetime.timedelta(minutes=60),
        # Allow GET requests using this URL.
        method="GET",
    )

    print("Generated GET signed URL:")
    print(url)
    print("You can use this URL with any user agent, for example:")
    print("curl '{}'".format(url))
    return url

generate_download_signed_url_v4('your-bucket-name', 'filename.mp3')
// npm install @google-cloud/storage

// The ID of your GCS bucket
const bucketName = 'your-bucket-name';

// The full path of your file inside the GCS bucket, e.g. 'yourFile.jpg' or 'folder1/folder2/yourFile.jpg'
const fileName = 'filename.mp3';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function generateV4UploadSignedUrl() {
  // If you don't have the authentication file set to an environment variable
  // See: https://cloud.google.com/docs/authentication/getting-started for more information
  // process.env.GOOGLE_APPLICATION_CREDENTIALS = "<PATH_TO_GCP_KEY>.json";

  const options = {
    version: 'v4',
    action: 'write',
    expires: Date.now() + 60 * 60 * 1000, // 60 minutes
  };

  // Get a v4 signed URL for uploading file
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl(options);

  console.log('Generated PUT signed URL:');
  console.log(url);
  console.log('You can use this URL with any user agent, for example:');
  console.log(`curl -X PUT --upload-file filename.mp3 '${url}'`);
}

generateV4UploadSignedUrl().catch(console.error);

For further languages, see Google's documentation.
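Once you have a GET URL for the input and a PUT URL for the output, both are passed in the same job request. The sketch below shows how the two URLs pair up in a request body; the `https://api.dolby.com/media/enhance` endpoint, the `x-api-key` header, and the example URLs are assumptions for illustration — check the Dolby.io API reference for the exact request shape.

```python
import json

def build_job_body(input_url, output_url):
    """Pair the GET (input) and PUT (output) pre-signed URLs for one job."""
    return {"input": input_url, "output": output_url}

# Hypothetical pre-signed URLs produced by the scripts above:
body = build_job_body(
    "https://storage.googleapis.com/your-bucket-name/filename.mp3?X-Goog-Signature=abc",
    "https://storage.googleapis.com/your-bucket-name/enhanced.mp3?X-Goog-Signature=def",
)
print(json.dumps(body))

# Example submission (assumed endpoint and header; requires the requests
# package and your Dolby.io API key):
# import os, requests
# response = requests.post(
#     "https://api.dolby.com/media/enhance",
#     headers={"x-api-key": os.environ["DOLBYIO_API_KEY"]},
#     json=body,
# )
```

Because each URL is a plain string, no extra credential configuration is needed on the Dolby.io side.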

🚧

Output Expiration Limit

When setting an expiration on a pre-signed URL, allow enough time for processing to complete. Longer media files may be queued before processing begins, and processing itself can take up to the duration of the media itself. Take both into account when setting the expiration value of the output URL.
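As a rough guide, that budget can be expressed as queue wait plus up-to-real-time processing plus a safety margin. The buffer values below are illustrative assumptions, not documented limits:

```python
import datetime

def output_url_expiration(media_duration_minutes,
                          queue_buffer_minutes=30,
                          safety_margin_minutes=15):
    """Budget expiration as: queue wait + real-time processing + margin.

    The buffer defaults are illustrative assumptions; tune them for
    your own queue behavior.
    """
    total = media_duration_minutes + queue_buffer_minutes + safety_margin_minutes
    return datetime.timedelta(minutes=total)

# A 90-minute file with the default buffers:
print(output_url_expiration(90))  # 2:15:00
```

The resulting timedelta can be passed directly as the `expiration` argument of `generate_signed_url` in the Python example above.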

Firebase Cloud Storage

Using Firebase for file storage works similarly to Google Cloud Storage, but access must be specifically enabled for Firebase: Service Accounts in Cloud Storage are independent from Service Accounts in Firebase unless that permission is explicitly granted.

Ensure that the security rules for the file location in Firebase contain the read and/or write access needed for file handling, as documented in Firebase. The input parameter requires read permission and the output parameter requires write permission.

Then, generate a private key as described in the Firebase documentation. Once you download it, set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of your JSON key file.

🚧

Cloud Storage Triggers

When using Google Cloud Storage Triggers, it is possible to create an endless loop: the trigger detects the processed output as a new file and sends it back to Dolby.io for processing. Ensure your rules, path management, and/or file naming are robust enough to only trigger on unprocessed files.
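One simple guard, assuming a hypothetical naming convention where processed output is written with an `-enhanced` suffix, is to check the filename before submitting a job:

```python
PROCESSED_SUFFIX = "-enhanced"  # hypothetical naming convention, not a Dolby.io requirement

def should_process(blob_name):
    """Return True only for files that have not already been processed."""
    stem = blob_name.rsplit(".", 1)[0]
    return not stem.endswith(PROCESSED_SUFFIX)

def output_name(blob_name):
    """Derive the output name, e.g. 'audio.mp3' -> 'audio-enhanced.mp3'."""
    stem, _, ext = blob_name.rpartition(".")
    return f"{stem}{PROCESSED_SUFFIX}.{ext}"

print(should_process("audio.mp3"))           # True
print(should_process("audio-enhanced.mp3"))  # False
```

Calling `should_process` at the top of the trigger function ensures that output files written back to the same bucket never re-enter the pipeline. Writing outputs to a separate bucket or prefix achieves the same goal.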
