How to upload files to Google Cloud Storage (GCS) Bucket

Samet Karadag
Mar 20, 2020


TL;DR:

gsutil cmd to upload a big directory:

gsutil -m -o GSUtil:parallel_composite_upload_threshold=150M cp -r local_dir gs://your-bucket/data/

gsutil cmd to upload a big file:

gsutil -m -o GSUtil:parallel_composite_upload_threshold=150M cp ./filename gs://bucketname/

1. You need to create a bucket in your GCP project:

There are three important things here:

  • Location type: region, multi-region, dual-region (affects pricing)
  • Location of the bucket: eu for multi-region, or a specific region such as europe-west1 for a regional bucket within Europe
  • Storage class (affects pricing)

You can look at the GCS pricing page for the details on pricing.

You can use the GCP pricing calculator to estimate the costs.

For testing, I would recommend a “regional” bucket (>= 99.9% availability SLA) with the “standard” storage class.

For production, I would recommend a multi-regional bucket (eu, >= 99.95% availability SLA).

You can see more details on GCS SLAs here.

There are also nearline and coldline storage classes. Their performance is similar, but pricing changes depending on access frequency: nearline is suitable for objects accessed about once in 3 months, and coldline is generally for archival data you need about once a year. Those options may make sense for the old/unused images in the future. You can see the Object Lifecycle Management page for the details on how to change storage classes automatically, as sketched below.
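
If you want such a class change to happen automatically, a minimal sketch of a lifecycle rule could look like the following; lifecycle.json is an assumed file name, and the 365-day age and your-bucket are placeholder values. The rule moves objects older than a year to nearline, and is applied with gsutil lifecycle set:

{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
        "condition": {"age": 365}
      }
    ]
  }
}

gsutil lifecycle set lifecycle.json gs://your-bucket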

You can create your bucket using the console as described here:

https://cloud.google.com/storage/docs/creating-buckets
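
If you prefer the command line over the console, you can also create the bucket with gsutil mb; the eu location and standard storage class below are just example values matching the recommendations above, and your-bucket is a placeholder:

gsutil mb -l eu -c standard gs://your-bucket/

For a regional bucket, pass a region such as europe-west1 instead of eu.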

2. You can use the GCP console or the gsutil cp command-line tool to upload files to your bucket.

https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-code-sample

I would recommend using the gsutil command-line tool for uploading large files or many files, as it makes it easier to upload files in parallel.

Please see https://cloud.google.com/storage/docs/gsutil_install to install gsutil.
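
Once installed, a quick way to check that gsutil is available on your machine is:

gsutil version -l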

You can find details on the gsutil cp command here.

The following command will split a big file (>150 MB) into parts and upload them in parallel from your machine to the GCS bucket:

gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp ./bigfile gs://your-bucket

The following command will upload the files in local_dir in parallel:

gsutil -m cp -r local_dir gs://my-bucket/data/

Combined:

gsutil -m -o GSUtil:parallel_composite_upload_threshold=150M cp -r local_dir gs://your-bucket/data/
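
After the copy finishes, you can verify the upload by listing the objects and their sizes (same placeholder bucket and path as in the commands above):

gsutil ls -l gs://your-bucket/data/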
