Fun fact: you can host websites in S3. For single-page apps (SPAs) I would always recommend this approach, since it’s cheaper and more resilient than running nginx just to serve static content. Put a CDN like CloudFront in front of it and you have a fast, resilient, and cheap static site.
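For reference, the S3 half of that setup is only a few commands. This is a minimal sketch, assuming a bucket named mywebsite.com and a dist/ build folder (both placeholders); the CloudFront distribution is created separately.

```sh
# Create the bucket and turn on static website hosting
aws s3 mb s3://mywebsite.com
aws s3 website s3://mywebsite.com --index-document index.html --error-document index.html

# Upload the build output
aws s3 sync dist/ s3://mywebsite.com
```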
I wanted to see if the same thing could be done with GCP’s Cloud Storage. Short answer: sort of. Technically you can, but:
- dynamic routes (React Router and friends) will return 404 status codes, as the sketch below shows
- you can’t point GCP’s CDN at your static site, so you’d end up using a third-party CDN if you want HTTPS
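To make the Cloud Storage side concrete: the website behaviour is configured per bucket with gsutil. A sketch, with mywebsite.com as a placeholder bucket name:

```sh
# Serve index.html for the root path and also as the "not found" page,
# the usual SPA trick
gsutil web set -m index.html -e index.html gs://mywebsite.com
```

Even with that in place, Cloud Storage serves index.html for a route like /about with a 404 status code, which is the first limitation above.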

The cheapest and easiest way to host a website on GCP right now is through App Engine, which can serve a static build with nothing more than an app.yaml.
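For comparison, a static-only app.yaml looks roughly like this. This is a sketch, not a config from the post: the runtime choice and the dist/ build directory are assumptions.

```yaml
runtime: python39  # any standard runtime works; no server code is deployed

handlers:
  # Serve real files (JS, CSS, images) straight from the build output
  - url: /(.*\.(js|css|png|jpg|svg|ico|json|map))$
    static_files: dist/\1
    upload: dist/.*\.(js|css|png|jpg|svg|ico|json|map)$

  # Everything else falls back to index.html, so client-side routes
  # get a 200 instead of the 404 you'd see from Cloud Storage
  - url: /.*
    static_files: dist/index.html
    upload: dist/index.html
```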
But if you really, really, really want to go the Cloud Storage route, here’s the how-to:
1. Use the google/cloud-sdk image
```yaml
docker:
  - image: google/cloud-sdk
    environment:
      STORAGE_BUCKET: mywebsite.com
```
Note the $STORAGE_BUCKET env variable that points to the bucket we’re deploying to.
2. Authenticate the gcloud CLI
```sh
echo $GCLOUD_SERVICE_KEY > ${HOME}/gcloud-service-key.json
gcloud auth activate-service-account --key-file=${HOME}/gcloud-service-key.json
gcloud --quiet config set project ${GOOGLE_PROJECT_ID}
```
The $GCLOUD_SERVICE_KEY and $GOOGLE_PROJECT_ID are env variables that you set in the CircleCI UI.
This script saves the service key to a file, uses that file to authenticate to gcloud, and then sets the default project.
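If you want to sanity-check the authentication step (for example via SSH into the CircleCI container), a couple of read-only commands will do; these are not part of the deploy itself:

```sh
gcloud auth list                  # the service account should be the active credential
gcloud config get-value project   # should print the value of $GOOGLE_PROJECT_ID
```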
3. Call gsutil to sync your site
```sh
gsutil defacl ch -u AllUsers:READER gs://$STORAGE_BUCKET
gsutil rsync -R /tmp/workspace/dist gs://$STORAGE_BUCKET
gsutil setmeta -h "Cache-Control:private, max-age=0, no-transform" \
  gs://$STORAGE_BUCKET/*.js
```
First, we set the bucket’s default object ACL so that everything we upload is publicly readable (AllUsers:READER).
Then, we sync the build output to Cloud Storage.
Lastly, we make sure users always get the freshest JS by disabling caching on the .js files.
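If you want to confirm the metadata stuck, you can inspect an object after the deploy; main.js is just a placeholder object name here:

```sh
# Object metadata, including the Cache-Control header we just set
gsutil stat gs://$STORAGE_BUCKET/main.js

# Or check what the public endpoint actually returns
curl -I "https://storage.googleapis.com/$STORAGE_BUCKET/main.js"
```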
And that’s it! Check out the complete configs here.
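For context, the deploy job could look roughly like this once the three snippets are wired into .circleci/config.yml. This is a sketch rather than the post’s actual config: the job name, the bucket name, and the assumption that a separate build job persists its dist/ output to /tmp/workspace are all placeholders.

```yaml
version: 2
jobs:
  deploy:
    docker:
      - image: google/cloud-sdk
        environment:
          STORAGE_BUCKET: mywebsite.com
    steps:
      # Assumes an earlier build job ran persist_to_workspace on its dist/ folder
      - attach_workspace:
          at: /tmp/workspace
      - run:
          name: Authenticate gcloud
          command: |
            echo $GCLOUD_SERVICE_KEY > ${HOME}/gcloud-service-key.json
            gcloud auth activate-service-account --key-file=${HOME}/gcloud-service-key.json
            gcloud --quiet config set project ${GOOGLE_PROJECT_ID}
      - run:
          name: Deploy to Cloud Storage
          command: |
            gsutil defacl ch -u AllUsers:READER gs://$STORAGE_BUCKET
            gsutil rsync -R /tmp/workspace/dist gs://$STORAGE_BUCKET
            gsutil setmeta -h "Cache-Control:private, max-age=0, no-transform" gs://$STORAGE_BUCKET/*.js
```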