GCP Self Hosting (GKE)
This section explains how to set up the resources required to self-host Zerve on a pre-existing GKE cluster in a Google Cloud Platform project.
General Infrastructure
At this time, GCP Self-Hosting is possible only with an existing GKE cluster.

GCS Bucket
Zerve requires object storage to store block state and user files. We recommend creating a dedicated GCS bucket for this purpose using the installation steps below.
Artifact Registry
Zerve needs a repository in the Artifact Registry to store Docker images.
IAM Identities
Application Service Account
An identity that represents the Zerve application in your GCP project.
Zerve will use Service Account Impersonation to obtain short-lived credentials to perform operations within your GCP project, such as scheduling compute jobs or managing canvas storage.
Execution Service Account
An identity for compute jobs that Zerve schedules to execute code blocks.
This service account can be used to grant users' code blocks access to other GCP resources in your organization.
Build Service Account
An identity for build jobs.
This service account will be used to grant build jobs access to the GCS bucket and docker repository.
Google Kubernetes Engine
Zerve can use your existing GKE cluster to schedule build and compute jobs. This cluster does not have to be in the same project as the rest of Zerve's infrastructure.
Cluster requirements:
Version 1.28 or higher
Workload Identity enabled
DNS-based control plane endpoint enabled
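You can check these requirements with the gcloud CLI; a minimal sketch, where CLUSTER_NAME and LOCATION are placeholders for your cluster's name and location:

```bash
# Shows the cluster version, Workload Identity pool, and DNS-based control plane endpoint
gcloud container clusters describe CLUSTER_NAME \
  --location=LOCATION \
  --format="value(currentMasterVersion, workloadIdentityConfig.workloadPool, controlPlaneEndpointsConfig.dnsEndpointConfig.endpoint)"
```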
Setup Instructions
Cloud infrastructure
You can use the gcloud CLI to provision the necessary infrastructure, optionally in a separate GCP project within your organization.
Set some common environment variables used throughout the guide:
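The values below are illustrative (reusing the examples that appear later in this guide); the variable names themselves are only a suggestion and you can substitute your own:

```bash
export PROJECT_ID="flying-banana-412312-r9"   # project for Zerve's general infrastructure
export REGION="europe-west2"                  # region for the bucket and docker repository
export BUCKET_NAME="zerve"                    # GCS bucket name
export REPOSITORY="zerve"                     # Artifact Registry repository name
```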
If using GKE for compute, set env vars for the project ID and node identity of the GKE cluster:
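For example, assuming GKE_PROJECT_ID is the cluster's project and GKE_NODE_SA is the service account used by its nodes (both placeholders):

```bash
export GKE_PROJECT_ID="my-gke-project"
export GKE_NODE_SA="my-node-sa@my-gke-project.iam.gserviceaccount.com"
```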
Point your gcloud CLI to the general infrastructure project:
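Using the PROJECT_ID variable set above:

```bash
gcloud config set project "$PROJECT_ID"
```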
Set up service accounts:
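A sketch of creating the three service accounts described above; the account names (zerve-app, zerve-execution, zerve-build) are illustrative:

```bash
gcloud iam service-accounts create zerve-app --display-name="Zerve application"
gcloud iam service-accounts create zerve-execution --display-name="Zerve execution jobs"
gcloud iam service-accounts create zerve-build --display-name="Zerve build jobs"
```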
Set up the bucket:
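For example, creating the bucket and granting the application and build service accounts object access; the exact roles Zerve needs are an assumption here, so check them against your installation requirements:

```bash
gcloud storage buckets create "gs://$BUCKET_NAME" --location="$REGION"

gcloud storage buckets add-iam-policy-binding "gs://$BUCKET_NAME" \
  --member="serviceAccount:zerve-app@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"

gcloud storage buckets add-iam-policy-binding "gs://$BUCKET_NAME" \
  --member="serviceAccount:zerve-build@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"
```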
Set up the docker repository:
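A sketch of creating a Docker-format Artifact Registry repository and granting the build service account push access; the writer role is an assumption:

```bash
gcloud artifacts repositories create "$REPOSITORY" \
  --repository-format=docker \
  --location="$REGION"

gcloud artifacts repositories add-iam-policy-binding "$REPOSITORY" \
  --location="$REGION" \
  --member="serviceAccount:zerve-build@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/artifactregistry.writer"
```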
Set up logging permissions:
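For example, allowing the build and execution service accounts to write logs (the logWriter role is an assumption):

```bash
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:zerve-build@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/logging.logWriter"

gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:zerve-execution@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/logging.logWriter"
```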
If using GKE for compute:
Allow k8s service accounts to impersonate Zerve's IAM service accounts:
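A sketch of the Workload Identity bindings; the namespace (zerve) and Kubernetes service account names are assumptions and should match the values used by the helm chart:

```bash
gcloud iam service-accounts add-iam-policy-binding \
  "zerve-execution@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/iam.workloadIdentityUser" \
  --member="serviceAccount:$GKE_PROJECT_ID.svc.id.goog[zerve/zerve-execution]"

gcloud iam service-accounts add-iam-policy-binding \
  "zerve-build@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/iam.workloadIdentityUser" \
  --member="serviceAccount:$GKE_PROJECT_ID.svc.id.goog[zerve/zerve-build]"
```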
Allow k8s nodes to pull images from the docker repository:
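For example, granting the node service account read access to the repository:

```bash
gcloud artifacts repositories add-iam-policy-binding "$REPOSITORY" \
  --location="$REGION" \
  --member="serviceAccount:$GKE_NODE_SA" \
  --role="roles/artifactregistry.reader"
```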
Allow Zerve to connect to your GKE cluster:
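A minimal sketch, assuming the application service account only needs to reach the cluster API and that fine-grained permissions come from the RBAC resources installed in the next step; the exact role is an assumption:

```bash
gcloud projects add-iam-policy-binding "$GKE_PROJECT_ID" \
  --member="serviceAccount:zerve-app@$PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/container.clusterViewer"
```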
Set up RBAC in your cluster by installing our helm chart:
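For illustration only; the chart reference and release name below are placeholders, so use the chart location provided by Zerve:

```bash
helm install zerve-rbac oci://REGISTRY/CHART_PATH \
  --namespace zerve \
  --create-namespace
```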
Zerve Organization Self-Hosting Settings
Navigate to your organization's self-hosting settings in the Zerve app.
Fill out the form with the following values:
Project ID: the project ID where Zerve's general infrastructure resides, e.g. flying-banana-412312-r9
Region: the region where Zerve's general infrastructure resides, e.g. europe-west2
Bucket Name: the name of the GCS bucket, e.g. zerve
Service Account: the email of the application service account, e.g. [email protected]
Docker Repository: the address of the docker repository, e.g. europe-west2-docker.pkg.dev/flying-banana-412312-r9/zerve
If using GKE for compute, check the Kubernetes box under Compute options and fill out the following values:
Namespace: the namespace where Zerve's helm chart was installed, e.g. zerve
Endpoint: the DNS-based control plane endpoint of the GKE cluster, e.g. https://gke-723981544496.europe-west2.gke.goog. To find it using the gcloud CLI, run the command shown after this list.
Service Account Token: leave empty
Certificate Authority Data: leave empty
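A sketch of the endpoint lookup, with CLUSTER_NAME and LOCATION as placeholders:

```bash
gcloud container clusters describe CLUSTER_NAME \
  --location=LOCATION \
  --format="value(controlPlaneEndpointsConfig.dnsEndpointConfig.endpoint)"
```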