Connect Akto with Google BigQuery for Vertex AI Custom Deployed Model Logs
Overview
Akto can automatically fetch Vertex AI Custom Deployed Model prediction logs from BigQuery and analyze them for security issues. This integration allows you to monitor your AI models deployed on Vertex AI by ingesting prediction logs into Akto.
This method only works for custom models deployed on Vertex AI. It does not support Google's pre-built foundation models (e.g., Gemini) available on the Vertex AI platform.
Prerequisites
A Google Cloud Platform (GCP) project with Vertex AI enabled.
A BigQuery dataset and table where prediction logs are stored.
Appropriate IAM permissions to read from BigQuery.
Enable the following Google Cloud APIs: Vertex AI API, BigQuery API, and Cloud Logging API.
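If you prefer the CLI, the prerequisite APIs can be enabled in one command (a sketch; replace YOUR_PROJECT_ID with your own project ID):

```shell
# Enable the required APIs for the project (run once)
gcloud services enable \
  aiplatform.googleapis.com \
  bigquery.googleapis.com \
  logging.googleapis.com \
  --project=YOUR_PROJECT_ID
```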
Steps to Connect
1
Enable BigQuery Logging
First, configure your Vertex AI Custom Deployed Model endpoint to log predictions to BigQuery.
Via Console
Go to Vertex AI → Endpoints in GCP Console.
Select your deployed model endpoint.
Click Edit.
Under Logging, enable Request/Response logging.
Select BigQuery as the destination.
Choose or create a dataset (e.g., vertex_ai_logs).
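The console steps above can also be performed with gcloud. The endpoint ID, region, dataset/table name, and sampling rate below are illustrative; adjust them to your deployment:

```shell
# Enable request/response logging to BigQuery on an existing endpoint
# (ENDPOINT_ID, region, and the bq:// table URI are placeholders)
gcloud ai endpoints update ENDPOINT_ID \
  --region=us-central1 \
  --request-response-logging-table=bq://YOUR_PROJECT_ID.vertex_ai_logs.request_response \
  --request-response-logging-rate=1.0
```

A logging rate of 1.0 samples every request; lower it (e.g., 0.1) on high-traffic endpoints to control BigQuery storage costs.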
2
Create a Service Account
Create a dedicated Service Account for Akto (e.g., akto-bq-reader). This account will need permissions to read from your BigQuery dataset and execute query jobs.
1. Create Service Account
Via Console:
Go to IAM & Admin → Service Accounts.
Click + CREATE SERVICE ACCOUNT.
Name: akto-bq-reader.
Description: Service account for Akto to read BigQuery logs.
2. Grant Roles
Grant the following roles to the service account:
BigQuery Job User (roles/bigquery.jobUser): Allows running query jobs.
BigQuery Data Viewer (roles/bigquery.dataViewer): Allows reading table data.
Via Console (on the Service Account creation page):
Under Select a role, search for BigQuery Job User.
Click ADD ANOTHER ROLE.
Search for BigQuery Data Viewer.
Click CONTINUE and DONE.
Via gcloud:
```shell
# Grant Job User role (permits running queries in the project)
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:akto-bq-reader@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# Grant Data Viewer role (permits reading dataset contents)
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:akto-bq-reader@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
```
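To confirm the bindings took effect, you can list the roles held by the service account (a sketch, assuming the akto-bq-reader account created above):

```shell
# List the roles bound to the akto-bq-reader service account
gcloud projects get-iam-policy YOUR_PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:akto-bq-reader@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
```

The output should include roles/bigquery.jobUser and roles/bigquery.dataViewer.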
3
Configure Authentication
Choose an authentication method based on your deployment. Akto supports Application Default Credentials (ADC) and Service Account Key Files.
Option A: Use Application Default Credentials (Recommended for GKE/Cloud Run)
If Akto is running on GCP (GKE, Cloud Run, Compute Engine), you can use ADC. This is more secure as it avoids managing long-lived keys.
GKE (Workload Identity): Bind the GCP Service Account created in the previous step to the Kubernetes Service Account used by Akto.
```shell
# Link K8s SA to GCP SA
gcloud iam service-accounts add-iam-policy-binding \
  akto-bq-reader@PROJECT_ID.iam.gserviceaccount.com \
  --role="roles/iam.workloadIdentityUser" \
  --member="serviceAccount:PROJECT_ID.svc.id.goog[akto/akto-sa]"

# Annotate K8s SA
kubectl annotate serviceaccount akto-sa -n akto \
  iam.gke.io/gcp-service-account=akto-bq-reader@PROJECT_ID.iam.gserviceaccount.com
```
No Configuration Needed: When configuring the job in Akto, leave the JSON Authentication File Path field empty. Akto will authenticate as the bound service account automatically.
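To verify that Workload Identity is wired up correctly, you can query the metadata server from inside the Akto pod. The deployment name (deploy/akto) and namespace below are assumptions; substitute the names used in your cluster:

```shell
# Ask the metadata server which service account the pod runs as
# (deploy/akto and namespace akto are illustrative)
kubectl exec -it deploy/akto -n akto -- \
  curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/email"
```

If the binding is correct, this prints akto-bq-reader@PROJECT_ID.iam.gserviceaccount.com.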
Option B: Use a Service Account Key File (External Deployment)
If Akto is running outside of GCP (e.g., On-Prem, AWS, Local Docker), use a Service Account Key.
Create Key:
Go to IAM & Admin → Service Accounts.
Select akto-bq-reader.
Go to the Keys tab → ADD KEY → Create new key → JSON.
Download the file (e.g., akto-bq-key.json).
Mount Key: Ensure the file is accessible to the Akto container, and provide its path in the JSON Authentication File Path field when configuring the job in Akto.
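For a local Docker deployment, mounting the key might look like the following. The image name, mount path, and environment variable usage here are illustrative, not an exact Akto invocation:

```shell
# Mount the key read-only and point Google client libraries at it
# (image name and paths are placeholders; adjust to your deployment)
docker run -d \
  -v /path/to/akto-bq-key.json:/secrets/akto-bq-key.json:ro \
  -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/akto-bq-key.json \
  akto/akto:latest
```

Treat the key file as a secret: restrict filesystem permissions and rotate it periodically, since it is a long-lived credential.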