
Google Cloud Tools

Build custom tools for Google Cloud services and use GcsArtifactService for artifact storage

Google Cloud tools enable agents to interact with Google Cloud Platform services. ADK-TS provides the GcsArtifactService for storing artifacts in Google Cloud Storage and patterns for building custom tools for any Google Cloud API.

Environment Variables

ADK-TS uses specific environment variables for Google Cloud integrations:

| Variable | Purpose | Required |
| --- | --- | --- |
| GOOGLE_API_KEY | API key for Gemini API access | Yes (if using Gemini API) |
| GOOGLE_GENAI_USE_VERTEXAI | Set to "true" to use Vertex AI instead of the Gemini API | No (defaults to Gemini API) |
| GOOGLE_CLOUD_PROJECT | Google Cloud project ID for Vertex AI and evaluation services | Yes (if using Vertex AI) |
| GOOGLE_CLOUD_LOCATION | Google Cloud region for Vertex AI and evaluation services | Yes (if using Vertex AI) |
| GOOGLE_APPLICATION_CREDENTIALS | Path to a service account JSON key file for authentication | No (ADC can discover credentials without it) |
| GOOGLE_CLOUD_AGENT_ENGINE_ID | Agent Engine ID for telemetry tracking | No |

For Google LLM integration, you must configure either Gemini API (using GOOGLE_API_KEY) or Vertex AI (using GOOGLE_GENAI_USE_VERTEXAI=true with GOOGLE_CLOUD_PROJECT and GOOGLE_CLOUD_LOCATION).
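
A minimal startup check, using only the variables listed above, can fail fast when neither configuration is complete:

// Sketch: verify that either the Gemini API or Vertex AI configuration is present
const useVertexAi = process.env.GOOGLE_GENAI_USE_VERTEXAI === "true";

if (useVertexAi) {
  if (!process.env.GOOGLE_CLOUD_PROJECT || !process.env.GOOGLE_CLOUD_LOCATION) {
    throw new Error(
      "Vertex AI mode requires GOOGLE_CLOUD_PROJECT and GOOGLE_CLOUD_LOCATION"
    );
  }
} else if (!process.env.GOOGLE_API_KEY) {
  throw new Error("Set GOOGLE_API_KEY to use the Gemini API");
}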

What's Available

ADK-TS provides:

  • GcsArtifactService - Store and retrieve agent artifacts in Google Cloud Storage with versioning support
  • Authentication patterns - Application Default Credentials, Service Accounts, and OAuth 2.0
  • Tool templates - Examples for building custom tools for Cloud Storage, Vertex AI, and other services

ADK-TS does not include pre-built tools for specific Google Cloud services. Instead, it provides authentication patterns and the GcsArtifactService, giving you flexibility to create tools for the specific Google Cloud APIs your agents need.

Quick Start

Create a simple Cloud Storage tool:

import { BaseTool } from "@iqai/adk";
import { Storage } from "@google-cloud/storage";

class CloudStorageTool extends BaseTool {
  private storage: Storage;

  constructor() {
    super({
      name: "list_buckets",
      description: "Lists Google Cloud Storage buckets in your project",
    });

    this.storage = new Storage(); // Uses Application Default Credentials
  }

  async runAsync(args: { projectId: string }) {
    try {
      const [buckets] = await this.storage.getBuckets({
        project: args.projectId,
      });

      return {
        success: true,
        buckets: buckets.map((b) => ({
          name: b.name,
          location: b.metadata.location,
        })),
      };
    } catch (error) {
      const message = error instanceof Error ? error.message : String(error);

      return {
        success: false,
        error: `Failed to list buckets: ${message}`,
      };
    }
  }
}
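
To make the tool callable by a model, register it on an agent. A minimal sketch, assuming the agent configuration accepts a tools array (the other fields mirror the agent example later in this guide):

import { LlmAgent } from "@iqai/adk";

// Sketch: wiring the Quick Start tool into an agent.
// The `tools` option is assumed here; check the ADK-TS agent docs for the exact shape.
const agent = new LlmAgent({
  name: "gcs_agent",
  description: "Agent that can inspect Google Cloud Storage buckets",
  model: "gemini-2.0-flash",
  tools: [new CloudStorageTool()],
});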

Authentication

Google Cloud tools require authentication to access services. ADK-TS supports three authentication methods:

  • 🔑 Application Default Credentials - automatic credential discovery for development and production environments
  • 🔐 Service Accounts - server-to-server authentication for automated workloads and background services
  • 🎯 OAuth 2.0 - user authentication for accessing user-specific resources with consent

Application Default Credentials (ADC)

ADC automatically discovers credentials based on your environment. This is the recommended approach for most use cases.

How It Works

ADC checks for credentials in this order:

  1. GOOGLE_APPLICATION_CREDENTIALS environment variable pointing to a service account key
  2. User credentials from gcloud auth application-default login (local development)
  3. Attached service account (when running on Google Cloud services)
  4. Workload Identity (when running on GKE with Workload Identity enabled)
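
To see which credentials ADC resolves to in a given environment, here is a small sketch using google-auth-library (the same library used in the service account example below):

import { GoogleAuth } from "google-auth-library";

// Sketch: let ADC walk the discovery chain above and report what it found
const auth = new GoogleAuth({
  scopes: ["https://www.googleapis.com/auth/cloud-platform"],
});

const client = await auth.getClient();
const projectId = await auth.getProjectId();

console.log(`Authenticated as ${client.constructor.name} for project ${projectId}`);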

Setup for Development

# Install Google Cloud CLI
# Visit: https://cloud.google.com/sdk/docs/install

# Authenticate with your Google account
gcloud auth application-default login

# Set your default project
gcloud config set project YOUR_PROJECT_ID

# Verify setup
gcloud auth application-default print-access-token

Using ADC in Tools

import { BaseTool } from "@iqai/adk";
import { Storage } from "@google-cloud/storage";

class StorageTool extends BaseTool {
  private storage: Storage;

  constructor() {
    super({
      name: "cloud_storage",
      description: "Manage files in Google Cloud Storage",
    });

    // Storage client automatically uses ADC
    this.storage = new Storage();
  }

  async runAsync(args: {
    bucket: string;
    operation: string;
    fileName?: string;
  }) {
    const bucket = this.storage.bucket(args.bucket);

    switch (args.operation) {
      case "list":
        const [files] = await bucket.getFiles();
        return { files: files.map((f) => f.name) };

      case "delete":
        if (!args.fileName) return { error: "fileName required" };
        await bucket.file(args.fileName).delete();
        return { success: true };

      default:
        return { error: "Unknown operation" };
    }
  }
}

Service Account Authentication

Service accounts provide server-to-server authentication for production workloads and automated processes.

Store service account keys securely. Grant only the minimum permissions needed and rotate keys regularly in production.

Setup

  1. Create a service account in the Google Cloud Console
  2. Assign IAM roles (e.g., roles/storage.objectAdmin for Cloud Storage access)
  3. Create and download a JSON key file
  4. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable:

export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"

Using Service Accounts

import { BaseTool } from "@iqai/adk";
import { GoogleAuth } from "google-auth-library";

class PubSubTool extends BaseTool {
  private auth: GoogleAuth;

  constructor() {
    super({
      name: "publish_message",
      description: "Publishes messages to Google Cloud Pub/Sub",
    });

    this.auth = new GoogleAuth({
      keyFilename: process.env.GOOGLE_APPLICATION_CREDENTIALS,
      scopes: ["https://www.googleapis.com/auth/pubsub"],
    });
  }

  async runAsync(args: { projectId: string; topic: string; message: string }) {
    try {
      const authClient = await this.auth.getClient();
      const url = `https://pubsub.googleapis.com/v1/projects/${args.projectId}/topics/${args.topic}:publish`;

      const response = await authClient.request<{ messageIds: string[] }>({
        url,
        method: "POST",
        data: {
          messages: [{ data: Buffer.from(args.message).toString("base64") }],
        },
      });

      return {
        success: true,
        messageId: response.data.messageIds[0],
      };
    } catch (error) {
      const message = error instanceof Error ? error.message : String(error);

      return {
        success: false,
        error: `Failed to publish: ${message}`,
      };
    }
  }
}

OAuth 2.0 User Authentication

OAuth 2.0 enables agents to access user-specific resources with user consent. Use this when your agent needs to access personal user data like Google Drive files or Calendar events.

OAuth 2.0 requires a web server to handle the authorization and redirect flow. For server-to-server scenarios, consider using service accounts instead.

Basic OAuth 2.0 Pattern

import { OAuth2Client } from "google-auth-library";

const oauth2Client = new OAuth2Client(
  process.env.GOOGLE_CLIENT_ID,
  process.env.GOOGLE_CLIENT_SECRET,
  process.env.GOOGLE_REDIRECT_URI
);

const authorizationCode = process.env.AUTHORIZATION_CODE ?? "";

// Generate authorization URL
const authUrl = oauth2Client.generateAuthUrl({
  access_type: "offline",
  scope: ["https://www.googleapis.com/auth/drive.readonly"],
});

// After user authorizes, exchange code for tokens
const { tokens } = await oauth2Client.getToken(authorizationCode);
oauth2Client.setCredentials(tokens);

// Use in API requests
const response = await oauth2Client.request({
  url: "https://www.googleapis.com/drive/v3/files",
});

For complete OAuth 2.0 implementation details, see the Google Auth Library documentation.
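
Once the client holds user tokens, the same custom-tool pattern from earlier sections applies. Here is a minimal sketch, assuming a hypothetical list_drive_files tool that wraps the standard Drive v3 files endpoint:

import { BaseTool } from "@iqai/adk";
import { OAuth2Client } from "google-auth-library";

class DriveListTool extends BaseTool {
  constructor(private oauth2Client: OAuth2Client) {
    super({
      name: "list_drive_files",
      description: "Lists files in the authorized user's Google Drive",
    });
  }

  async runAsync(args: { pageSize?: number }) {
    // The client must already hold user tokens (oauth2Client.setCredentials(tokens))
    const response = await this.oauth2Client.request<{
      files: { id: string; name: string }[];
    }>({
      url: "https://www.googleapis.com/drive/v3/files",
      params: { pageSize: args.pageSize ?? 10 },
    });

    return { files: response.data.files };
  }
}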

Building Custom Tools

Create custom tools for any Google Cloud service by extending BaseTool and using Google Cloud client libraries. These examples show the pattern for building tools that interact with Google Cloud APIs.

Cloud Storage Tool

Example tool for performing Cloud Storage operations:

import { BaseTool } from "@iqai/adk";
import { Storage } from "@google-cloud/storage";

class CloudStorageTool extends BaseTool {
  private storage: Storage;

  constructor() {
    super({
      name: "cloud_storage",
      description: "Perform Google Cloud Storage operations",
    });

    this.storage = new Storage();
  }

  async runAsync(args: {
    operation: "upload" | "download" | "list" | "delete";
    bucket: string;
    fileName?: string;
    localPath?: string;
    prefix?: string;
  }) {
    try {
      const bucket = this.storage.bucket(args.bucket);

      switch (args.operation) {
        case "upload":
          if (!args.fileName || !args.localPath) {
            return { error: "fileName and localPath required for upload" };
          }
          await bucket.upload(args.localPath, { destination: args.fileName });
          return { success: true, message: `Uploaded ${args.fileName}` };

        case "download":
          if (!args.fileName || !args.localPath) {
            return { error: "fileName and localPath required for download" };
          }
          await bucket.file(args.fileName).download({
            destination: args.localPath,
          });
          return { success: true, message: `Downloaded ${args.fileName}` };

        case "list":
          const [files] = await bucket.getFiles({ prefix: args.prefix });
          return {
            success: true,
            files: files.map((f) => ({ name: f.name, size: f.metadata.size })),
          };

        case "delete":
          if (!args.fileName) {
            return { error: "fileName required for delete" };
          }
          await bucket.file(args.fileName).delete();
          return { success: true, message: `Deleted ${args.fileName}` };

        default:
          return { error: `Unknown operation: ${args.operation}` };
      }
    } catch (error) {
      const message = error instanceof Error ? error.message : String(error);

      return {
        success: false,
        error: `Cloud Storage error: ${message}`,
      };
    }
  }
}

Vertex AI Tool

Example tool for making predictions with Vertex AI models:

import { BaseTool } from "@iqai/adk";
import { helpers, PredictionServiceClient } from "@google-cloud/aiplatform";

class VertexAIPredictionTool extends BaseTool {
  private client: PredictionServiceClient;

  constructor() {
    super({
      name: "vertex_ai_prediction",
      description: "Make predictions using deployed Vertex AI models",
    });

    this.client = new PredictionServiceClient();
  }

  async runAsync(args: { endpoint: string; instances: any[] }) {
    try {
      // endpoint is the full resource name:
      // projects/PROJECT/locations/LOCATION/endpoints/ENDPOINT_ID
      const [response] = await this.client.predict({
        endpoint: args.endpoint,
        // The Vertex AI API expects google.protobuf.Value instances, not plain JSON
        instances: args.instances.map((instance) => helpers.toValue(instance)!),
      });

      return {
        success: true,
        predictions: response.predictions,
      };
    } catch (error) {
      const message = error instanceof Error ? error.message : String(error);

      return {
        success: false,
        error: `Vertex AI prediction failed: ${message}`,
      };
    }
  }
}

GcsArtifactService

The GcsArtifactService stores agent artifacts in Google Cloud Storage with automatic versioning.

Setup

# Install the Google Cloud Storage client library
npm install @google-cloud/storage

# Create a bucket
gsutil mb gs://my-artifacts-bucket

# Set up authentication (choose one)
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
# OR
gcloud auth application-default login

Basic Usage

import { GcsArtifactService, LlmAgent } from "@iqai/adk";
import { Part } from "@google/genai";

const artifactService = new GcsArtifactService("my-artifacts-bucket");

// Provide the base64 image data (replace the placeholder with real data or load from fs)
const imageBase64: string = "<BASE64_ENCODED_IMAGE_DATA>";

// Save an artifact
const artifact: Part = {
  inlineData: {
    data: imageBase64,
    mimeType: "image/jpeg",
  },
};

const version = await artifactService.saveArtifact({
  appName: "photo_app",
  userId: "user123",
  sessionId: "session456",
  filename: "profile.jpg",
  artifact,
});

// Load an artifact
const loadedArtifact = await artifactService.loadArtifact({
  appName: "photo_app",
  userId: "user123",
  sessionId: "session456",
  filename: "profile.jpg",
  // version: 2, // Optional: load specific version
});

// Use with an agent
const agent = new LlmAgent({
  name: "image_processor_agent",
  description: "Agent for processing images",
  model: "gemini-2.0-flash",
  artifactService: artifactService,
  // ... other agent configuration
});

Storage Structure

Artifacts are organized with versioning support:

my-artifacts-bucket/
├── appName/
│   ├── userId/
│   │   ├── sessionId/
│   │   │   ├── file1.jpg/
│   │   │   │   ├── 0    # Version 0
│   │   │   │   └── 1    # Version 1
│   │   │   └── data.json/
│   │   │       └── 0
│   │   └── user/         # User namespace for cross-session artifacts
│   │       └── user:profile.png/
│   │           ├── 0
│   │           └── 1

API Reference

| Method | Description |
| --- | --- |
| saveArtifact(args) | Saves an artifact and returns the version number |
| loadArtifact(args) | Loads an artifact (latest version or specific version) |
| listArtifactKeys(args) | Lists all artifact filenames for a session |
| deleteArtifact(args) | Deletes an artifact and all its versions |
| listVersions(args) | Lists all versions of a specific artifact |

All methods take appName, userId, and sessionId; methods that operate on a single artifact (such as saveArtifact, loadArtifact, deleteArtifact, and listVersions) also take a filename.
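
As a sketch of the remaining methods, reusing the identifiers from the Basic Usage example (argument shapes follow the table above; verify exact signatures against your ADK-TS version):

// List every artifact filename stored for this session
const keys = await artifactService.listArtifactKeys({
  appName: "photo_app",
  userId: "user123",
  sessionId: "session456",
});

// Inspect the version history of one artifact
const versions = await artifactService.listVersions({
  appName: "photo_app",
  userId: "user123",
  sessionId: "session456",
  filename: "profile.jpg",
});

// Remove the artifact and all of its versions
await artifactService.deleteArtifact({
  appName: "photo_app",
  userId: "user123",
  sessionId: "session456",
  filename: "profile.jpg",
});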

Security Best Practices

Always follow Google Cloud security best practices for credential management and access control.

  • Credential Protection - Never commit credentials to source code or expose them in client-side code
  • Least Privilege - Grant only the minimum permissions needed for each service account. Use scoped authentication to limit access to specific services
  • Regular Rotation - Rotate service account keys regularly in production environments
  • Monitoring - Monitor credential usage and set up alerts for unusual activity patterns
  • Secure Storage - Store service account keys in secure secret management systems, not in environment files committed to version control
  • Environment Separation - Use different service accounts for development, staging, and production environments
