# GCP Cloud Monitoring Integration

Forward Google Cloud metrics to Qorrelate.

## Overview

Google Cloud Monitoring (formerly Stackdriver Monitoring) provides visibility into the performance of your cloud applications. This integration forwards GCP metrics to Qorrelate, enabling unified metrics analysis alongside logs and traces from all your infrastructure.
## Prerequisites

- GCP project with the Cloud Monitoring API enabled
- Service account with the `monitoring.viewer` role
- Service account key (JSON) for authentication
- Your Qorrelate API endpoint and organization ID
## Configuration Steps

### 1. Create Service Account

Create a service account with monitoring read permissions:

```bash
# Create service account
gcloud iam service-accounts create qorrelate-metrics-reader \
  --display-name="Qorrelate Metrics Reader"

# Grant monitoring viewer role
gcloud projects add-iam-policy-binding YOUR_PROJECT \
  --member="serviceAccount:qorrelate-metrics-reader@YOUR_PROJECT.iam.gserviceaccount.com" \
  --role="roles/monitoring.viewer"

# Create and download key
gcloud iam service-accounts keys create ~/qorrelate-sa-key.json \
  --iam-account=qorrelate-metrics-reader@YOUR_PROJECT.iam.gserviceaccount.com
```
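Before deploying anything against the downloaded key, it can help to sanity-check that the file is a usable service-account key. A minimal standard-library sketch (the field names are the standard service-account key schema; `validate_sa_key` is a hypothetical helper, not part of any Google SDK):

```python
import json

# Fields every service-account key JSON is expected to contain.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}


def validate_sa_key(path):
    """Check that a downloaded service-account key looks usable.

    Returns the service account email on success, raises ValueError otherwise.
    """
    with open(path) as f:
        key = json.load(f)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key file missing fields: {sorted(missing)}")
    if key.get("type") != "service_account":
        raise ValueError("not a service-account key")
    return key["client_email"]
```

Running this against `~/qorrelate-sa-key.json` should print the `qorrelate-metrics-reader@...` email created above.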
### 2. Deploy Metrics Exporter

Create a Cloud Function or Cloud Run service to export metrics:

```python
from google.cloud import monitoring_v3
from google.protobuf.json_format import MessageToDict
import os
import requests
from datetime import datetime, timedelta, timezone


def export_metrics(request):
    """Export GCP metrics to Qorrelate."""
    client = monitoring_v3.MetricServiceClient()
    project_name = f"projects/{os.environ['GCP_PROJECT_ID']}"

    # Time range: last 5 minutes
    now = datetime.now(timezone.utc)
    interval = monitoring_v3.TimeInterval({
        'end_time': {'seconds': int(now.timestamp())},
        'start_time': {'seconds': int((now - timedelta(minutes=5)).timestamp())}
    })

    # Metrics to export
    metric_types = [
        'compute.googleapis.com/instance/cpu/utilization',
        'compute.googleapis.com/instance/disk/read_bytes_count',
        'compute.googleapis.com/instance/disk/write_bytes_count',
        'compute.googleapis.com/instance/network/received_bytes_count',
        'compute.googleapis.com/instance/network/sent_bytes_count',
        'run.googleapis.com/request_count',
        'run.googleapis.com/request_latencies',
        'cloudfunctions.googleapis.com/function/execution_count',
        'cloudfunctions.googleapis.com/function/execution_times',
    ]

    all_metrics = []
    for metric_type in metric_types:
        try:
            results = client.list_time_series(
                request={
                    'name': project_name,
                    'filter': f'metric.type = "{metric_type}"',
                    'interval': interval,
                    'view': monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL
                }
            )
            for series in results:
                series_dict = MessageToDict(series._pb)
                metric_name = series_dict.get('metric', {}).get('type', '').split('/')[-1]
                labels = series_dict.get('metric', {}).get('labels', {})
                resource_labels = series_dict.get('resource', {}).get('labels', {})
                for point in series_dict.get('points', []):
                    value = point.get('value', {})
                    # Extract the value. MessageToDict renders int64Value as a
                    # string, and a legitimate 0.0 doubleValue is falsy, so
                    # check each key explicitly instead of chaining `or`.
                    if 'doubleValue' in value:
                        metric_value = value['doubleValue']
                    elif 'int64Value' in value:
                        metric_value = int(value['int64Value'])
                    else:
                        metric_value = value.get('distributionValue', {}).get('mean', 0)
                    all_metrics.append({
                        'name': f"gcp_{metric_name}",
                        'value': float(metric_value or 0),
                        'timestamp': point.get('interval', {}).get('endTime'),
                        'labels': {
                            'source': 'gcp-cloud-monitoring',
                            'project_id': resource_labels.get('project_id'),
                            'zone': resource_labels.get('zone'),
                            'instance_id': resource_labels.get('instance_id'),
                            **labels
                        }
                    })
        except Exception as e:
            print(f"Error fetching {metric_type}: {e}")

    if all_metrics:
        # Forward to Qorrelate
        headers = {
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['QORRELATE_API_KEY']}",
            "X-Organization-Id": os.environ["QORRELATE_ORG_ID"]
        }
        response = requests.post(
            f"{os.environ['QORRELATE_ENDPOINT']}/v1/metrics",
            headers=headers,
            json={"metrics": all_metrics},
            timeout=30
        )
        return f"Exported {len(all_metrics)} metrics, status: {response.status_code}"
    return "No metrics to export"
```
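With many instances and a short window, a single POST payload can grow large. If you hit request-size limits, a simple batching helper keeps each POST bounded (a sketch; the 500-point default is an arbitrary assumption, not a documented Qorrelate limit):

```python
def batch_metrics(metrics, batch_size=500):
    """Yield successive slices of the metrics list for separate POSTs."""
    for i in range(0, len(metrics), batch_size):
        yield metrics[i:i + batch_size]
```

Each batch would then be sent in its own `requests.post` call instead of the single POST shown above.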
### 3. Schedule Metric Export

Use Cloud Scheduler to run the exporter every minute:

```bash
# Deploy the function
gcloud functions deploy export-gcp-metrics \
  --runtime python311 \
  --trigger-http \
  --allow-unauthenticated \
  --set-env-vars GCP_PROJECT_ID=your-project,QORRELATE_API_KEY=your_key,QORRELATE_ORG_ID=your_org,QORRELATE_ENDPOINT=https://qorrelate.io

# Create Cloud Scheduler job
gcloud scheduler jobs create http qorrelate-metrics-export \
  --schedule="* * * * *" \
  --uri="https://REGION-PROJECT.cloudfunctions.net/export-gcp-metrics" \
  --http-method=GET
```

Note that `--allow-unauthenticated` makes the function publicly invocable; for production, prefer requiring authentication and having Cloud Scheduler invoke the function with an OIDC token.
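One consequence of this schedule: the exporter queries a 5-minute window but fires every minute, so the same points are fetched repeatedly. If Qorrelate does not deduplicate on ingest (an assumption worth checking against its docs), a seen-set keyed on name, timestamp, and labels can drop repeats before sending (`dedupe` is a hypothetical helper; the set must persist across invocations, e.g. as a module-level global in a warm function instance):

```python
def dedupe(metrics, seen):
    """Drop points already sent; `seen` is a set that persists across calls."""
    fresh = []
    for m in metrics:
        # Labels with None values are skipped so the key stays hashable-stable.
        key = (m["name"], m["timestamp"], tuple(sorted(
            (k, v) for k, v in m["labels"].items() if v is not None)))
        if key not in seen:
            seen.add(key)
            fresh.append(m)
    return fresh
```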
## Available GCP Metrics

| Service | Key Metrics |
|---|---|
| Compute Engine | CPU utilization, disk I/O, network I/O |
| Cloud Run | Request count, latency, instance count |
| Cloud Functions | Execution count, execution time, memory usage |
| GKE | Pod CPU/memory, container restarts, node health |
| Cloud SQL | CPU, memory, connections, queries |
| Cloud Storage | Request count, bytes transferred, object count |
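These metric types arrive in Qorrelate under flattened names, mirroring the exporter's `split('/')[-1]` naming. A sketch of that mapping, with one caveat worth knowing:

```python
def qorrelate_name(metric_type):
    """Map a GCP metric type to the exporter's Qorrelate name.

    Caveat: distinct GCP types that end in the same segment (e.g. a
    'request_count' from two different services) collapse to the same
    name; use the attached labels to tell them apart.
    """
    return "gcp_" + metric_type.rsplit("/", 1)[-1]
```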
## Using OpenTelemetry Collector (Alternative)

For a more robust solution, use the OpenTelemetry Collector with the GCP receiver:

```yaml
# otel-collector-config.yaml
receivers:
  googlecloudmonitoring:
    collection_interval: 60s
    project_id: your-project-id
    metrics_list:
      - metric_name_prefix: compute.googleapis.com/instance/cpu
      - metric_name_prefix: compute.googleapis.com/instance/network
      - metric_name_prefix: run.googleapis.com

exporters:
  otlphttp:
    endpoint: https://qorrelate.io/v1/metrics
    headers:
      Authorization: Bearer ${QORRELATE_API_KEY}
      X-Organization-Id: ${QORRELATE_ORG_ID}

service:
  pipelines:
    metrics:
      receivers: [googlecloudmonitoring]
      exporters: [otlphttp]
```
## Verifying the Integration

1. Create the service account and download credentials
2. Deploy the metrics export function
3. Set up Cloud Scheduler for periodic execution
4. Wait a few minutes for metrics to accumulate
5. View metrics in Qorrelate's Metrics Explorer
6. Filter by the label `source:gcp-cloud-monitoring`
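The filter in the last step is an equality match on the `source` label that the exporter attaches to every point. As plain Python, it amounts to (a sketch of the semantics only, not Qorrelate's implementation):

```python
def filter_by_label(metrics, key, value):
    """Keep only metrics whose labels[key] equals value."""
    return [m for m in metrics if m.get("labels", {}).get(key) == value]
```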
## 💡 Pro Tip

Use metric descriptors to discover the metrics available in your project:

```bash
gcloud monitoring metrics-descriptors list --filter="metric.type:compute"
```