

This guide provides instructions for integrating Traceloop with the TrueFoundry AI Gateway to export OpenTelemetry traces.
Traceloop ingests traces only via OTLP. LLM metrics such as token usage, latency, and cost are derived from trace span attributes and surfaced in the Traceloop dashboard — they do not require a separate metrics exporter. Keep the Otel Metrics Exporter disabled when using Traceloop.

What is Traceloop?

Traceloop is an LLM observability platform built on OpenLLMetry, its open-source OpenTelemetry-based instrumentation layer. It ingests OTLP traces from LLM applications and provides dashboards for monitoring prompt performance, token usage, latency, and model behaviour across environments. Traceloop also supports prompt versioning, regression testing, and environment-based API key management.

Key Features of Traceloop

  • LLM Trace Observability: Captures full request and response traces across OpenAI, Anthropic, and other LLM providers with standard gen_ai semantic conventions.
  • Multi-Environment API Keys: Supports separate API keys for Development, Staging, and Production environments to keep telemetry streams isolated.
  • OpenTelemetry Collector Support: Accepts traces forwarded from any OTLP/HTTP-compatible collector, making it easy to fan out from existing OTel pipelines.
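The collector fan-out mentioned above can be sketched as an OpenTelemetry Collector pipeline. The endpoint, encoding, and Authorization header below come from this guide; the receiver setup, the exporter name `otlphttp/traceloop`, and the `TRACELOOP_API_KEY` environment variable are illustrative assumptions, not required names:

```yaml
# Illustrative Collector config that forwards incoming OTLP traces to Traceloop.
receivers:
  otlp:
    protocols:
      http:

exporters:
  otlphttp/traceloop:
    traces_endpoint: https://api.traceloop.com/v1/traces
    encoding: proto
    headers:
      Authorization: "Bearer ${env:TRACELOOP_API_KEY}"

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlphttp/traceloop]
```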

Prerequisites

Before integrating Traceloop with TrueFoundry, ensure you have:
  1. TrueFoundry Account: Create a TrueFoundry account and follow the instructions in our Gateway Quick Start Guide.
  2. Traceloop Account: Sign up at traceloop.com and have access to the Traceloop dashboard to generate an API key.

Integration Steps

Step 1: Generate a Traceloop API Key

  1. Log in to your Traceloop dashboard.
  2. In the left-hand navigation, click Environments.
  3. Select the environment you want to send data to (Development, Staging, or Production).
  4. Click Generate API Key.
  5. Click Copy Key immediately — API keys are only shown once and are not stored by Traceloop.
(Screenshot: Traceloop Environments Management page showing the Generate API Key button)
If you lose your API key, revoke it from the Environments page and generate a new one. There is no way to retrieve an existing key.
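Because the key is shown only once, a common pattern is to store it in a secret manager or environment variable at deploy time and build the header value from it. A minimal Python sketch; the `TRACELOOP_API_KEY` variable name and the helper itself are assumptions for illustration, not a Traceloop or TrueFoundry API:

```python
import os

def traceloop_auth_header(env=os.environ):
    """Build the OTLP Authorization header value for Traceloop from an
    environment variable, failing fast if the key was never configured."""
    key = env.get("TRACELOOP_API_KEY")
    if not key:
        raise RuntimeError(
            "TRACELOOP_API_KEY is not set; generate a key from the "
            "Traceloop Environments page and store it before deploying"
        )
    return f"Bearer {key}"
```

The returned string is exactly what goes into the Header Value field in the next step.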
Step 2: Configure OTEL Export in TrueFoundry

  1. Go to AI Gateway → Controls → Settings in the TrueFoundry dashboard.
  2. Scroll down to the OTEL Config section and click the edit (✏️) button.
(Screenshot: TrueFoundry AI Gateway Settings page showing the OTEL Config section)
  3. Enable the Otel Traces Exporter Configuration toggle and fill in:

| Field        | Value                               |
| ------------ | ----------------------------------- |
| Toggle       | Enabled                             |
| Protocol     | HTTP Configuration                  |
| Endpoint     | https://api.traceloop.com/v1/traces |
| Encoding     | Proto                               |
| Header Key   | Authorization                       |
| Header Value | Bearer <your-traceloop-api-key>     |

(Screenshot: TrueFoundry OTEL Traces Exporter Configuration showing the endpoint and Authorization header for Traceloop)

  4. Click Save to apply the configuration.
Step 3: Verify the Integration

  1. Make a request through the TrueFoundry AI Gateway.
  2. Log in to your Traceloop dashboard and navigate to the Traces section.
  3. Confirm that traces with the service name tfy-llm-gateway are appearing.
(Screenshot: Traceloop traces page showing LLM call traces from TrueFoundry AI Gateway)
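A test request from step 1 of the verification can be assembled with the Python standard library. The gateway base URL, token, and `provider/model` naming below are placeholders for your own deployment, and the OpenAI-style `/v1/chat/completions` path is an assumption about your gateway setup:

```python
import json
from urllib import request

def build_test_request(gateway_base_url, gateway_token, model, prompt):
    """Assemble (but do not send) an OpenAI-style chat-completion request
    aimed at the gateway; returns a urllib Request object."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return request.Request(
        url=f"{gateway_base_url}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {gateway_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it:
#   response = request.urlopen(build_test_request(...))
```

After sending one such request, the corresponding trace should appear in the Traceloop Traces view within a few moments.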

Configuration Reference

| Configuration     | Value                               |
| ----------------- | ----------------------------------- |
| Traces Endpoint   | https://api.traceloop.com/v1/traces |
| Protocol          | HTTP                                |
| Encoding          | Proto                               |
| Auth Header Key   | Authorization                       |
| Auth Header Value | Bearer <your-traceloop-api-key>     |
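For teams that template this configuration (e.g. in CI), the reference values above can be checked programmatically. A small sketch; the `check_traceloop_export_config` helper and its field names are ad hoc for this example, not part of any TrueFoundry or Traceloop API:

```python
# Reference values from the Configuration Reference table above.
EXPECTED = {
    "traces_endpoint": "https://api.traceloop.com/v1/traces",
    "protocol": "HTTP",
    "encoding": "Proto",
    "auth_header_key": "Authorization",
}

def check_traceloop_export_config(config):
    """Return a list of human-readable mismatches between `config` and the
    reference values; an empty list means the config looks correct."""
    problems = [
        f"{field}: expected {expected!r}, got {config.get(field)!r}"
        for field, expected in EXPECTED.items()
        if config.get(field) != expected
    ]
    if not str(config.get("auth_header_value", "")).startswith("Bearer "):
        problems.append("auth_header_value must look like 'Bearer <key>'")
    return problems
```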