
This guide provides instructions for integrating Middleware with the TrueFoundry AI Gateway to export OpenTelemetry traces.

What is Middleware?

Middleware provides observability and monitoring for cloud-native workloads. Connecting the TrueFoundry AI Gateway lets you ingest LLM and gateway spans into your Middleware environment for analysis alongside the rest of your stack.

Key Features of Middleware

  • Unified observability: Consolidate traces, logs, and infrastructure signals in one place
  • OpenTelemetry ingestion: Receive OTLP spans from gateways and SDKs compatible with OTLP HTTP
  • Operational visibility: Correlate latency, errors, and usage patterns across services and AI workloads

Prerequisites

Before integrating Middleware with TrueFoundry, ensure you have:
  1. TrueFoundry account: Create a TrueFoundry account and follow the instructions in our Gateway Quick Start Guide.
  2. Middleware access: Log in to Middleware with your work email (the account your organization uses for Middleware).
  3. Middleware API key: Obtain the API key Middleware issued for OTLP trace ingestion (typically from Middleware project or organization settings — use the credential they designate for traces).
Middleware Settings with Ingestion key tab and API key list including Create New Key

Integration Steps

TrueFoundry AI Gateway supports exporting OpenTelemetry traces to Middleware using your tenant OTLP ingest URL.

Step 1: Collect your Middleware API key

  1. Log in to Middleware with your work email and open your API key or organization settings (your admin may point you to the credential used for trace ingestion).
  2. Copy the secret and store it in a credential manager until you paste it into TrueFoundry gateway settings — you typically cannot retrieve the full secret again once generated.
If you are unsure which key fits this integration, confirm with Middleware support — this flow expects the key mapped to OTLP ingestion at https://<your-domain>.middleware.io:443/v1/traces (replace <your-domain> with your Middleware hostname prefix).

Step 2: Configure OTEL Export in TrueFoundry

  1. In the TrueFoundry dashboard, go to AI Engineering → Settings → OTEL Config (under Organisation, in the AI Gateway section).
TrueFoundry AI Engineering and Settings navigation to Organisation OTEL Config with Middleware traces endpoint and authorization header
  2. Click Edit on the OTEL Config section to open the exporter form (if it is not already open).
  3. Enable the OTEL Traces Exporter Configuration toggle.
  4. Select HTTP Configuration.
  5. Enter the Middleware traces endpoint: https://<your-domain>.middleware.io:443/v1/traces
  6. Set Encoding to Proto.
TrueFoundry OTEL Traces Exporter Configuration with Middleware HTTPS traces endpoint, Proto encoding, and Authorization header set to the Middleware API key
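As a sanity check before saving, the form's values can be assembled and validated with a small sketch like the one below. The function and field names here are illustrative only, not a TrueFoundry or Middleware API; `domain_prefix` stands in for your Middleware hostname prefix.

```python
# Illustrative sketch: validate Middleware OTLP export settings
# before pasting them into the TrueFoundry OTEL Config form.
from urllib.parse import urlparse

def build_export_config(domain_prefix: str, api_key: str) -> dict:
    """Build the exporter settings as entered in the form (names are ours)."""
    endpoint = f"https://{domain_prefix}.middleware.io:443/v1/traces"
    parsed = urlparse(endpoint)
    # The traces URL must be HTTPS and end in the OTLP traces path.
    assert parsed.scheme == "https" and parsed.path == "/v1/traces"
    return {
        "endpoint": endpoint,
        "protocol": "HTTP",
        "encoding": "Proto",
        "headers": {"Authorization": api_key},  # raw key, no Bearer prefix
    }

cfg = build_export_config("acme", "example-key")
print(cfg["endpoint"])
```

Running this with your own hostname prefix confirms the endpoint string is well-formed before you commit it to the gateway settings.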

Step 3: Configure Headers

Middleware expects authentication through the Authorization header only. Paste the raw API key as the header value (do not add a Bearer prefix unless Middleware explicitly instructed you otherwise).
Header: Authorization
Value: <YOUR_MIDDLEWARE_API_KEY>
Click Save to apply your configuration.

Step 4: Verify the Integration

  1. Send a few requests through the TrueFoundry AI Gateway.
  2. Navigate to the Monitor section in TrueFoundry to verify traces are being generated.
  3. Open Middleware and drill into traces for your project or environment to confirm spans sourced from TrueFoundry are appearing.
Middleware Traces dashboard showing gateway spans, metrics, and trace list from TrueFoundry

Configuration Options

Middleware Endpoint

Middleware exposes ingest configuration similar to the following (use your tenant hostname):
Traces Endpoint: https://<your-domain>.middleware.io:443/v1/traces
Protocol: HTTP
Encoding: Proto
Authentication: Authorization header set to your Middleware API key (no extra headers required)
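If you also route other application telemetry through an OpenTelemetry Collector, the same values map onto the Collector's standard otlphttp exporter. A sketch (your tenant hostname and key are placeholders; pipeline wiring depends on your existing Collector config):

```yaml
exporters:
  otlphttp/middleware:
    # Full traces URL; <your-domain> is your Middleware hostname prefix.
    traces_endpoint: https://<your-domain>.middleware.io:443/v1/traces
    headers:
      Authorization: <YOUR_MIDDLEWARE_API_KEY>  # raw key, no Bearer prefix

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlphttp/middleware]
```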