
OpenInference OpenLLMetry Instrumentation

Project description

OpenInference OpenLLMetry (Traceloop)

Python auto-instrumentation library for OpenLLMetry. This library converts OpenLLMetry traces to OpenInference, an OpenTelemetry-compatible format, so you can view those traces in Arize Phoenix.

Installation

pip install openinference-instrumentation-openllmetry
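To confirm the package is installed, you can check its version with the standard library; a minimal sketch (the distribution name matches the pip package above):

from importlib.metadata import version

# Print the installed version of the instrumentation package
print(version("openinference-instrumentation-openllmetry"))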

Quickstart

This quickstart shows you how to view your OpenLLMetry traces in Phoenix.

Install the required packages:

pip install arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp opentelemetry-instrumentation-openai

Start Phoenix in the background as a collector. By default, it listens on http://localhost:6006, and you can open the app in your browser at the same address. (Phoenix does not send data over the internet; it runs entirely on your machine.)

phoenix serve
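Alternatively, if you prefer to launch Phoenix from Python (for example, in a notebook), the arize-phoenix package exposes launch_app; a minimal sketch:

import phoenix as px

# Start a local Phoenix instance in this process and print its URL
session = px.launch_app()
print(session.url)  # typically http://localhost:6006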

Here's a simple example that demonstrates how to convert OpenLLMetry traces into OpenInference and view them in Phoenix:

import os
import grpc
import openai
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from phoenix.otel import register
from openinference.instrumentation.openllmetry import OpenInferenceSpanProcessor
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

# Set up the tracer provider
tracer_provider = register(
    project_name="default",  # Phoenix project name
)

# Convert OpenLLMetry spans to OpenInference before they are exported
tracer_provider.add_span_processor(OpenInferenceSpanProcessor())

tracer_provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            # If using Phoenix Cloud, change this to your Phoenix Cloud endpoint
            # (space -> Settings -> Endpoint/Hostname); see the environment-variable
            # sketch after this example
            endpoint="http://localhost:4317",
            headers={},
            compression=grpc.Compression.Gzip,  # pass the enum, not a string
        )
    )
)

# Instrument the OpenAI client library
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Define and invoke your OpenAI model
client = openai.OpenAI()

messages = [
    {"role": "user", "content": "What is the national food of Yemen?"}
]

response = client.chat.completions.create(
    model="gpt-4",
    messages=messages,
)
print(response.choices[0].message.content)

# Now view your converted OpenLLMetry traces in Phoenix!
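If you are sending traces to Phoenix Cloud rather than a local collector, you can configure the endpoint and credentials through environment variables instead of hard-coding them in the exporter. A minimal sketch, assuming Phoenix's documented PHOENIX_COLLECTOR_ENDPOINT and PHOENIX_CLIENT_HEADERS variables (substitute the real values from your Phoenix Cloud space's Settings page):

import os
from phoenix.otel import register

# Assumed configuration; copy the actual endpoint and API key from your
# Phoenix Cloud space (Settings -> Endpoint/Hostname)
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"
os.environ["PHOENIX_CLIENT_HEADERS"] = "api_key=YOUR_PHOENIX_API_KEY"

# register() reads these variables, so no explicit endpoint argument is needed
tracer_provider = register(project_name="default")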

The main example above:

  1. Uses OpenLLMetry's OpenAIInstrumentor to instrument the application.
  2. Defines a simple OpenAI chat completion and runs a query.
  3. Converts the resulting spans to OpenInference and exports them to Phoenix via the span processors.

The traces will be visible in the Phoenix UI at http://localhost:6006.
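Beyond the UI, you can pull the exported spans back into Python to confirm they arrived. A minimal sketch using the Phoenix client (the exact client API may vary slightly across Phoenix versions):

import phoenix as px

# Fetch the spans recorded under the "default" project as a pandas DataFrame
df = px.Client().get_spans_dataframe(project_name="default")
print(df.head())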

Download files

Download the file for your platform.

Source Distribution

File: openinference_instrumentation_openllmetry-0.1.2.tar.gz

Algorithm    Hash digest
SHA256       13e4776a6bff0e16a2e4ddb6f8e32fa00e23897b2c21ae58af814455effa4a6a
MD5          33da99258dcdeac6ff42af4da459e389
BLAKE2b-256  216852a20ee12fb97853b09411e6074bd77d56ddbe5f45e81243da6f4aee5077

Built Distribution

File: openinference_instrumentation_openllmetry-0.1.2-py3-none-any.whl

Algorithm    Hash digest
SHA256       e623826f83285cb7de1d7b2fe6d725c402c9cadc75d432e3f87433c1392145b5
MD5          33f8c176362f25052bd400a5d960c7b0
BLAKE2b-256  4e8400c546c7dd7dcb23488ca7f2a84fe38ee9631630e5a621ac849d1e96ce58
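To check a downloaded file against the SHA256 digests above before installing, you can hash it locally with the standard library; a minimal sketch (the filename assumes the source distribution above, saved to the current directory):

import hashlib

# Published SHA256 digest for the 0.1.2 source distribution (from the table above)
EXPECTED = "13e4776a6bff0e16a2e4ddb6f8e32fa00e23897b2c21ae58af814455effa4a6a"

with open("openinference_instrumentation_openllmetry-0.1.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Abort if the file does not match the published digest
assert digest == EXPECTED, "SHA256 mismatch: do not install this file"
print("SHA256 verified")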
