OpenInference Groq Instrumentation

A Python auto-instrumentation library for the Groq package.

This package implements OpenInference tracing for both the Groq and AsyncGroq clients.

The traces are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as Arize Phoenix.

Installation

pip install openinference-instrumentation-groq

Quickstart

In your terminal, install the required packages.

pip install openinference-instrumentation-groq groq arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp

You can start Phoenix with the following terminal command:

python -m phoenix.server.main serve

By default, Phoenix listens on http://localhost:6006. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)
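If you prefer to start Phoenix from Python instead of the terminal, the arize-phoenix package also provides a launch_app helper. A minimal sketch, assuming a local script or notebook session:

import phoenix as px

# Launch a local Phoenix instance; by default the UI is served at http://localhost:6006
session = px.launch_app()
print(session.url)

Either way, leave Phoenix running while you execute the example below so it can receive the traces.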

Try the following code in a Python file.

  1. Set up GroqInstrumentor to trace your application and send the traces to Phoenix.
  2. Then, set your Groq API key as an environment variable.
  3. Lastly, create a Groq client, make a request, and view your results in Phoenix at http://localhost:6006!
import os
from groq import Groq
from openinference.instrumentation.groq import GroqInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Configure a tracer provider that exports spans to the Phoenix collector endpoint
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

# Instrument the Groq client library with the tracer provider configured above
GroqInstrumentor().instrument(tracer_provider=tracer_provider)

# Set your Groq API key before creating the client
os.environ["GROQ_API_KEY"] = "YOUR_KEY_HERE"

client = Groq()

# Make a chat completion request; the instrumentor records it as a span
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="llama3-8b-8192",
)

if __name__ == "__main__":
    print(chat_completion.choices[0].message.content)

Now, in the Phoenix UI in your browser, you should see the traces from your Groq application. Click on a trace, and the "Attributes" tab will show in-depth information about the execution.
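The instrumentor also traces the AsyncGroq client, so asynchronous requests show up in Phoenix the same way. A minimal sketch, assuming the same tracer provider setup as above and a GROQ_API_KEY already set in the environment:

import asyncio

from groq import AsyncGroq


async def main() -> None:
    client = AsyncGroq()
    chat_completion = await client.chat.completions.create(
        messages=[{"role": "user", "content": "Explain the importance of low latency LLMs"}],
        model="llama3-8b-8192",
    )
    print(chat_completion.choices[0].message.content)


asyncio.run(main())

If you need to turn tracing off again, for example in tests, calling GroqInstrumentor().uninstrument() should remove the instrumentation.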

