
Tracking LLM Calls and Other Retrieval Calls

Hi,

I'm tracking my LLM calls now and waiting for custom span names. But I was wondering how I can track other calls, like retrieval calls and so on.

I saw this screenshot in the tracing documentation, but there is no mention of how we can do that.
[Attachment: image.png]
Hey @darkprince, tracing is fully supported for Langchain & LlamaIndex at the moment via Portkey's custom callback handler, and it is also supported when you insert a log using our logger endpoint.

We are updating our /chat/completions and other Gateway endpoints to support tracing for all your requests.
I'll share the relevant working snippet with you here once it's live. To confirm, are you experimenting with tracing on raw OpenAI / LLM calls, or using a framework like Langchain, etc.?
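
(For reference, wiring up Portkey's callback handler in a Langchain app looks roughly like the sketch below. The import path `portkey_ai.langchain.LangchainCallbackHandler` and the constructor arguments are assumptions based on common usage; check the Portkey tracing docs for the exact names.)

```python
# Minimal sketch, assuming Portkey exposes its Langchain callback handler as
# portkey_ai.langchain.LangchainCallbackHandler (verify against current docs).
from langchain_openai import ChatOpenAI
from portkey_ai.langchain import LangchainCallbackHandler

# The handler reports each Langchain run (LLM calls, retrievers, chains)
# to Portkey so they show up as spans under a single trace.
portkey_handler = LangchainCallbackHandler(
    api_key="PORTKEY_API_KEY",           # Portkey API key
    metadata={"_user": "darkprince"},    # optional metadata attached to the logs
)

llm = ChatOpenAI(
    api_key="OPENAI_API_KEY",
    callbacks=[portkey_handler],  # attach the handler so calls are traced
)

print(llm.invoke("Hello, world").content)
```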
Okay, cool. The LLM call logs are listed flat; is there any way to get them in a tree view, which would make them easier to track?
You're using Portkey's API to make calls, right?
Yes, not using any frameworks. I'm using the Portkey SDK for all the LLM calls, passing a trace ID and some metadata.
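
(In practice that setup looks roughly like the following sketch. The `with_options` pattern and the `trace_id` / `metadata` parameter names are assumptions about the Portkey Python SDK; confirm them against the SDK reference.)

```python
# Rough sketch of Portkey SDK calls sharing one trace ID, so related requests
# are grouped together in the Portkey logs. Parameter names are assumptions;
# check the Portkey SDK reference for the exact options.
import uuid
from portkey_ai import Portkey

client = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="PROVIDER_VIRTUAL_KEY",  # assumes the provider is set up as a virtual key
)

trace_id = str(uuid.uuid4())  # reuse the same ID across related calls

response = client.with_options(
    trace_id=trace_id,                 # groups this call under the trace
    metadata={"_user": "darkprince"},  # free-form metadata shown in the dashboard
).chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize the retrieved context."}],
)

print(response.choices[0].message.content)
```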
Perfect. Thanks, will get back to you!
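
(On the original question about retrieval calls: the logger endpoint mentioned above is the suggested route for spans that are not LLM calls. The sketch below assumes an insert-log endpoint at `POST https://api.portkey.ai/v1/logs` and a request/response/metadata payload shape; the actual schema and the trace/span field names should be taken from the Portkey Logs API reference.)

```python
# Hypothetical sketch of logging a retrieval step as its own entry via the
# logger endpoint. The payload fields and trace/span keys are assumptions
# used to illustrate the idea; consult the Portkey Logs API docs for the
# real schema.
import time
import requests

PORTKEY_API_KEY = "PORTKEY_API_KEY"
trace_id = "my-shared-trace-id"  # same trace ID used for the LLM calls

start = time.time()
retrieved_docs = ["doc-1 text", "doc-2 text"]  # stand-in for your retriever output
latency_ms = int((time.time() - start) * 1000)

payload = {
    "request": {"body": {"query": "user question"}},   # assumed field names
    "response": {
        "body": {"documents": retrieved_docs},
        "response_time": latency_ms,
    },
    "metadata": {
        "traceId": trace_id,          # assumed key for linking into the trace
        "spanName": "vector-retrieval",
    },
}

resp = requests.post(
    "https://api.portkey.ai/v1/logs",                # assumed endpoint path
    headers={"x-portkey-api-key": PORTKEY_API_KEY},
    json=payload,
)
resp.raise_for_status()
```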