- Track data as it flows through your application
- Track metadata at call time
Tracking nested function calls
LLM-powered applications can contain multiple LLM calls plus additional data-processing and validation logic that is important to monitor. Even in the deeply nested call structures common in many apps, Weave keeps track of the parent-child relationships between nested functions, as long as weave.op()
is added to every function you’d like to track.
Building on the quickstart example, the following code adds logic to count the items returned from the LLM and wraps everything in a higher-level function. The example applies weave.op()
to every function so Weave traces each call, its order, and its parent-child relationships:
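The code block for this example did not survive extraction, so here is a minimal sketch of the pattern. It reuses the quickstart's extract_dinos idea, but the LLM call is replaced with a stand-in function so the sketch is self-contained; the project name and the dinosaur list are illustrative. If Weave is not installed, a no-op decorator keeps the logic runnable:

```python
import json

try:
    import weave  # enables real tracing; call weave.init("<project>") to log traces
    op = weave.op
except ImportError:
    def op(f):
        # No-op stand-in for weave.op when Weave is unavailable
        return f

@op
def extract_dinos(sentence: str) -> str:
    # Stand-in for the quickstart's OpenAI call: pretend the model
    # returned JSON listing the dinosaur names found in the sentence.
    known = ["Tyrannosaurus", "Triceratops", "Brachiosaurus"]
    found = [d for d in known if d in sentence]
    return json.dumps({"dinosaurs": found})

@op
def count_dinos(dino_data: str) -> int:
    # Additional processing logic: parse the JSON and count the entries
    return len(json.loads(dino_data)["dinosaurs"])

@op
def dino_tracker(sentence: str) -> int:
    # Higher-level op: calling traced ops inside a traced op is what lets
    # Weave record the parent-child relationship automatically
    return count_dinos(extract_dinos(sentence))

print(dino_tracker("A Tyrannosaurus chased a Triceratops."))  # prints 2
```

Because dino_tracker, extract_dinos, and count_dinos are all decorated, Weave links the two inner calls as children of the dino_tracker call in the trace tree.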
When you run this code, Weave logs a trace tree showing the nested function calls (extract_dinos
and count_dinos
), as well as the automatically-logged OpenAI trace.
Tracking metadata
You can track metadata by using the weave.attributes
context manager and passing it a dictionary of the metadata to track at call time.
Continuing our example from above:
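The original code block is missing here as well, so the following is a small sketch of the pattern. The classify function and the specific metadata keys and values (user_id, env) are illustrative, and a fallback context manager keeps the sketch runnable without Weave installed:

```python
try:
    import weave  # real tracing if Weave is available
    op, attributes = weave.op, weave.attributes
except ImportError:
    from contextlib import contextmanager
    def op(f):
        # No-op stand-ins so the sketch runs without Weave
        return f
    @contextmanager
    def attributes(attrs):
        yield

@op
def classify(text: str) -> str:
    # Stand-in for an LLM call
    return "positive" if "great" in text else "neutral"

# Every traced call made inside the block has this metadata
# dictionary attached to it in the Weave UI
with attributes({"user_id": "lukas", "env": "production"}):
    result = classify("Weave is great")

print(result)  # prints "positive"
```

Because the metadata is supplied at call time rather than baked into the function, the same op can be invoked with different user IDs or environments without any code changes.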
We recommend tracking metadata at run time, such as user IDs and your code’s environment status (development, staging, or production). To track system settings, such as a system prompt, we recommend using Weave Models.
What’s next?
- Follow the App Versioning tutorial to capture, version, and organize ad-hoc prompt, model, and application changes.