W&B Inference gives you access to leading open-source foundation models through W&B Weave and an OpenAI-compatible API.

## Documentation Index
Fetch the complete documentation index at: https://docs.wandb.ai/llms.txt
Use this file to discover all available pages before exploring further.
- Using Inference, you can build AI applications and agents without signing up for a hosting provider or self-hosting a model.
- Using Weave, you can trace, evaluate, monitor, and improve your W&B Inference-powered applications.
## Try out Inference in the UI
Navigate to https://wandb.ai/inference to explore available models and try them out in the Weave Playground. For more information on the web interface, see the UI Guide.

## Use Inference through the API
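A chat completion request can be sketched with only the Python standard library. This is a minimal illustration, not the official snippet: the base URL, the `WANDB_API_KEY` environment variable, and the model ID below are assumptions to verify against the Inference docs.

```python
import json
import os
import urllib.request

# Assumed base URL for the OpenAI-compatible endpoint -- verify against the docs.
BASE_URL = "https://api.inference.wandb.ai/v1"


def build_chat_request(model, messages, api_key):
    """Build a POST request for the OpenAI-style /chat/completions route."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


req = build_chat_request(
    model="meta-llama/Llama-3.1-8B-Instruct",  # hypothetical model ID
    messages=[{"role": "user", "content": "Say hello in one word."}],
    api_key=os.environ.get("WANDB_API_KEY", "dummy-key"),
)

# Only hit the network when a real API key is configured.
if os.environ.get("WANDB_API_KEY"):
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```

In practice you would more likely point the `openai` Python client at the same base URL with your W&B API key; the raw-HTTP form above just makes the request shape explicit.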
Using the API, you can send a chat completion request to an LLM from Python.

## Next steps
- Set up your account using the prerequisites.
- Review the available models and the usage information and limits.
- Use the service through the API or UI.
- Try out supported models in the W&B Weave Playground.
- Try the usage examples.
For information about pricing, usage limits, and credits, see Usage Information and Limits.