import AIInstrumentationApproaches from "/snippets/ai-instrumentation-approaches.mdx"

In the Observe stage of the AI engineering lifecycle, the focus is on understanding how your deployed generative AI capabilities perform in the real world. After creating and evaluating a capability, observing its production behavior is crucial for identifying unexpected issues, tracking costs, and gathering the data needed for future improvements.

## Instrument your app

<AIInstrumentationApproaches />
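Whichever instrumentation approach you choose, the views below all key on the OpenTelemetry GenAI semantic-convention attributes. As a minimal sketch (the helper function and model name are illustrative, not part of any SDK — in a real app these attributes are set on a span via your OpenTelemetry SDK), a chat span's attributes look like this:

```python
# Minimal sketch of the OpenTelemetry GenAI semantic-convention attributes
# that Axiom's AI views key on. The helper and model name are illustrative;
# in a real app you would set these on a span via the OpenTelemetry SDK.

def genai_chat_attributes(model, input_tokens, output_tokens):
    """Build the span attributes for a single chat invocation."""
    return {
        "gen_ai.operation.name": "chat",  # the field Axiom's AI views filter on
        "gen_ai.request.model": model,
        "gen_ai.usage.input_tokens": input_tokens,
        "gen_ai.usage.output_tokens": output_tokens,
    }

attrs = genai_chat_attributes("gpt-4o-mini", 120, 45)
print(attrs["gen_ai.operation.name"])  # → chat
```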

## Review production conversations

Review and Issues let domain experts inspect production conversations, capture human feedback, and group recurring failures into a tracked backlog.

- Use Review conversations to work through flagged, recent, feedback-driven, errored, and previously reviewed conversations.
- Use Track issues to consolidate repeated failures, inspect supporting evidence, and manage issue status over time.

## Visualize traces in Console

Visualizing and making sense of telemetry data is a core part of the Axiom Console experience:

- A dedicated AI traces waterfall view visualizes single and multi-step LLM workflows, with clear input/output inspection at each stage.
- A pre-built GenAI OTel dashboard automatically appears for any dataset receiving AI telemetry. It features elements for tracking cost per invocation, time-to-first-token, call counts by model, and error rates.

### Access AI traces waterfall view

  1. Click the Query tab.

  2. Create an APL query against your GenAI dataset. For example:

     ```kusto
     ['otel-demo-genai']
     | where ['attributes.gen_ai.operation.name'] == "chat"
     ```
  3. In the list of trace IDs, click the trace you want to explore.

  4. Explore how spans within the trace are related to each other in the waterfall view. To only display AI spans, click AI spans in the top left.
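The example query above can be narrowed further with the same bracketed field syntax. A sketch (the model name is illustrative; `gen_ai.request.model` comes from the OpenTelemetry GenAI semantic conventions) that restricts results to a single model:

```kusto
['otel-demo-genai']
| where ['attributes.gen_ai.operation.name'] == "chat"
| where ['attributes.gen_ai.request.model'] == "gpt-4o-mini"
```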


### Access GenAI dashboard

Axiom automatically creates the GenAI dashboard if the field `attributes.gen_ai.operation.name` is present in your data.
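To check whether a dataset qualifies, a query along these lines (using the example dataset from this page) counts the events that carry the field:

```kusto
['otel-demo-genai']
| where isnotempty(['attributes.gen_ai.operation.name'])
| count
```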

To access the GenAI dashboard:

  1. Click the Dashboards tab.
  2. Click the dashboard Generative AI Overview (DATASET_NAME) where DATASET_NAME is the name of your GenAI dataset.


The GenAI dashboard provides important insights into your GenAI app, such as:

- Vitals about requests, broken down by operation, capability, and step
- Token usage and cost analysis
- Error analysis
- Comparison of performance and reliability across AI models
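For analysis beyond the dashboard's built-in elements, you can query the same fields directly. A sketch (assuming the token-usage attributes from the OpenTelemetry GenAI semantic conventions are present in your data; adapt the dataset and field names to yours) that totals tokens per model:

```kusto
['otel-demo-genai']
| where ['attributes.gen_ai.operation.name'] == "chat"
| summarize input_tokens = sum(['attributes.gen_ai.usage.input_tokens']),
            output_tokens = sum(['attributes.gen_ai.usage.output_tokens'])
    by ['attributes.gen_ai.request.model']
```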

## What’s next?

After capturing and analyzing production telemetry, use these insights to improve your capability. Learn more in Iterate.
