LLM insights (beta)


We've teamed up with various LLM platforms to track metrics for LLM apps. This makes it easy to answer questions like:

  • What are my LLM costs by customer, model, and in total?
  • How many of my users are interacting with my LLM features?
  • Are there generation latency spikes?
  • Does interacting with LLM features correlate with other metrics (retention, usage, revenue, etc.)?
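To make cost and latency questions like these answerable, each LLM call needs to reach PostHog as an event carrying the relevant metrics. As a rough sketch (the event name, property keys, and pricing figures below are illustrative assumptions, not PostHog's official schema — the integrations above handle this for you), the per-generation properties might be assembled like this:

```python
from typing import Any

# Hypothetical sketch: "llm_generation" and these property keys are
# illustrative assumptions, not an official PostHog event schema.
def build_generation_event(
    model: str,
    prompt_tokens: int,
    completion_tokens: int,
    latency_ms: float,
    cost_per_1k_prompt: float,
    cost_per_1k_completion: float,
) -> dict[str, Any]:
    """Assemble properties for one LLM generation, ready to send with
    something like posthog.capture(distinct_id, "llm_generation", props)."""
    # Cost is derived from token counts and per-1k-token prices.
    cost = (
        prompt_tokens / 1000 * cost_per_1k_prompt
        + completion_tokens / 1000 * cost_per_1k_completion
    )
    return {
        "model": model,
        "prompt_tokens": prompt_tokens,
        "completion_tokens": completion_tokens,
        "total_tokens": prompt_tokens + completion_tokens,
        "latency_ms": latency_ms,
        "cost_usd": round(cost, 6),
    }

# Example: 500 prompt + 200 completion tokens at assumed example prices.
event = build_generation_event("gpt-4", 500, 200, 850.0, 0.03, 0.06)
```

With properties like these on every generation event, the cost, usage, and latency questions above become simple trends and breakdowns in PostHog.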

Supported integrations

Currently, we support integrations for the following platforms:

  • Langfuse
  • Helicone
  • Traceloop
  • Keywords AI

Dashboard templates

Once you've installed an integration, dashboard templates help you quickly set up relevant insights. Here are examples for Langfuse, Helicone, Traceloop, and Keywords AI.

To create your own dashboard from a template:

  1. Go to the Dashboards tab in PostHog.
  2. Click the New dashboard button in the top right.
  3. Select LLM metrics – [name of the integration you installed] from the list of templates.

Questions?

Got a question which isn't answered here? Head to the community forum to let us know!

Tutorials and guides

  • How to setup PostHog for AI
  • How to set up LLM analytics for Cohere
  • How to set up LLM analytics for Anthropic's Claude
  • How to set up LLM analytics for ChatGPT
  • How to monitor generative AI calls to AWS Bedrock
  • How to compare AWS Bedrock prompts
  • How to compare AWS Bedrock foundational models
  • Using LLMs in analytics
  • Product metrics to track for LLM apps
  • How to analyze surveys with ChatGPT