
Analytics & Trace Integrations

Modern AI workflows require full observability, traceability, and feedback. InnoSynth-Forjinn integrates with state-of-the-art analytics and agent-tracing platforms, including Arize, LangFuse, LangSmith, Phoenix, Opik, and others, to give you deep insight into how agents run, why decisions are made, and how to continuously improve and debug your automations.


Why Analytics Integrations?

  • Trace every run: Detailed logs for each agent, tool, and node—inputs, outputs, branching, and errors.
  • Monitor performance and reliability: Latency, token usage, step durations, fail/success rates, and trending.
  • Close the loop for QA: Human/AI scoring, production eval, and automated feedback/ground truth.
  • Compliance: Satisfy audit, data access, and regulatory needs for model and agent usage.

Supported Analytics Integrations

Arize

  • Widely used platform for AI/ML observability (model drift, validation, bias).
  • Automatic flow/agent traces.
  • Connect using your Arize API key and workspace name.
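
If you want to sanity-check the connection outside the platform UI, Arize's OpenTelemetry helper can be exercised directly from Python. A minimal sketch, assuming the arize-otel package is installed; the space ID, API key, and project name below are placeholders, not real credentials:

    # Minimal Arize tracing setup via OpenTelemetry (assumes `pip install arize-otel`).
    # Space ID, API key, and project name are placeholders, not real credentials.
    from arize.otel import register

    tracer_provider = register(
        space_id="YOUR_ARIZE_SPACE_ID",       # from your Arize workspace settings
        api_key="YOUR_ARIZE_API_KEY",
        project_name="forjinn-agent-traces",  # hypothetical project name
    )

    # Anything instrumented with this tracer now exports spans to Arize.
    tracer = tracer_provider.get_tracer(__name__)
    with tracer.start_as_current_span("agent-run"):
        pass  # agent/tool logic would run here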

LangFuse

  • Designed for LLM tracing, prompt chaining, and runtime metadata insights.
  • Visualizes token usage, input/output diffs, and retriever/RAG traces.
  • Add it in Settings or as a flow node; once enabled, all agent runs export traces.
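
For reference, the same connection can be tested directly with the Langfuse Python SDK (v2-style API); the keys and host below are placeholders:

    # Minimal Langfuse connection test (assumes `pip install langfuse`, v2-style API).
    from langfuse import Langfuse

    langfuse = Langfuse(
        public_key="pk-lf-...",             # placeholder public key
        secret_key="sk-lf-...",             # placeholder secret key
        host="https://cloud.langfuse.com",  # or your self-hosted URL
    )

    # Create a manual trace to confirm data arrives in the dashboard.
    trace = langfuse.trace(name="connection-test", input={"ping": True})
    trace.update(output={"pong": True})
    langfuse.flush()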

LangSmith

  • Focused on collaborative agent debugging & chain-of-thought visualization.
  • Real-time profiler for flows and chatbots.
  • Integrates with prompt engineering/experimentation tools.
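
LangSmith tracing can also be enabled from code. A minimal sketch, assuming the langsmith Python package; the API key and project name are placeholders:

    # Minimal LangSmith tracing sketch (assumes `pip install langsmith`).
    import os
    from langsmith import traceable

    os.environ["LANGCHAIN_TRACING_V2"] = "true"
    os.environ["LANGCHAIN_API_KEY"] = "YOUR_LANGSMITH_API_KEY"  # placeholder
    os.environ["LANGCHAIN_PROJECT"] = "forjinn-debugging"       # hypothetical project

    @traceable(run_type="chain")
    def answer(question: str) -> str:
        # Each call is recorded as a run in the project configured above.
        return f"echo: {question}"

    answer("What does this flow do?")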

Opik, Phoenix, Luna, LangWatch

  • Additional supported trace/logging platforms.
  • Allow cloud/on-prem integration, S3 export, or custom relay endpoints.
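
The custom relay option amounts to forwarding trace records to an HTTP endpoint you control. A purely illustrative sketch; the endpoint URL and payload fields are hypothetical, not a documented Forjinn schema:

    # Hypothetical custom relay: forward one trace record to your own endpoint.
    import requests

    trace_record = {
        "run_id": "run-123",                # illustrative fields only
        "node": "retriever",
        "input": {"query": "latest invoices"},
        "output": {"documents": 4},
        "latency_ms": 182,
        "error": None,
    }

    response = requests.post(
        "https://relay.example.com/traces",  # your relay or ingest endpoint
        json=trace_record,
        timeout=10,
    )
    response.raise_for_status()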

How to Configure Analytics

  • Add integration keys under Settings → Analytics/Monitoring in the platform.
  • Or set up a dedicated Analytics node in the flow, before or after your agents/tools.
  • Use tags or workspace-level configs to distinguish trace origins.
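
Conceptually, a workspace-level analytics config ties credentials and tags to an origin. The structure below is purely illustrative and not the platform's actual settings schema:

    # Purely illustrative workspace-level analytics config (not Forjinn's real schema).
    analytics_config = {
        "provider": "langfuse",
        "credentials": {
            "public_key": "pk-lf-...",  # placeholder
            "secret_key": "sk-lf-...",  # placeholder
        },
        "tags": ["production", "workspace-finance"],       # distinguish trace origins
        "redact_fields": ["customer_email", "api_token"],  # see Best Practices below
    }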

Example: Enabling LangFuse for RAG QA

  1. Go to Settings → Analytics and paste your LangFuse API key/ID (the public/secret key pair).
  2. Every agent/tool run is now exported to LangFuse; inspect traces via the built-in dashboard.
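
From the code side, the same export can be reproduced and scored with the Langfuse SDK's observe decorator (v2-style API). A sketch, with a stubbed RAG function and an illustrative score:

    # Illustrative RAG QA tracing and scoring with the Langfuse v2-style SDK.
    # Credentials are read from LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST.
    from langfuse.decorators import observe, langfuse_context

    @observe()
    def rag_qa(question: str) -> str:
        # Retrieval and generation would happen here; inputs/outputs become a trace.
        answer = "stubbed answer"
        # Attach a QA score (e.g. from a human reviewer or an LLM judge) to this trace.
        langfuse_context.score_current_trace(name="answer-quality", value=0.9)
        return answer

    rag_qa("What is our refund policy?")
    langfuse_context.flush()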

Best Practices

  • Use a single integration per environment/workspace for clarity.
  • Always redact PII/sensitive info before export (choose which fields to send; see the redaction sketch after this list).
  • Review traces regularly for drift, error spikes, or problems with prompts, agents, or models.
  • Use analytics to A/B test prompt/agent changes; iterate with data.
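
A redaction pass can be as simple as masking known-sensitive fields before a record reaches any exporter. A minimal sketch; the field names are illustrative:

    # Minimal PII redaction before export; the sensitive field names are illustrative.
    SENSITIVE_FIELDS = {"customer_email", "phone", "api_token"}

    def redact(record: dict) -> dict:
        """Return a copy of a trace record with sensitive fields masked."""
        return {
            key: "[REDACTED]" if key in SENSITIVE_FIELDS else value
            for key, value in record.items()
        }

    safe_record = redact({"customer_email": "a@b.com", "query": "refund status"})
    # -> {'customer_email': '[REDACTED]', 'query': 'refund status'}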

Troubleshooting

  • Traces not showing in dashboard: Check the API key/config and confirm network reachability (see the connectivity check after this list).
  • Data retention too short: Adjust platform or integration-side policy.
  • Invalid logs: Validate the trace format; the platform updates its integrations as provider APIs evolve.
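
For the first case, a quick credential check from a script often isolates the problem. A sketch with the Langfuse v2-style client, reading keys from environment variables:

    # Quick connectivity/credential check for a Langfuse integration (v2-style SDK).
    from langfuse import Langfuse

    langfuse = Langfuse()  # reads LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST

    if langfuse.auth_check():
        print("Langfuse is reachable and the credentials are valid.")
    else:
        print("Check the API keys, host URL, and network/firewall rules.")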

Analytics & Tracing are your window into real AI operations; use them to deliver reliable, explainable, and continuously improving automation.