
Feature Comparison Matrix

A comprehensive comparison of what you get with each framework and integration approach. The columns progress from standalone frameworks on the left to Waxell Native on the right.

Legend

| Symbol | Meaning |
| --- | --- |
| Yes | Fully supported out of the box |
| Partial | Partially supported or requires significant custom work |
| No | Not available |

Core Agent Capabilities

| Feature | LangChain | LangChain + Observe | CrewAI | CrewAI + Observe | Custom Python | Custom + Observe | Waxell Native |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Declarative agent definitions | No | No | Partial | Partial | No | No | Yes |
| Tool abstraction | Yes | Yes | Yes | Yes | Manual | Manual | Yes |
| Workflow orchestration | Partial | Partial | Yes | Yes | Manual | Manual | Yes |
| Multi-agent coordination | Partial | Partial | Yes | Yes | Manual | Manual | Yes |
| LLM model routing | Manual | Manual | Manual | Manual | Manual | Manual | Yes |
Notes on partial support
  • LangChain declarative: LangChain Expression Language (LCEL) provides composable chains, but agent definitions remain imperative.
  • CrewAI declarative: Agent/Task/Crew classes are semi-declarative, but orchestration logic is still code-level.
  • LangChain workflow orchestration: LangGraph adds graph-based workflows, but lacks durability and governance.

Observability

| Feature | LangChain | LangChain + Observe | CrewAI | CrewAI + Observe | Custom Python | Custom + Observe | Waxell Native |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Execution run tracking | No | Yes | No | Yes | No | Yes | Yes |
| LLM call tracking | No | Yes | No | Yes | No | Yes | Yes |
| Automatic token counting | No | Yes | No | Partial | No | Partial | Yes |
| Step-by-step execution trail | No | Yes | No | Yes | No | Yes | Yes |
| Input/output capture | No | Yes | No | Yes | No | Yes | Yes |
| Dashboard UI | No | Yes | No | Yes | No | Yes | Yes |
Notes on automatic token counting
  • LangChain + Observe: The WaxellLangChainHandler callback extracts token usage from LangChain's LLMResult automatically.
  • CrewAI + Observe / Custom + Observe: Token counts must be passed to ctx.record_llm_call() manually or extracted from your LLM client's response.
  • Waxell Native: The LLM router tracks all token usage automatically.
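As a rough sketch of the manual path described in the notes above, token counts can be pulled from an OpenAI-style `usage` object and forwarded to the run context. The `RunContext` class here is a stand-in invented for illustration; the real `ctx.record_llm_call()` signature may differ.

```python
# Illustrative only: RunContext is a stub standing in for the Waxell
# Observe context object; the real ctx.record_llm_call() may differ.
class RunContext:
    def __init__(self):
        self.llm_calls = []

    def record_llm_call(self, model, prompt_tokens, completion_tokens):
        self.llm_calls.append({
            "model": model,
            "prompt_tokens": prompt_tokens,
            "completion_tokens": completion_tokens,
        })


def record_from_response(ctx, model, response):
    """Extract token usage from an OpenAI-style response dict and record it."""
    usage = response.get("usage", {})
    ctx.record_llm_call(
        model=model,
        prompt_tokens=usage.get("prompt_tokens", 0),
        completion_tokens=usage.get("completion_tokens", 0),
    )


ctx = RunContext()
record_from_response(
    ctx, "gpt-4o",
    {"usage": {"prompt_tokens": 120, "completion_tokens": 45}},
)
```

The same extraction step is what the `WaxellLangChainHandler` callback automates for LangChain by reading `LLMResult`; with CrewAI or a custom client, you own this step.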

Cost Management

| Feature | LangChain | LangChain + Observe | CrewAI | CrewAI + Observe | Custom Python | Custom + Observe | Waxell Native |
| --- | --- | --- | --- | --- | --- | --- | --- |
| LLM cost estimation | No | Yes | No | Yes | No | Yes | Yes |
| Per-model pricing (20+ models) | No | Yes | No | Yes | No | Yes | Yes |
| Tenant-level cost overrides | No | Yes | No | Yes | No | Yes | Yes |
| Budget enforcement | No | Yes | No | Yes | No | Yes | Yes |
| Cost-per-run visibility | No | Yes | No | Yes | No | Yes | Yes |
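Conceptually, cost-per-run visibility reduces to per-model rates applied to recorded token counts and summed over a run's LLM calls. The sketch below shows that arithmetic only; the rates and model names are placeholders, not Waxell's actual pricing table.

```python
# Placeholder per-1M-token USD rates, for illustration only; the real
# pricing table covers 20+ models and supports tenant-level overrides.
PRICING = {
    "model-a": {"input": 2.50, "output": 10.00},
    "model-b": {"input": 0.15, "output": 0.60},
}


def estimate_cost(model, prompt_tokens, completion_tokens):
    """Estimate the USD cost of one LLM call from recorded token counts."""
    rates = PRICING[model]
    return (prompt_tokens * rates["input"]
            + completion_tokens * rates["output"]) / 1_000_000


# A run's cost is the sum over all LLM calls recorded for that run.
calls = [
    {"model": "model-a", "prompt": 1200, "completion": 400},
    {"model": "model-b", "prompt": 800, "completion": 300},
]
run_cost = sum(
    estimate_cost(c["model"], c["prompt"], c["completion"]) for c in calls
)
```

Budget enforcement then becomes a pre-execution comparison of accumulated run cost against the tenant's configured limit.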

Governance and Policy

| Feature | LangChain | LangChain + Observe | CrewAI | CrewAI + Observe | Custom Python | Custom + Observe | Waxell Native |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Pre-execution policy checks | No | Yes | No | Yes | No | Yes | Yes |
| Budget limit policies | No | Yes | No | Yes | No | Yes | Yes |
| Rate limiting / throttling | No | Yes | No | Yes | No | Yes | Yes |
| Content filtering | No | Partial | No | Partial | No | Partial | Yes |
| Approval workflows | No | No | No | No | No | No | Yes |
| Dynamic policy management | No | No | No | No | No | No | Yes |
| Full governance lifecycle | No | No | No | No | No | No | Yes |
Notes on content filtering
  • +Observe: Policy checks can block execution based on agent name, workflow, or budget. Content-level filtering (inspecting prompts/responses) requires custom policy rules in the control plane.
  • Waxell Native: The DynamicPolicyManager evaluates policies at every governance hook point, including pre-execution, mid-workflow, and post-completion.
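To make the distinction concrete, a content-level rule inspects the prompt itself rather than metadata like agent name or budget. The sketch below shows the shape of such a check; the function name, decision format, and blocked-term list are all invented for illustration and do not reflect the control plane's actual policy-rule interface.

```python
# Illustrative sketch of a pre-execution content rule; the control
# plane's real policy-rule interface may differ.
BLOCKED_TERMS = {"ssn", "credit card number"}


def content_filter_policy(prompt: str) -> dict:
    """Return an allow/deny decision for a prompt before execution."""
    lowered = prompt.lower()
    hits = sorted(t for t in BLOCKED_TERMS if t in lowered)
    if hits:
        return {"allowed": False, "reason": f"blocked terms: {hits}"}
    return {"allowed": True, "reason": None}


decision = content_filter_policy("Summarize this support ticket")
```

In the +Observe setups this kind of rule is custom work you write and register yourself, which is why content filtering is marked Partial there; Waxell Native evaluates such policies at every governance hook point.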

Durability and Infrastructure

| Feature | LangChain | LangChain + Observe | CrewAI | CrewAI + Observe | Custom Python | Custom + Observe | Waxell Native |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Durable workflows | No | No | No | No | No | No | Yes |
| Checkpoint / resume | No | No | No | No | No | No | Yes |
| Pause / resume (human-in-the-loop) | No | No | No | No | No | No | Yes |
| Multi-tenancy | No | Partial | No | Partial | No | Partial | Yes |
| Signal-driven execution (webhooks) | No | No | No | No | No | No | Yes |
| Production backends (Redis, Celery) | No | No | No | No | No | No | Yes |
| Audit trail with agent_trace | No | No | No | No | No | No | Yes |
| Generation layer (RAG, prompt versioning) | No | No | No | No | No | No | Yes |
Notes on multi-tenancy
  • +Observe: Runs are scoped to a tenant via the control plane API key. Data isolation is at the API level.
  • Waxell Native: Full tenant isolation at the database level, with per-tenant policies, model configurations, and billing.
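The API-level isolation described for the +Observe setups can be pictured as a lookup from API key to tenant, with every run record stamped with that tenant and all reads filtered by it. Everything in this sketch (key format, storage, function names) is invented for illustration; it is not the control plane's actual implementation.

```python
# Sketch of API-level tenant scoping; key names and storage are
# invented for illustration, not Waxell's real data model.
API_KEYS = {
    "wx_key_acme": "tenant-acme",
    "wx_key_globex": "tenant-globex",
}

RUNS = []


def start_run(api_key: str, agent: str) -> dict:
    """Create a run record scoped to the tenant that owns the API key."""
    tenant = API_KEYS.get(api_key)
    if tenant is None:
        raise PermissionError("unknown API key")
    run = {"tenant": tenant, "agent": agent}
    RUNS.append(run)
    return run


def runs_for_tenant(tenant: str) -> list:
    """Reads are always filtered by tenant, so tenants see only their runs."""
    return [r for r in RUNS if r["tenant"] == tenant]


start_run("wx_key_acme", "support-bot")
start_run("wx_key_globex", "billing-bot")
```

Database-level isolation in Waxell Native goes further than this: per-tenant policies, model configurations, and billing live alongside the run data rather than being enforced only at the API boundary.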

Summary

The progression from left to right represents increasing levels of governance and infrastructure:

  1. Standalone frameworks (LangChain, CrewAI, custom) give you agent capabilities but no governance.
  2. + Waxell Observe adds observability, cost tracking, and basic policy enforcement with minimal code changes.
  3. Waxell Native provides the full stack: declarative definitions, durable workflows, signal-driven execution, full governance lifecycle, and production infrastructure.

Each level delivers standalone value. See the Progressive Migration guide for how to move between them at your own pace.