Integration

LogiRoot is provider-agnostic. You wrap each AI tool call with a governance evaluation before execution and record the decision. The sidecar is the only thing your application talks to; LogiRoot does not intercept your LLM provider calls directly.

Pre-call pattern

# Pseudocode — applies to any LLM provider you use today
def call_tool(tool_name, args, context):
    decision = logiroot.evaluate(
        tool_name=tool_name,
        args=args,
        context=context,
    )

    if decision.decision == "REJECT":
        raise GovernanceRejected(decision.policy_ids, decision.receipt_id)
    if decision.decision == "ESCALATE":
        return await_human_approval(decision.receipt_id)

    # APPROVE: proceed with the original tool call
    return execute_tool(tool_name, args)
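The ESCALATE branch blocks until a human resolves the receipt. A minimal sketch of that helper, assuming the sidecar exposes some way to read a receipt's status (the `fetch_status` callable and the status strings are assumptions, not the official API):

```python
import time

def await_human_approval(receipt_id, fetch_status, timeout_s=300, poll_s=5):
    """Poll a receipt until a human approves or rejects it.

    fetch_status is injected here to keep the sketch self-contained; in
    practice it would query the LogiRoot sidecar's receipt endpoint
    (endpoint name assumed — see the API Reference).
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = fetch_status(receipt_id)
        if status == "APPROVED":
            return True
        if status == "REJECTED":
            return False
        time.sleep(poll_s)  # still pending; back off before re-checking
    raise TimeoutError(f"approval for {receipt_id} not resolved in {timeout_s}s")
```

Long-running approvals may be better served by a webhook than by polling; the loop above is just the simplest shape.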

Compatible providers

LogiRoot governs tool-call decisions; it works regardless of which model produced them. Customers commonly use it alongside major LLM providers and self-hosted models.

Failure mode

LogiRoot fails closed by default. If the governance endpoint is unreachable, the SDK raises rather than silently approving. You can opt into a fail-open mode for non-critical paths via SDK configuration.
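The fail-closed/fail-open distinction can be sketched as a thin wrapper (the flag name and exception type here are illustrative assumptions; check the SDK configuration reference for the real option):

```python
class GovernanceUnavailable(Exception):
    """Raised when the governance endpoint cannot be reached (fail-closed)."""

def evaluate_with_policy(evaluate, request, fail_open=False):
    """Call the governance endpoint; on outage, raise unless fail_open is set."""
    try:
        return evaluate(request)
    except ConnectionError as exc:
        if fail_open:
            # Non-critical path: proceed without a recorded decision.
            return "APPROVE"
        raise GovernanceUnavailable("governance endpoint unreachable") from exc
```

Note that a fail-open call executes with no receipt, so audit trails will have a gap; that is the trade-off being opted into.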

SDK

Official per-language SDK quickstarts are in progress. Until then, the API is small enough that a thin HTTP client works fine. See the API Reference.
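As an illustration, such a client can be a few lines of stdlib Python. The sidecar address, endpoint path, and payload field names below are assumptions for the sketch; the API Reference is authoritative:

```python
import json
import urllib.request

LOGIROOT_URL = "http://localhost:8080"  # sidecar address (assumed default)

def build_evaluate_request(tool_name, args, context):
    """Assemble the JSON body for a governance evaluation (field names assumed)."""
    return {"tool_name": tool_name, "args": args, "context": context}

def evaluate(tool_name, args, context, opener=urllib.request.urlopen):
    """POST an evaluation to the sidecar and return the decoded decision."""
    body = json.dumps(build_evaluate_request(tool_name, args, context)).encode()
    req = urllib.request.Request(
        f"{LOGIROOT_URL}/v1/evaluate",  # path is an assumption
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with opener(req) as resp:
        return json.load(resp)
```

Because the application only ever talks to the local sidecar, no provider credentials or SDKs are involved in this client.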