# 11.6 Agena API Integration
Kor.ai uses the Agena.ai Cloud API to perform probabilistic inference on Bayesian Network models for market abuse detection. This document describes how we authenticate, structure payloads, interpret responses, and integrate inference into the alerting pipeline.
```
POST https://api.agena.ai/public/v1/calculate
```
### Token Request
```
POST https://auth.agena.ai/realms/cloud/protocol/openid-connect/token
```
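A minimal sketch of the token request in Python, assuming a standard OAuth2 client-credentials grant against the Keycloak-style endpoint above. The `AGENA_CLIENT_ID` / `AGENA_CLIENT_SECRET` environment variable names are illustrative, not part of the documented configuration:

```python
import os
import requests

TOKEN_URL = "https://auth.agena.ai/realms/cloud/protocol/openid-connect/token"

def fetch_access_token() -> str:
    """Request an OAuth2 access token from the Agena auth server.

    Assumes a client-credentials grant; credential variable names are
    illustrative only.
    """
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": os.environ["AGENA_CLIENT_ID"],
            "client_secret": os.environ["AGENA_CLIENT_SECRET"],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```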
### HTTP Headers
```
Authorization: Bearer <access_token>
Content-Type: application/json
```
### Management
- Tokens are securely stored in environment variables
- Backend rotates tokens on expiry
- Only scoped secrets are permitted in execution environments
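One possible shape for the rotate-on-expiry behaviour, reusing `fetch_access_token` from the sketch above. The hard-coded TTL is an assumption; a real implementation would be driven by the `expires_in` field of the token response:

```python
import time

_token = None
_token_fetched_at = 0.0
# Assumed token lifetime; in practice `expires_in` from the token
# response should drive the refresh, not a hard-coded value.
_ASSUMED_TTL_S = 240

def get_token() -> str:
    """Return a cached access token, refetching it once the assumed TTL lapses."""
    global _token, _token_fetched_at
    if _token is None or time.time() - _token_fetched_at > _ASSUMED_TTL_S:
        _token = fetch_access_token()  # defined in the sketch above
        _token_fetched_at = time.time()
    return _token
```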
Example request payload for the `POST /calculate` call:

```json
{
  "model": "InsiderDealingModel",
  "dataSet": {
    "Q1": true,
    "Q3": "High",
    "Q5": "Positive"
  },
  "sync-wait": true
}
```
| Field | Description |
|---|---|
| `model` | Name of the Bayesian model hosted in Agena |
| `dataSet` | Map of node IDs to observed evidence |
| `sync-wait` | When `true`, returns the full result in the same call |
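A minimal sketch of the synchronous calculation call, combining the headers and payload documented above. `run_calculation` is an illustrative name, `get_token` refers to the caching sketch earlier, and the shape of the response body is not covered here:

```python
API_URL = "https://api.agena.ai/public/v1/calculate"

def run_calculation(model: str, evidence: dict) -> dict:
    """POST a dataSet of observed evidence to the Agena calculate endpoint."""
    payload = {
        "model": model,
        "dataSet": evidence,
        "sync-wait": True,  # return the full result in the same call
    }
    resp = requests.post(
        API_URL,
        json=payload,
        headers={
            "Authorization": f"Bearer {get_token()}",
            "Content-Type": "application/json",
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

# Example usage mirroring the payload above:
# result = run_calculation(
#     "InsiderDealingModel",
#     {"Q1": True, "Q3": "High", "Q5": "Positive"},
# )
```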
Retry and fallback behaviour:

- Max retries: 3
- Retry delay: exponential backoff
- Fallback trigger:
  - API timeout
  - 5xx or 401 response
- Fallback outcome:
  - Alert marked as `"scoreStatus": "API_Failure"`
  - Notification logged for reprocessing
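A sketch of one way the retry and fallback policy above could look in code, reusing `API_URL` and `get_token` from the earlier sketches. The specific timeout and backoff values are assumptions; only the retry count, the retryable conditions, and the `API_Failure` outcome come from the policy itself:

```python
import logging
import time

logger = logging.getLogger(__name__)

MAX_RETRIES = 3

def score_alert(model: str, evidence: dict) -> dict:
    """Call the Agena API with retries; mark the alert on persistent failure."""
    delay = 1.0
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            resp = requests.post(
                API_URL,
                json={"model": model, "dataSet": evidence, "sync-wait": True},
                headers={"Authorization": f"Bearer {get_token()}"},
                timeout=30,
            )
            # Treat 5xx and 401 responses as retryable, per the policy above.
            if resp.status_code >= 500 or resp.status_code == 401:
                raise requests.HTTPError(f"retryable status {resp.status_code}")
            resp.raise_for_status()
            return resp.json()
        except (requests.Timeout, requests.ConnectionError, requests.HTTPError) as exc:
            logger.warning("Agena call failed (attempt %d/%d): %s", attempt, MAX_RETRIES, exc)
            if attempt == MAX_RETRIES:
                break
            time.sleep(delay)
            delay *= 2  # exponential backoff

    # Fallback outcome: mark the alert and log it for reprocessing.
    logger.error("Agena API unavailable; marking alert for reprocessing")
    return {"scoreStatus": "API_Failure"}
```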
Planned enhancements:

- Support for Agena's asynchronous batch mode (`async`)
- Token auto-refresh handling via background job
- Integration with local pgmpy engine as fallback or override
- Per-alert cost tracking for API budget control
- Unified format for all responses to support case management export