Everything talks to QuerIA
Cloud storage, identity providers, certified knowledge bases, LLMs, custom REST tools. Native integrations, no middleware.
Cloud Storage & Documents
Document sync from your enterprise clouds, with scheduling and incremental change detection.
Microsoft SharePoint
Folders, libraries, scheduled sync
OneDrive for Business
User and team folders
Google Drive
User OAuth, Shared Drives and My Drive
Collabora Online
Built-in viewer + document conversion
Custom webhooks
POST endpoint for scheduled ingestion
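The webhook card above describes pushing documents on a schedule. A minimal sketch of building the JSON body for such a POST follows; the field names (`tenant_id`, `documents`, `pushed_at`) and the payload shape are illustrative assumptions, not QuerIA's actual webhook schema.

```python
import json
from datetime import datetime, timezone

def build_ingestion_payload(tenant_id, documents):
    """Build a JSON body for a scheduled-ingestion webhook POST.

    All field names here are hypothetical; check the webhook
    configuration in your tenant for the real schema.
    """
    return {
        "tenant_id": tenant_id,
        "pushed_at": datetime.now(timezone.utc).isoformat(),
        "documents": [
            {"id": d["id"], "title": d["title"], "content": d["content"]}
            for d in documents
        ],
    }

payload = build_ingestion_payload(
    "acme-corp",
    [{"id": "doc-1", "title": "Policy", "content": "..."}],
)
body = json.dumps(payload)  # POST this string to the configured endpoint
```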
Identity & SSO
Enterprise login with your existing providers. Multi-tenant by default.
Microsoft Entra (Azure AD)
SSO via OAuth 2.0 / OIDC
Google Workspace
OAuth with corporate domain
Email + 2FA (TOTP)
For environments without central IdP
Per-tenant API keys
For server-to-server integrations
External knowledge bases (federated RAG)
Specialized microservices feeding retrieval with certified sources.
Normattiva.it (Legal Sources)
143k Italian legal documents + Neo4j GraphRAG
Open Food Facts
Food product database, Nutri-Score, NOVA, Eco-Score
ECHA / REACH
Chemical substances, SDS, CLP, SVHC
FDA / ClinicalTrials.gov / PubMed
Drugs, clinical trials, adverse events, publications
Agenzia delle Entrate
Italian tax: IRPEF, VAT, IRES, IMU, filings
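The federation idea behind this section (several specialized microservices each feeding retrieval) can be sketched generically: query every registered source, then merge results by score. Everything below, including the source names and the scoring scheme, is a toy illustration of the pattern, not QuerIA's internal retrieval code.

```python
from typing import Callable

# Each source returns (passage, score) pairs; a simple federation
# layer queries every registered source and merges by score.
Source = Callable[[str], list[tuple[str, float]]]

def federated_retrieve(query: str, sources: dict[str, Source], k: int = 5):
    hits = []
    for name, search in sources.items():
        for text, score in search(query):
            hits.append({"source": name, "text": text, "score": score})
    hits.sort(key=lambda h: h["score"], reverse=True)
    return hits[:k]

# Toy stand-ins for the real microservices
sources = {
    "normattiva": lambda q: [("Art. 2043 c.c. ...", 0.91)],
    "agenzia_entrate": lambda q: [("IRPEF brackets ...", 0.84)],
}
top = federated_retrieve("civil liability", sources, k=2)
```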
External Tools (OpenAPI federation)
Plug your existing REST services as tools the agents can call.
OpenAPI / Swagger import
Automatic operation discovery from your OpenAPI URL
Manual REST
Manually configured endpoints: GET/POST/PUT/DELETE/PATCH
Flexible auth
NONE / API_KEY / BEARER, encrypted per-tenant secrets
Cache + timeout
TTL and timeout configurable per operation
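Putting the cards above together, a registered external tool combines an OpenAPI source, an auth mode, and per-operation cache/timeout settings. The dictionary below is a hypothetical registration shape assembled from those options; none of the field names are confirmed by the product.

```python
# Illustrative tool registration; field names are assumptions based on
# the options listed above (auth mode, per-operation cache TTL, timeout).
external_tool = {
    "name": "crm-lookup",
    "openapi_url": "https://crm.example.com/openapi.json",  # auto-discovery
    "auth": {
        "mode": "BEARER",  # one of NONE / API_KEY / BEARER
        "secret_ref": "tenant-vault/crm-token",  # encrypted per-tenant secret
    },
    "operations": {
        "getCustomer": {"cache_ttl_s": 300, "timeout_s": 10},
    },
}
```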
AI Providers (BYOK)
Multi-provider LLM Vault: bring your own keys, tenant by tenant. Encrypted at rest.
OpenAI
GPT-5, GPT-4.1, Embeddings
Anthropic Claude
Claude Opus / Sonnet / Haiku
Google Gemini
Gemini Pro / Flash
AWS Bedrock / SageMaker
Managed inference in your AWS VPC
Self-hosted vLLM
Kubernetes pods with GPU (any cluster)
Ollama Cloud
Fallback / playground
Groq
Low-latency Llama 3 inference
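The BYOK model means each tenant's own provider keys are looked up at request time. A minimal sketch of that routing follows, with a plain dict standing in for the encrypted-at-rest vault; the structure and key names are illustrative assumptions.

```python
# Minimal sketch of per-tenant BYOK routing. In the real vault the
# secrets are encrypted at rest; a plain dict stands in here.
VAULT = {
    "acme-corp": {"openai": "sk-acme-...", "anthropic": "sk-ant-acme-..."},
    "globex": {"groq": "gsk-globex-..."},
}

def resolve_key(tenant: str, provider: str) -> str:
    """Return the tenant's own key for a provider, or fail loudly."""
    keys = VAULT.get(tenant, {})
    if provider not in keys:
        raise KeyError(f"tenant {tenant!r} has no key for {provider!r}")
    return keys[provider]

key = resolve_key("acme-corp", "anthropic")
```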
Developers
Everything is an endpoint. Everything is composable.
Public REST API
Streaming chat, upload, retrieval (API key auth)
Embeddable widget
JavaScript snippet to embed chat into any site
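An embed of this kind typically looks like a single script tag. The snippet below is only illustrative: the script URL and the `data-*` attribute names are placeholders, not the real widget API.

```html
<!-- Illustrative embed snippet; script URL and data attributes
     are placeholders, not the actual widget API. -->
<script
  src="https://cdn.example.com/queria-widget.js"
  data-tenant="acme-corp"
  data-agent="support-bot"
  async></script>
```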
Webhook in/out
Triggers for ingestion from external sources
Canvas DSL JSON
Import/export pipelines as versionable JSON
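A versionable pipeline export would be a JSON document you can diff and commit. The shape below is a hypothetical example of what such a DSL file might look like; the actual Canvas DSL schema, node types, and field names are not shown here.

```json
{
  "version": "1.0",
  "pipeline": "support-triage",
  "nodes": [
    {"id": "ingest", "type": "webhook_in"},
    {"id": "retrieve", "type": "rag", "sources": ["sharepoint"]},
    {"id": "answer", "type": "llm", "provider": "openai"}
  ],
  "edges": [["ingest", "retrieve"], ["retrieve", "answer"]]
}
```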
Missing an integration you need?
The External Tools framework lets you wire up any REST API in minutes.