Product

Auri — the edge‑native Home Brain

Auri is Estincelle’s local multi‑agent platform that runs on in‑home hardware, learns household routines, and orchestrates devices across brands with strict latency and reliability guarantees.

What Auri does

Auri runs as a persistent service inside the home — on a hub, router, or mini‑PC. It maintains a live understanding of the home’s state and coordinates devices, scenes, and services on behalf of the user.

Instead of dozens of brittle automations, you get one Home Brain that reasons about routines, goals, and context, and uses multiple agents to execute the right actions at the right time.

Architecture at a glance

  1. AI layer — multi‑agent brain
    Strategy agents learn long‑term patterns; planner agents turn goals into plans; reflex agents handle sub‑second safety and device actions.
  2. World model & orchestration
    A shared model of rooms, devices, occupants, and routines, with a message bus and event log for learning and analytics (a minimal sketch follows this list).
  3. Physical & ecosystem layer
    Connectors for existing hubs and protocols (Wi‑Fi, Zigbee, Matter, vendor APIs). Auri can run alongside current stacks or be embedded within them.
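
To make the middle layer concrete, here is a minimal Python sketch of how a shared world model might sit on a message bus with an append‑only event log. All names here (DeviceEvent, EventBus, WorldModel) are illustrative assumptions, not Auri’s actual interfaces.

    from dataclasses import dataclass, field
    from typing import Callable
    import time

    # Hypothetical event type: every device or sensor change flows through
    # the bus and lands in an append-only log that learning agents can replay.
    @dataclass
    class DeviceEvent:
        device_id: str
        attribute: str              # e.g. "power", "temperature"
        value: object
        ts: float = field(default_factory=time.time)

    class EventBus:
        """Minimal publish/subscribe bus with an event log."""
        def __init__(self) -> None:
            self.handlers: list[Callable[[DeviceEvent], None]] = []
            self.log: list[DeviceEvent] = []

        def subscribe(self, handler: Callable[[DeviceEvent], None]) -> None:
            self.handlers.append(handler)

        def publish(self, event: DeviceEvent) -> None:
            self.log.append(event)          # event log for learning and analytics
            for handler in self.handlers:
                handler(event)

    class WorldModel:
        """Shared state: last known value of every (device, attribute) pair."""
        def __init__(self, bus: EventBus) -> None:
            self.state: dict[tuple[str, str], object] = {}
            bus.subscribe(self.apply)

        def apply(self, event: DeviceEvent) -> None:
            self.state[(event.device_id, event.attribute)] = event.value

    bus = EventBus()
    world = WorldModel(bus)
    bus.publish(DeviceEvent("living_room_lamp", "power", "on"))
    print(world.state)  # {('living_room_lamp', 'power'): 'on'}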

Designed for real constraints

[Architecture diagram]
  • Deterministic local behavior under WAN/cloud outages
  • Predictable latency for wake words and safety
  • Data stays in the home by default
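
One way to read the first two bullets: the local, deterministic path always runs, and anything that crosses the WAN is an optional enrichment behind a hard timeout, so it can never stall an action. A hedged Python sketch, with all handler names and timeouts invented for illustration:

    import socket

    # Illustrative only: local handlers are deterministic and always
    # available. Nothing here is Auri's real API.
    LOCAL_HANDLERS = {
        "lights_off": lambda: "all lights off",
        "night_mode": lambda: "doors locked, thermostat lowered",
    }

    def cloud_reachable(host: str = "1.1.1.1", port: int = 53,
                        timeout_s: float = 0.25) -> bool:
        """Cheap reachability probe with a hard timeout, so the probe
        itself cannot blow the latency budget."""
        try:
            with socket.create_connection((host, port), timeout=timeout_s):
                return True
        except OSError:
            return False

    def handle_intent(intent: str) -> str:
        result = LOCAL_HANDLERS[intent]()   # deterministic local path, runs first
        if cloud_reachable():
            pass                            # optionally enrich (weather, prices, ...)
        return result

    print(handle_intent("night_mode"))      # identical with or without WAN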

Built as a multi‑agent system

Auri is not a monolithic model. It’s a system of specialized agents with clear responsibilities, shared memory, and measurable SLOs (a sketch follows the list below).

  • Reflex agents — Fast, deterministic responses for safety and low‑level control
  • Planner agents — Multi‑step planning over seconds with schedules, preferences, and constraints
  • Strategy agents — Longer‑term adaptation across seasons, usage, and energy patterns
  • Tools & skills — Calendar, weather, energy prices, OEM APIs through a controlled interface

Auri — the voice of your home

Auri has an embedded conversational agent. It turns speech, text, and app interactions into structured intents that agents can act on, running as locally as possible (wake‑word, ASR, speaker‑aware behavior). A sketch of such an intent follows the list below.

  • Natural language routines (“start my morning”, “I’m going to bed”)
  • Speaker‑aware behavior (who is asking)
  • Persistent goals and preferences stored in the world model
  • Works across voice, app, and physical controls — not voice‑only
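
For illustration, a structured intent produced from any of those channels might look like the following Python sketch. The field names are assumptions, not Auri’s actual schema.

    from dataclasses import dataclass, field
    from typing import Optional
    import json, time

    # Hypothetical shape of a structured intent parsed from speech, text,
    # or app input. Agents consume these rather than raw utterances.
    @dataclass
    class Intent:
        name: str                      # e.g. "routine.start"
        slots: dict                    # parsed arguments, e.g. {"routine": "morning"}
        speaker: Optional[str] = None  # filled in when speaker ID is available
        source: str = "voice"          # "voice" | "app" | "physical"
        ts: float = field(default_factory=time.time)

    # "Start my morning", spoken by a recognized household member:
    intent = Intent(
        name="routine.start",
        slots={"routine": "morning"},
        speaker="alex",
    )
    print(json.dumps(intent.__dict__, indent=2))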

Learn more

See the full architecture below, and how Auri maps intents into agent actions. [add image]

Integration & deployment

  • Deployment targets — Embedded on hubs/routers; sidecar on existing gateways; companion app on mini‑PC/NAS
  • Integrations — Connectors to existing platforms and OEM APIs; optional SDKs for domain‑specific agents (sketched below)
  • Pilot flow — Scope → reference deployment → limited field trial → scale‑up with support
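
If the optional SDK takes shape along these lines, registering a domain‑specific agent behind Auri’s controlled tool interface might look like this. The register call and every name here are hypothetical, included only to show the intended shape.

    # Illustrative only: not a published API.
    class EVChargerAgent:
        """Domain-specific agent: schedules EV charging around energy prices."""
        def step(self, observation: dict) -> list[str]:
            price = observation.get("energy_price_eur_kwh", 1.0)
            return ["start_charging"] if price < 0.15 else []

    def register(agent, *, name: str, slo_ms: int, tools: list[str]):
        # Stand-in for an SDK registration call; a real SDK would validate
        # the declared SLO and grant access only to the listed tools.
        print(f"registered {name}: slo={slo_ms}ms tools={tools}")
        return agent

    charger = register(
        EVChargerAgent(),
        name="ev_charger",
        slo_ms=5_000,
        tools=["energy_prices", "vendor.charger_api"],
    )
    print(charger.step({"energy_price_eur_kwh": 0.12}))  # ['start_charging']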

Next steps

Auri is in active development and running in internal environments. We’re onboarding a small number of OEM partners.

Discuss a pilot