Mirora AI

[ ARCHITECTURE ]

Service Architecture

FastAPI microservices orchestrated with Docker Compose across foundation, processing, and orchestration tiers.

Current Service Inventory

Active services in the `mirora-ai-microservices` platform.

Core AI Service

LLM gateway, prompt registry, guardrails, and shared request metrics.

localhost:8000

Document Processing

File ingestion, extraction, and semantic chunking for downstream AI workflows.

localhost:8001

Vector Store Service

ChromaDB-backed vector storage, collection management, and similarity search.

localhost:8002

RAG Service

Retrieval-augmented generation pipeline with citation support and caching.

localhost:8003

Evaluation Framework

Evaluation suites, datasets, and regression-focused service validation workflows.

localhost:8004

Knowledge Generation Service

Topic lifecycle management, freshness controls, and semantic search for reusable knowledge.

localhost:8005

Synthetic Data Service

Domain-specific synthetic entities, timelines, events, and document generation.

localhost:8010

Information Service

Schema-driven extraction, gap analysis, and question generation.

localhost:8020

Image Generation Service

Dedicated text-to-image generation API in the main microservices stack.

localhost:8040

Embedding Service

Standalone embedding API using sentence-transformers and OpenAI-compatible routes.

localhost:8006 (standalone)
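The ports above can be collected into a small client-side registry. A minimal sketch, using the service names and ports from the inventory above; the `/health` path is an assumption, not a documented route:

```python
# Service registry built from the inventory above.
# The /health path is an assumption; adjust to each service's actual route.

SERVICES = {
    "core-ai": 8000,
    "document-processing": 8001,
    "vector-store": 8002,
    "rag": 8003,
    "evaluation": 8004,
    "knowledge-generation": 8005,
    "embedding": 8006,   # standalone
    "synthetic-data": 8010,
    "information": 8020,
    "image-generation": 8040,
}

def base_url(service: str) -> str:
    """Return the local base URL for a named service."""
    return f"http://localhost:{SERVICES[service]}"

def health_url(service: str) -> str:
    """Assumed health-check endpoint for a service."""
    return f"{base_url(service)}/health"

print(health_url("rag"))  # http://localhost:8003/health
```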

Runtime Modes

Platform orchestration is profile-driven from the root Docker Compose file.

Default

Infrastructure-only startup for local dependencies (Redis + PostgreSQL).

services

Infrastructure plus core platform services for API and integration work.

full

Infrastructure, services, gateway, and monitoring stack for end-to-end validation.

monitoring

Infrastructure plus Prometheus/Grafana observability components.
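The profile wiring can be sketched as a Compose fragment. This is illustrative only; service names, images, and profile assignments are assumptions, not the platform's actual file:

```yaml
# Illustrative profile-driven Compose fragment (assumed layout).
services:
  redis:
    image: redis:7          # no profile: starts in every mode, including the default
  postgres:
    image: postgres:15
  core-ai:
    build: ./core-ai
    profiles: ["services", "full"]
  traefik:
    image: traefik:v3
    profiles: ["full"]
  prometheus:
    image: prom/prometheus
    profiles: ["monitoring", "full"]
  grafana:
    image: grafana/grafana
    profiles: ["monitoring", "full"]
```

A mode is selected with `docker compose --profile full up -d`; omitting `--profile` yields the infrastructure-only default.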

Platform Stack

Compute

Dockerized FastAPI services orchestrated with Compose profiles.

Database

PostgreSQL 15 with SQLAlchemy async persistence where required.

Cache & Queue

Redis 7 for caching, queueing, and service-specific DB index partitioning.
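The per-service DB index partitioning can be sketched as a simple index map. The index assignments below are illustrative assumptions, not the platform's actual layout:

```python
# Illustrative Redis logical-DB partitioning per service.
# Index assignments are assumptions, not the platform's actual layout.

REDIS_DB_INDEX = {
    "core-ai": 0,
    "document-processing": 1,
    "rag": 2,
    "embedding": 3,
}

def redis_url(service: str, host: str = "localhost", port: int = 6379) -> str:
    """Build a redis:// URL pointing at the service's logical DB."""
    return f"redis://{host}:{port}/{REDIS_DB_INDEX[service]}"

print(redis_url("rag"))  # redis://localhost:6379/2
```

Partitioning by logical DB keeps each service's keys isolated on a single Redis instance without extra key prefixes.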

Gateway & Monitoring

Traefik routing with Prometheus and Grafana for platform observability.

Deployment Status

MAIN COMPOSE SERVICES

Service | Endpoint | Status | Dependencies
Core AI Service | localhost:8000 | ✅ Active | PostgreSQL, Redis
Document Processing | localhost:8001 | ✅ Active | Core AI, Redis
Vector Store Service | localhost:8002 | ✅ Active | ChromaDB
RAG Service | localhost:8003 | ✅ Active | Core AI, Embedding Service, Vector Store, Redis
Evaluation Framework | localhost:8004 | ✅ Active | Core AI, Document Processing, Information Service, Synthetic Data, PostgreSQL, Redis
Knowledge Generation | localhost:8005 | ✅ Active | Core AI, Document Processing, Information Service, Vector Store, Redis, PostgreSQL
Embedding Service | localhost:8006 | ✅ Active | Redis (optional)
Synthetic Data Service | localhost:8010 | ✅ Active | Core AI, Redis
Information Service | localhost:8020 | ✅ Active | Core AI, Redis
Image Generation | localhost:8040 | ✅ Active | Service-specific model providers

Processing Pipelines

Document Ingestion

Upload → Document Processing → Text Extraction → Chunking → Embeddings → Vector Store

End-to-end document processing with vector storage for retrieval
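The chunking step in this pipeline can be illustrated with a simple stand-in. The real service performs semantic chunking; this fixed-size, overlapping-window version only shows the shape of the stage:

```python
# Stand-in for the ingestion pipeline's chunking step.
# The real service does semantic chunking; this fixed-size/overlap
# version only illustrates the shape of the stage.

def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than size")
    chunks = []
    step = size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
        if start + size >= len(text):
            break
    return chunks

chunks = chunk_text("a" * 500, size=200, overlap=50)
print(len(chunks))  # 3
```

Overlap preserves context at chunk boundaries so that embeddings of adjacent chunks share some signal.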

Career Analysis

Resume Upload → Document Processing → Information Extraction → Gap Analysis → Role Matching

Powers the Career Analysis tool
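The gap-analysis step of this pipeline can be sketched as a set comparison between extracted resume skills and a target role's requirements; the skill names and scoring here are illustrative, not the service's actual logic:

```python
# Toy gap-analysis step: compare skills extracted from a resume
# against a target role's requirements. Skill names are illustrative.

def skill_gap(resume_skills: set[str], role_requirements: set[str]) -> dict:
    """Return matched and missing skills for a candidate/role pair."""
    matched = resume_skills & role_requirements
    missing = role_requirements - resume_skills
    coverage = len(matched) / len(role_requirements) if role_requirements else 1.0
    return {"matched": sorted(matched), "missing": sorted(missing), "coverage": coverage}

result = skill_gap({"python", "sql", "docker"},
                   {"python", "sql", "kubernetes", "terraform"})
print(result["missing"])  # ['kubernetes', 'terraform']
```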

LLM Request

Request → Core AI Gateway → Provider Selection → LLM Provider → Metrics + Response

Unified LLM access with caching and metrics
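The provider-selection step can be sketched as a routing table keyed on model-name prefixes. The prefix rules and provider names below are assumptions about how such a gateway might route, not the platform's actual configuration:

```python
# Sketch of the gateway's provider-selection step: route a request
# to a provider based on the model name. Prefix rules are assumptions.

PROVIDER_PREFIXES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "llama-": "local",
}

def select_provider(model: str) -> str:
    """Pick a provider for the requested model, defaulting to 'openai'."""
    for prefix, provider in PROVIDER_PREFIXES.items():
        if model.startswith(prefix):
            return provider
    return "openai"

print(select_provider("claude-3-5-sonnet"))  # anthropic
```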

RAG Query

Query → Embedding → Vector Store Search → Context Retrieval → Core AI (LLM) → Response

Retrieval-augmented generation grounded in your documents
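The vector-store search step can be illustrated with a toy top-k retrieval over hand-made embeddings; in the platform, real vectors come from the Embedding Service and the search runs inside ChromaDB:

```python
# Toy version of the retrieval step: cosine similarity between a
# query vector and stored chunk vectors, then top-k selection.
# Real embeddings come from the Embedding Service; these are hand-made.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query: list[float], store: dict[str, list[float]], k: int = 2) -> list[str]:
    """Return the ids of the k most similar stored vectors."""
    ranked = sorted(store, key=lambda cid: cosine(query, store[cid]), reverse=True)
    return ranked[:k]

store = {
    "doc-1": [1.0, 0.0],
    "doc-2": [0.9, 0.1],
    "doc-3": [0.0, 1.0],
}
print(top_k([1.0, 0.05], store, k=2))  # ['doc-1', 'doc-2']
```

The retrieved chunk ids would then be expanded into context text and passed to Core AI for generation.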

Knowledge Lifecycle

Topic Request → Knowledge Generation → Information + Vector Enrichment → Status Tracking → Ready Knowledge Base

Managed topic knowledge with freshness and lifecycle state transitions
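The lifecycle state transitions can be sketched as a small state machine. The state names below are assumptions inferred from the description above, not the service's actual schema:

```python
# Sketch of topic lifecycle states with allowed transitions.
# State names are assumptions based on the description above.

ALLOWED = {
    "requested": {"generating"},
    "generating": {"ready", "failed"},
    "ready": {"stale"},        # freshness controls can mark content stale
    "stale": {"generating"},   # regeneration refreshes the topic
    "failed": {"generating"},
}

def transition(state: str, new_state: str) -> str:
    """Move a topic to new_state, enforcing the lifecycle graph."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

s = transition("requested", "generating")
s = transition(s, "ready")
print(s)  # ready
```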

Platform Standards

Framework

FastAPI with async support. Pydantic models. OpenAPI docs. Structured logging.

Containerization

Docker multi-stage builds. Health checks. Render Blueprint deployment.
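A multi-stage build for a FastAPI service typically looks like the sketch below; paths, base images, and the health-check command are assumptions, not the platform's actual Dockerfile:

```dockerfile
# Illustrative multi-stage build for a FastAPI service (assumed layout).
FROM python:3.12-slim AS builder
WORKDIR /app
COPY requirements.txt .
RUN pip install --prefix=/install -r requirements.txt

FROM python:3.12-slim
WORKDIR /app
COPY --from=builder /install /usr/local
COPY app ./app
HEALTHCHECK CMD python -c "import urllib.request; urllib.request.urlopen('http://localhost:8000/health')"
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

The builder stage keeps compilers and pip caches out of the runtime image, and the `HEALTHCHECK` feeds Compose's health-check gating.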

Security

API key auth (X-API-Key). CORS configuration. TLS everywhere.
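The `X-API-Key` check can be sketched as a plain function; in the platform it would be wired in as a FastAPI dependency. The key value is illustrative:

```python
# Minimal X-API-Key check as a plain function; in the platform this
# would be a FastAPI dependency. The key value is illustrative.
import hmac

VALID_KEYS = {"demo-key-123"}  # in production, load from secret storage

def verify_api_key(headers: dict[str, str]) -> bool:
    """Return True iff the request carries a known X-API-Key."""
    candidate = headers.get("X-API-Key", "")
    # hmac.compare_digest gives a constant-time comparison,
    # avoiding timing side channels on the key value
    return any(hmac.compare_digest(candidate, key) for key in VALID_KEYS)

print(verify_api_key({"X-API-Key": "demo-key-123"}))  # True
print(verify_api_key({}))                             # False
```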

Observability

Structured JSON logging. Health endpoints. Prometheus-compatible metrics.
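Structured JSON logging in the spirit of this standard can be done with the stdlib alone; the field names below are a common minimal choice, not the platform's documented log schema:

```python
# Structured JSON logging with the stdlib: a formatter that emits
# one JSON object per record. Field names are a minimal assumed schema.
import json
import logging

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("core-ai")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("request served")
# emits: {"level": "INFO", "logger": "core-ai", "message": "request served"}
```

One JSON object per line keeps the output trivially parseable by log shippers and by Grafana-side queries.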
