# AI Integration
Multi-provider AI support built with LangChain.
## Supported Models

- OpenAI (GPT-4, GPT-3.5)
- Google Gemini
- Anthropic Claude
The stack abstracts the complexity of different providers, allowing you to switch models with a simple configuration change or let users choose their preferred model.
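A configuration-driven model switch can be sketched as a small registry lookup. The registry entries and the `resolveModel` helper below are illustrative assumptions for this pattern, not the stack's actual configuration API.

```javascript
// Illustrative sketch of config-driven model selection. The registry
// entries and helper name are assumptions, not the stack's real API.
const MODEL_REGISTRY = {
  "gpt-4": { provider: "openai", envKey: "OPENAI_API_KEY" },
  "gpt-3.5-turbo": { provider: "openai", envKey: "OPENAI_API_KEY" },
  "gemini-pro": { provider: "google", envKey: "GOOGLE_API_KEY" },
  "claude-3-sonnet": { provider: "anthropic", envKey: "ANTHROPIC_API_KEY" },
};

// Map a model name (from config or a user's choice) to its provider entry.
function resolveModel(modelName) {
  const entry = MODEL_REGISTRY[modelName];
  if (!entry) {
    throw new Error(`Unknown model: ${modelName}`);
  }
  return { model: modelName, ...entry };
}
```

With a lookup like this, switching providers is a one-line config change, and a user-facing model picker only needs the registry's keys.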
## Core Modules

### Chat
Streaming chat interface with history support.
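Consuming the stream on the client can be sketched with the Fetch API's reader interface. The `readStream` helper is illustrative, not part of the stack, and assumes the chat route streams plain text chunks rather than SSE.

```javascript
// Illustrative helper for consuming a streamed chat response chunk by
// chunk. Assumes the route streams plain text; adapt if it emits SSE.
async function readStream(stream, onChunk) {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let full = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text); // render each chunk as it arrives
  }
  return full;
}

// Usage against the chat route (requires the app to be running):
// const res = await fetch("/api/module/chat?message=Hello&conversationId=123");
// const reply = await readStream(res.body, (chunk) => process.stdout.write(chunk));
```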
### RAG (Retrieval-Augmented Generation)
Upload documents (PDF, TXT) and query them using vector search (ChromaDB).
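A query call can be sketched with `FormData`, following the field names shown in the Using the API section; the helper names and any fields beyond `action` and `question` are assumptions.

```javascript
// Build the form data for a RAG query. Field names follow the
// { action: 'query', question: '...' } shape from the API examples.
function buildRagQueryForm(question) {
  const form = new FormData();
  form.append("action", "query");
  form.append("question", question);
  return form;
}

// Send the query to the RAG route (illustrative; requires a running app).
async function queryDocuments(question) {
  const res = await fetch("/api/module/rag", {
    method: "POST",
    body: buildRagQueryForm(question),
  });
  if (!res.ok) throw new Error(`RAG query failed: ${res.status}`);
  return res.json();
}
```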
### Image Generation
Generate images from text prompts.
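The doc does not name the image route, so the `/api/module/image` path and request body below are hypothetical, shown only to illustrate a JSON-based call to this module.

```javascript
// Hypothetical request builder: the endpoint path and body fields are
// assumptions, since the doc does not specify the image API's shape.
function buildImageRequest(prompt, size = "1024x1024") {
  return {
    url: "/api/module/image",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ prompt, size }),
    },
  };
}

// Usage (illustrative): const { url, init } = buildImageRequest("a red fox");
// const res = await fetch(url, init);
```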
### Summarization
Summarize long texts or documents.
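When a text exceeds a model's context window, a common approach is to summarize overlapping chunks and then combine the partial summaries. The chunking helper below is an illustrative sketch of that first step, not the stack's implementation; the sizes are placeholders to tune per model.

```javascript
// Split a long text into overlapping chunks so each fits in a model's
// context window. The overlap preserves context across chunk boundaries.
function chunkText(text, chunkSize = 1000, overlap = 100) {
  if (overlap >= chunkSize) {
    throw new Error("overlap must be smaller than chunkSize");
  }
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap;
  }
  return chunks;
}
```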
## Configuration
AI services are located in `src/lib/ai`. You need to configure the API keys for the providers you intend to use.
```env
# AI keys
OPENAI_API_KEY=...
GOOGLE_API_KEY=...
ANTHROPIC_API_KEY=...
```

## Using the API
You can interact with the AI modules via the standardized API routes.
```javascript
// Example: Chat API
GET /api/module/chat?message=Hello&conversationId=123

// Example: RAG Query
POST /api/module/rag
Form Data: { action: 'query', question: '...', ... }
```