Add CORS support for frontend development, with allowed origins configurable
via the CORS_ORIGINS environment variable. Add a /chat/stream endpoint for
Server-Sent Events: the OpenAI adapter streams tokens as they arrive, while
other adapters fall back to emitting the whole reply as a single chunk.
Sketches of both follow below.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
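
A minimal sketch of the CORS wiring, assuming a comma-separated CORS_ORIGINS
value and FastAPI's built-in CORSMiddleware; the default origin and parsing
details are illustrative, not taken from the commit:

```python
import os

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Comma-separated allowed origins, e.g.
# CORS_ORIGINS="http://localhost:5173,https://app.example.com"
# (the localhost default below is an assumption for frontend development).
origins = [
    origin.strip()
    for origin in os.getenv("CORS_ORIGINS", "http://localhost:3000").split(",")
    if origin.strip()
]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```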
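
And a sketch of the /chat/stream behavior. The adapter interface
(stream_chat/chat), the schema name, and the stub adapter are all assumptions
standing in for the real adapter module:

```python
from collections.abc import AsyncIterator

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    # Field names follow the /chat endpoint bullets; the class name is assumed.
    message: str
    conversation_id: str | None = None

class EchoAdapter:
    """Stand-in for the real adapter; only the OpenAI adapter would
    define stream_chat."""

    async def chat(self, message: str) -> str:
        return f"echo: {message}"

def get_adapter() -> EchoAdapter:
    # Placeholder for the adapter factory described in the commit below.
    return EchoAdapter()

@app.post("/chat/stream")
async def chat_stream(request: ChatRequest) -> StreamingResponse:
    adapter = get_adapter()

    async def event_stream() -> AsyncIterator[str]:
        if hasattr(adapter, "stream_chat"):
            # True streaming: one SSE event per model chunk.
            async for chunk in adapter.stream_chat(request.message):
                yield f"data: {chunk}\n\n"
        else:
            # Fallback for non-streaming adapters: the whole reply
            # goes out as a single SSE chunk.
            reply = await adapter.chat(request.message)
            yield f"data: {reply}\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```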
- Add OpenAIAdapter class using the official OpenAI SDK with async support
  (sketch below)
- Create a custom exception hierarchy for LLM errors (authentication,
  rate limit, connection, configuration, and response errors); sketch below
- Refactor the adapter factory to use FastAPI Depends() for dependency
  injection (sketch below)
- Update configuration to support an 'openai' mode with API key and model
  settings (sketch below)
- Add proper HTTP error mapping for all LLM exception types (sketch below)
- Update the Dockerfile with a default OPENAI_MODEL environment variable
- Update .env.example with OpenAI configuration options
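
The adapter itself might look roughly like this. OpenAIAdapter is the class
name from the commit, but the method names and prompt shape are assumptions:

```python
from collections.abc import AsyncIterator

from openai import AsyncOpenAI  # official SDK, async client

class OpenAIAdapter:
    """LLM adapter backed by the official OpenAI SDK."""

    def __init__(self, api_key: str, model: str) -> None:
        self._client = AsyncOpenAI(api_key=api_key)
        self._model = model

    async def chat(self, message: str) -> str:
        response = await self._client.chat.completions.create(
            model=self._model,
            messages=[{"role": "user", "content": message}],
        )
        return response.choices[0].message.content or ""

    async def stream_chat(self, message: str) -> AsyncIterator[str]:
        stream = await self._client.chat.completions.create(
            model=self._model,
            messages=[{"role": "user", "content": message}],
            stream=True,
        )
        async for chunk in stream:
            # Some stream chunks carry no choices or an empty delta; skip them.
            if not chunk.choices:
                continue
            delta = chunk.choices[0].delta.content
            if delta:
                yield delta
```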
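
A plausible shape for the exception hierarchy, matching the five categories
the commit lists (the class names are guesses):

```python
class LLMError(Exception):
    """Base class for all LLM adapter errors."""

class LLMAuthenticationError(LLMError):
    """Invalid or missing API credentials."""

class LLMRateLimitError(LLMError):
    """Provider rejected the request due to rate limiting."""

class LLMConnectionError(LLMError):
    """Network-level failure while reaching the provider."""

class LLMConfigurationError(LLMError):
    """Adapter is misconfigured (e.g. unknown mode, missing model)."""

class LLMResponseError(LLMError):
    """Provider returned an unusable or malformed response."""
```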
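
The configuration update could look like this, assuming pydantic-settings with
env-file support; the field names and defaults are assumptions:

```python
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    """Environment-driven configuration, read from the process environment
    and an optional .env file."""

    model_config = SettingsConfigDict(env_file=".env")

    # LLM_MODE selects the adapter: 'local', 'remote', or 'openai'.
    llm_mode: str = "local"
    openai_api_key: str = ""
    openai_model: str = "gpt-4o-mini"  # default model name is a placeholder
```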
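
And the refactored factory, reusing the Settings, OpenAIAdapter, and exception
names from the sketches above; the exact wiring is an assumption:

```python
from fastapi import Depends, FastAPI

app = FastAPI()

def get_settings() -> Settings:
    # The real code would likely cache this (e.g. with functools.lru_cache).
    return Settings()

def get_adapter(settings: Settings = Depends(get_settings)):
    # FastAPI resolves the settings dependency and hands each request a
    # configured adapter; the real factory presumably also handles the
    # local and remote modes.
    if settings.llm_mode == "openai":
        return OpenAIAdapter(settings.openai_api_key, settings.openai_model)
    raise LLMConfigurationError(f"unsupported LLM_MODE: {settings.llm_mode}")
```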
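
One way the HTTP error mapping could be wired is a FastAPI exception handler
on the LLMError base class; the specific status codes below are conventional
guesses, not taken from the commit:

```python
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()

# Each error class mapped to its conventional HTTP meaning.
STATUS_BY_ERROR = {
    LLMAuthenticationError: 401,
    LLMRateLimitError: 429,
    LLMConnectionError: 502,
    LLMConfigurationError: 500,
    LLMResponseError: 502,
}

@app.exception_handler(LLMError)
async def llm_error_handler(request: Request, exc: LLMError) -> JSONResponse:
    # Starlette matches handlers by isinstance, so one handler on the base
    # class covers every subclass in the hierarchy.
    status = STATUS_BY_ERROR.get(type(exc), 500)
    return JSONResponse(status_code=status, content={"detail": str(exc)})
```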
- POST /chat endpoint with message and conversation_id support (sketch below)
- GET /health endpoint for Cloud Run health checks
- Local and Remote LLM adapters with async httpx (sketch below)
- Pydantic schemas and environment-based config
- Dockerfile configured for Cloud Run deployment
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
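
An illustrative sketch of the initial endpoints and Pydantic schemas; the
schema names and the echoed reply are placeholders for the real adapter call:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    message: str
    conversation_id: str | None = None

class ChatResponse(BaseModel):
    reply: str
    conversation_id: str | None = None

@app.post("/chat", response_model=ChatResponse)
async def chat(request: ChatRequest) -> ChatResponse:
    # The real handler would call an LLM adapter; echoed here for brevity.
    return ChatResponse(
        reply=f"echo: {request.message}",
        conversation_id=request.conversation_id,
    )

@app.get("/health")
async def health() -> dict[str, str]:
    # Cloud Run probes this endpoint; any 200 response marks the service healthy.
    return {"status": "ok"}
```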
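
And a sketch of how the Local and Remote adapters could share an async httpx
core; the class names, base URLs, endpoint path, and payload shape are all
assumptions:

```python
import httpx

class HttpLLMAdapter:
    """Shared async-httpx adapter; the Local and Remote variants likely
    differ only in where they point."""

    def __init__(self, base_url: str) -> None:
        self._base_url = base_url

    async def chat(self, message: str) -> str:
        async with httpx.AsyncClient(
            base_url=self._base_url, timeout=30.0
        ) as client:
            response = await client.post("/v1/chat", json={"message": message})
            response.raise_for_status()
            return response.json()["reply"]

class LocalLLMAdapter(HttpLLMAdapter):
    def __init__(self) -> None:
        super().__init__("http://localhost:8080")  # assumed local backend port

class RemoteLLMAdapter(HttpLLMAdapter):
    def __init__(self, base_url: str) -> None:
        super().__init__(base_url)
```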