tyndale-ai-service/app
Danny 3324b6ac12 feat: add OpenAI integration with dependency injection support
- Add OpenAIAdapter class using official OpenAI SDK with async support
- Create custom exception hierarchy for LLM errors (authentication,
  rate limit, connection, configuration, response errors)
- Refactor adapter factory to use FastAPI Depends() for dependency injection
- Update configuration to support 'openai' mode with API key and model settings
- Add proper HTTP error mapping for all LLM exception types
- Update Dockerfile with default OPENAI_MODEL environment variable
- Update .env.example with OpenAI configuration options
2026-01-13 15:17:44 -06:00
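The commit bullets describe the adapter and its exception hierarchy but not their code. Below is a minimal sketch of what `app/llm`'s OpenAI adapter could look like; the class names (`OpenAIAdapter`, `LLMError` and its subclasses) and the `chat()` method are assumptions inferred from the commit message, not copied from the repository.

```python
# Sketch of an async OpenAI adapter that translates SDK errors into a custom
# LLM exception hierarchy. All names here are illustrative assumptions.
from openai import APIConnectionError, AsyncOpenAI, AuthenticationError, RateLimitError


class LLMError(Exception):
    """Base class for the custom LLM exception hierarchy."""


class LLMAuthenticationError(LLMError): ...
class LLMRateLimitError(LLMError): ...
class LLMConnectionError(LLMError): ...
class LLMConfigurationError(LLMError): ...
class LLMResponseError(LLMError): ...


class OpenAIAdapter:
    """Thin async wrapper around the official OpenAI SDK."""

    def __init__(self, api_key: str, model: str) -> None:
        self._client = AsyncOpenAI(api_key=api_key)
        self._model = model

    async def chat(self, prompt: str) -> str:
        try:
            response = await self._client.chat.completions.create(
                model=self._model,
                messages=[{"role": "user", "content": prompt}],
            )
        except AuthenticationError as exc:
            raise LLMAuthenticationError(str(exc)) from exc
        except RateLimitError as exc:
            raise LLMRateLimitError(str(exc)) from exc
        except APIConnectionError as exc:
            raise LLMConnectionError(str(exc)) from exc

        content = response.choices[0].message.content
        if content is None:
            raise LLMResponseError("model returned an empty completion")
        return content
```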
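The "'openai' mode with API key and model settings" and the `Depends()`-based factory could fit together roughly as sketched here; the `Settings` field names, the module path, and the default model value are assumptions rather than the project's actual configuration.

```python
# Sketch of the configuration and the adapter factory resolved through
# FastAPI dependency injection. Field names and defaults are assumptions.
from functools import lru_cache

from fastapi import Depends
from pydantic_settings import BaseSettings

from app.llm.openai_adapter import LLMConfigurationError, OpenAIAdapter  # assumed path


class Settings(BaseSettings):
    llm_mode: str = "echo"             # set to "openai" to enable the real adapter
    openai_api_key: str | None = None  # documented in .env.example
    openai_model: str = "gpt-4o-mini"  # Dockerfile supplies a default via OPENAI_MODEL


@lru_cache
def get_settings() -> Settings:
    return Settings()


def get_llm_adapter(settings: Settings = Depends(get_settings)) -> OpenAIAdapter:
    """Adapter factory; other modes (e.g. a stub adapter) would branch here."""
    if settings.llm_mode != "openai":
        raise LLMConfigurationError(f"unsupported llm mode: {settings.llm_mode}")
    if not settings.openai_api_key:
        raise LLMConfigurationError("OPENAI_API_KEY is required in openai mode")
    return OpenAIAdapter(settings.openai_api_key, settings.openai_model)
```

A route handler would then declare `adapter: OpenAIAdapter = Depends(get_llm_adapter)`, and FastAPI resolves the settings and adapter per request.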
llm feat: add OpenAI integration with dependency injection support 2026-01-13 15:17:44 -06:00
__init__.py feat: add FastAPI skeleton for LLM chat service 2026-01-07 19:32:57 -06:00
config.py feat: add OpenAI integration with dependency injection support 2026-01-13 15:17:44 -06:00
main.py feat: add OpenAI integration with dependency injection support 2026-01-13 15:17:44 -06:00
schemas.py feat: add OpenAI integration with dependency injection support 2026-01-13 15:17:44 -06:00
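For the "proper HTTP error mapping for all LLM exception types" bullet, a plausible shape for the handler registered in main.py is sketched below; the specific status codes and the `LLMError` class names are illustrative assumptions, not the repository's actual choices.

```python
# Sketch of mapping the custom LLM exceptions to HTTP responses in main.py.
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

from app.llm.openai_adapter import (  # assumed module path
    LLMAuthenticationError,
    LLMConfigurationError,
    LLMConnectionError,
    LLMError,
    LLMRateLimitError,
    LLMResponseError,
)

app = FastAPI()

# One HTTP status per exception type in the hierarchy; exact codes are a guess.
_STATUS_BY_ERROR = {
    LLMAuthenticationError: 502,  # upstream rejected the service's credentials
    LLMRateLimitError: 429,
    LLMConnectionError: 503,
    LLMConfigurationError: 500,
    LLMResponseError: 502,
}


@app.exception_handler(LLMError)
async def llm_error_handler(request: Request, exc: LLMError) -> JSONResponse:
    # Starlette matches handlers along the exception's MRO, so every
    # subclass of LLMError lands here.
    status = _STATUS_BY_ERROR.get(type(exc), 500)
    return JSONResponse(status_code=status, content={"detail": str(exc)})
```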