LLM - Summary
Lillo's language model system provides a flexible and extensible architecture for integrating multiple AI providers. The system is built around a factory pattern that supports dynamic model selection and per-chat model preferences.
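As a rough illustration of that factory approach, the sketch below assumes a minimal `LLMProvider` interface and a static `create` method on `LLMFactory`; the method names and interface shape are assumptions, not the actual API in `src/lib/services/LLMFactory.ts`.

```typescript
// Hypothetical shape of the factory pattern described above; everything
// except the LLMFactory name is an illustrative assumption.
export interface LLMProvider {
  /** Stable provider id, e.g. "openai" or "gemini". */
  readonly id: string;
  /** Send a prompt to the underlying model and return its completion. */
  complete(prompt: string, options?: { model?: string }): Promise<string>;
}

export class LLMFactory {
  private static registry = new Map<string, () => LLMProvider>();

  /** Register a provider constructor under a stable id. */
  static register(id: string, make: () => LLMProvider): void {
    this.registry.set(id, make);
  }

  /** Create a provider dynamically from the id chosen for the current chat. */
  static create(id: string): LLMProvider {
    const make = this.registry.get(id);
    if (!make) throw new Error(`Unknown LLM provider: ${id}`);
    return make();
  }
}
```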
Core Components
src/lib/services/
├── LLMFactory.ts              # Main factory implementation
├── ModelPreferencesService.ts # Model selection system
└── AgentConfigService.ts      # Agent configuration
Supported Models
OpenAI (Primary)
Gemini
Grok
DeepSeek
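For illustration only, the four providers above could be modelled as a union type with per-provider defaults. The default model identifiers below are assumptions and should be checked against the actual configuration, not read as project values.

```typescript
// Illustrative only: provider ids follow the list above, but the default
// model names are assumptions, not values taken from the codebase.
export type SupportedProvider = 'openai' | 'gemini' | 'grok' | 'deepseek';

export const DEFAULT_MODELS: Record<SupportedProvider, string> = {
  openai: 'gpt-4o',          // primary provider per the list above
  gemini: 'gemini-1.5-pro',
  grok: 'grok-2',
  deepseek: 'deepseek-chat',
};
```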
Function Calling
Core Functions
Implementation
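The document does not show the project's function-calling code, so the sketch below only illustrates the general OpenAI-style tool definition and dispatch a factory like this would typically pass through; the `getWeather` function, its schema, and the handler table are purely hypothetical.

```typescript
// Hypothetical tool definition in the OpenAI-style function-calling format;
// the function name, parameters, and dispatch table are illustrative only.
const tools = [
  {
    type: 'function' as const,
    function: {
      name: 'getWeather',
      description: 'Look up the current weather for a city.',
      parameters: {
        type: 'object',
        properties: { city: { type: 'string' } },
        required: ['city'],
      },
    },
  },
];

// Map tool names to local implementations so a tool call returned by the
// model can be executed and its result sent back in a follow-up message.
const handlers: Record<string, (args: any) => Promise<unknown>> = {
  getWeather: async ({ city }) => ({ city, tempC: 21 }), // stubbed result
};

async function runToolCall(name: string, rawArgs: string): Promise<unknown> {
  const handler = handlers[name];
  if (!handler) throw new Error(`No handler registered for tool: ${name}`);
  return handler(JSON.parse(rawArgs));
}
```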
Model Selection
Per-Chat Preferences
Selection Criteria
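A minimal sketch of per-chat preferences, assuming an in-memory map keyed by chat id with a global fallback; the real `ModelPreferencesService` may use different method names and persistent storage.

```typescript
// Minimal per-chat preference store keyed by chat id with a global default.
// Method names and the in-memory storage are assumptions, not the actual
// service API.
export class ModelPreferencesService {
  private prefs = new Map<string, string>(); // chatId -> provider/model id

  constructor(private defaultModel: string = 'openai') {}

  /** Remember the model chosen for a specific chat. */
  setForChat(chatId: string, model: string): void {
    this.prefs.set(chatId, model);
  }

  /** Resolve the model for a chat, falling back to the global default. */
  getForChat(chatId: string): string {
    return this.prefs.get(chatId) ?? this.defaultModel;
  }
}
```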
Best Practices
Provider Selection
Error Handling
Performance
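To make the provider-selection and error-handling practices concrete, here is one possible fallback wrapper; the retry-then-fallback policy, the `callProvider` callback, and the retry count are assumptions, not documented behaviour.

```typescript
// Illustrative fallback policy: try each provider in order, retrying once
// before moving on to the next. All names here are hypothetical.
async function completeWithFallback(
  providerIds: string[],
  prompt: string,
  callProvider: (id: string, prompt: string) => Promise<string>,
): Promise<string> {
  let lastError: unknown;
  for (const id of providerIds) {
    for (let attempt = 0; attempt < 2; attempt++) {
      try {
        return await callProvider(id, prompt);
      } catch (err) {
        lastError = err; // remember the latest failure for the final error
      }
    }
  }
  throw new Error(`All providers failed: ${String(lastError)}`);
}
```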
Related Documentation
Last updated