Factory
The LLM Factory implements a flexible provider pattern for managing multiple language model integrations. It provides a unified interface for generating responses while supporting provider-specific features such as function calling.
Architecture
Core Interfaces
interface LLMResponse {
  content: string;
  totalTokens?: number;
  toolCalls?: Array<{
    id: string;
    type: 'function';
    function: {
      name: string;
      arguments: string;
    };
  }>;
}

interface LLMProvider {
  generateResponse(
    messages: Array<{ role: string; content: string }>,
    cleanContent: string
  ): Promise<LLMResponse>;
}
export type ModelType = 'OPENAI' | 'GEMINI' | 'GROK' | 'DEEPSEEK';

Factory Implementation
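The concrete factory is not reproduced here. A minimal sketch of a factory keyed on ModelType, assuming hypothetical provider classes (OpenAIProvider, GeminiProvider, GrokProvider, DeepSeekProvider) that each implement LLMProvider:

import { OpenAIProvider, GeminiProvider, GrokProvider, DeepSeekProvider } from './providers'; // hypothetical module path

export function createLLMProvider(model: ModelType): LLMProvider {
  switch (model) {
    case 'OPENAI':
      return new OpenAIProvider();
    case 'GEMINI':
      return new GeminiProvider();
    case 'GROK':
      return new GrokProvider();
    case 'DEEPSEEK':
      return new DeepSeekProvider();
    default: {
      // Exhaustiveness guard: if ModelType gains a member, this branch fails to type-check.
      const unhandled: never = model;
      throw new Error(`Unsupported model type: ${unhandled}`);
    }
  }
}

Switching on the ModelType union keeps the factory exhaustive: adding a new model type produces a compile-time error until a provider is wired in for it.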
Providers
OpenAI Provider
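The provider implementations themselves are not shown here. As an illustrative sketch only, an OpenAI-backed provider could satisfy LLMProvider with the official openai SDK roughly as follows; the model name, the treatment of cleanContent, and the tool-call mapping are assumptions rather than the project's actual code.

import OpenAI from 'openai';

class OpenAIProvider implements LLMProvider {
  private client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

  async generateResponse(
    messages: Array<{ role: string; content: string }>,
    cleanContent: string
  ): Promise<LLMResponse> {
    const completion = await this.client.chat.completions.create({
      model: 'gpt-4o', // assumed default; the real provider may use a different model
      // Assumption: cleanContent is sent as the final user message.
      messages: [
        ...messages,
        { role: 'user', content: cleanContent },
      ] as OpenAI.Chat.ChatCompletionMessageParam[],
    });

    const choice = completion.choices[0];
    return {
      content: choice.message.content ?? '',
      totalTokens: completion.usage?.total_tokens,
      toolCalls: choice.message.tool_calls?.map((call) => ({
        id: call.id,
        type: 'function' as const,
        function: {
          name: call.function.name,
          arguments: call.function.arguments,
        },
      })),
    };
  }
}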
Gemini Provider
Grok Provider
DeepSeek Provider
Function Calling
Core Tools
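The core tool set is not listed here. Given the toolCalls shape in LLMResponse, dispatching tool calls might look like the following sketch; the registry and any tool names in it are hypothetical.

// Hypothetical registry of core tools; the project's actual tools are defined elsewhere.
type ToolHandler = (args: Record<string, unknown>) => Promise<string>;

const toolRegistry: Record<string, ToolHandler> = {};

async function dispatchToolCalls(response: LLMResponse): Promise<string[]> {
  if (!response.toolCalls) return [];

  return Promise.all(
    response.toolCalls.map(async (call) => {
      const handler = toolRegistry[call.function.name];
      if (!handler) {
        throw new Error(`Unknown tool: ${call.function.name}`);
      }
      // Arguments arrive as a JSON string, per the LLMResponse interface above.
      const args = JSON.parse(call.function.arguments) as Record<string, unknown>;
      return handler(args);
    })
  );
}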
Usage
Basic Usage
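A hedged example of basic usage, assuming the createLLMProvider factory sketched above:

const provider = createLLMProvider('OPENAI');

const response = await provider.generateResponse(
  [{ role: 'system', content: 'You are a helpful assistant.' }],
  'Summarize the latest release notes.'
);

console.log(response.content);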
Model Selection
Error Handling
Best Practices
Provider Selection
Error Handling
Performance
Related Documentation