Grok

Overview

Grok integration uses the OpenAI SDK with a custom base URL pointing at xAI's API. It supports the grok-2-latest model and function calling.

Implementation

Provider Setup

import OpenAI from 'openai';

class GrokProvider implements LLMProvider {
  private client: OpenAI;

  constructor() {
    // xAI exposes an OpenAI-compatible API, so the standard SDK works
    // once the base URL points at api.x.ai.
    this.client = new OpenAI({
      apiKey: process.env.XAI_API_KEY,
      baseURL: 'https://api.x.ai/v1',
    });
  }
}

Message Generation

async generateResponse(
  messages: Array<{ role: string; content: string }>,
  cleanContent: string
): Promise<LLMResponse> {
  // Narrow the loosely typed roles to the values the SDK accepts.
  const formattedMessages = messages.map(msg => ({
    role: msg.role as 'user' | 'assistant' | 'system',
    content: msg.content
  }));

  const response = await this.client.chat.completions.create({
    model: 'grok-2-latest',
    messages: formattedMessages,
    tools: [/* function definitions */]
  });

  // Map the completion into the provider-agnostic LLMResponse shape;
  // the exact fields depend on how LLMResponse is defined in this codebase.
  const choice = response.choices[0];
  return {
    content: choice.message.content ?? '',
    toolCalls: choice.message.tool_calls
  } as LLMResponse;
}
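
The LLMProvider and LLMResponse types are not shown here. A minimal sketch of the shapes the snippets above assume is below; the actual interfaces in the codebase may carry additional fields.

// Hypothetical shapes assumed by the snippets above.
interface LLMResponse {
  content: string;
  toolCalls?: unknown[];
}

interface LLMProvider {
  generateResponse(
    messages: Array<{ role: string; content: string }>,
    cleanContent: string
  ): Promise<LLMResponse>;
}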

Features

Supported Capabilities

  • OpenAI-compatible API

  • Function calling support

  • Role-based messaging

  • Custom base URL
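
Role-based messaging works the same way as with OpenAI: a system prompt followed by the conversation history. A minimal sketch, assuming an async context and the GrokProvider from the Implementation section:

// Illustrative call: system + user roles passed to the provider above.
const provider = new GrokProvider();

const reply = await provider.generateResponse(
  [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'What can you do?' }
  ],
  '' // cleanContent, as supplied by the provider's calling code
);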

Function Tools

  • Image generation

  • Weather data

  • Market data

  • Time information
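
The tools array elided in the Message Generation snippet holds standard OpenAI function definitions. A hedged sketch of what a single entry (here, a weather tool) might look like; the actual tool names and schemas live in the codebase:

// Illustrative only: one tool entry in the OpenAI function-calling format.
const weatherTool = {
  type: 'function' as const,
  function: {
    name: 'get_weather',
    description: 'Fetch current weather for a location',
    parameters: {
      type: 'object',
      properties: {
        location: { type: 'string', description: 'City name' }
      },
      required: ['location']
    }
  }
};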

Configuration

Environment Setup

XAI_API_KEY=your_api_key
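
Since the constructor reads XAI_API_KEY directly from the environment, a guard at startup avoids confusing authentication errors later. A minimal sketch:

// Fail fast if the key is missing rather than letting the first
// request fail with a 401 from the API.
if (!process.env.XAI_API_KEY) {
  throw new Error('XAI_API_KEY is not set');
}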

API Configuration

  • Base URL: https://api.x.ai/v1

  • Model: grok-2-latest

  • OpenAI SDK compatibility

Best Practices

Setup

  • Secure API key management

  • Base URL configuration

  • Model version selection

  • Error handling
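
For error handling, the OpenAI SDK raises APIError for non-2xx responses, so API failures can be separated from programming errors. A sketch, assuming an async context and the GrokProvider defined above:

import OpenAI from 'openai';

// Sketch: wrap the provider call and report API failures cleanly.
const provider = new GrokProvider();

try {
  const reply = await provider.generateResponse(
    [{ role: 'user', content: 'ping' }],
    ''
  );
  console.log(reply.content);
} catch (err) {
  if (err instanceof OpenAI.APIError) {
    console.error(`xAI request failed (${err.status}): ${err.message}`);
  } else {
    throw err;
  }
}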

Message Processing

  • Role validation

  • Content formatting

  • Tool configuration

  • Response handling
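
For response handling in particular, the model may return tool calls instead of (or alongside) text, so the completion's message should be inspected before use. A rough sketch of that branch; dispatchTool here is a hypothetical helper that routes to the image, weather, market, and time tools listed above:

import type OpenAI from 'openai';

// Illustrative only: inspect a raw chat completion, dispatch any tool
// calls, and return whatever plain text the model produced.
async function handleCompletion(
  response: OpenAI.Chat.Completions.ChatCompletion,
  dispatchTool: (name: string, args: unknown) => Promise<void>
): Promise<string> {
  const message = response.choices[0].message;

  for (const call of message.tool_calls ?? []) {
    if (call.type === 'function') {
      await dispatchTool(call.function.name, JSON.parse(call.function.arguments));
    }
  }

  return message.content ?? '';
}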
