# ChatClient API

`ChatClient` is the main entry point for interacting with LLM providers.
## Constructor
```typescript
new ChatClient(config: ChatConfig)
```

**Parameters:**

- `config.provider`: The LLM provider (`'openai'` | `'anthropic'` | `'google'` | `'ollama'`)
- `config.model`: The specific model to use (e.g., `'gpt-4o'`, `'claude-3-5-sonnet'`)
- `config.apiKey?`: API key (optional; falls back to an environment variable)
- `config.auth?`: Authentication configuration (recommended over `apiKey`)
- `config.baseUrl?`: Custom base URL for API requests
- `config.conversationId?`: Optional conversation ID for tracking
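Putting the parameter list together, `ChatConfig` might look like the sketch below. This is an inference from the descriptions above, not the library's actual type definition; in particular, the `AuthConfig` shape is not documented here and is left as a placeholder:

```typescript
// Sketch of the ChatConfig shape implied by the parameter list above.
type Provider = 'openai' | 'anthropic' | 'google' | 'ollama';
type AuthConfig = Record<string, unknown>; // placeholder: actual shape undocumented

interface ChatConfig {
  provider: Provider;
  model: string;
  apiKey?: string;        // optional; falls back to an environment variable
  auth?: AuthConfig;      // recommended over apiKey
  baseUrl?: string;
  conversationId?: string;
}

// A minimal valid config: only provider and model are required.
const config: ChatConfig = {
  provider: 'anthropic',
  model: 'claude-3-5-sonnet',
};
```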
## Methods

### chat(message, options?)
Send a chat message and get a response.
**Parameters:**

- `message`: `string` - The user message to send
- `options?`: `ChatOptions` - Optional chat configuration

**Returns:** `Promise<ChatResponse>`
```typescript
const response = await client.chat('Hello!', { temperature: 0.7 });
console.log(response.content);
```

### stream(message, options?)
Stream a chat response for real-time output.
**Parameters:**

- `message`: `string`
- `options?`: `ChatOptions`

**Returns:** `AsyncIterator<StreamChunk>`
```typescript
for await (const chunk of client.stream('Tell me a story')) {
  process.stdout.write(chunk.content);
}
```

### getHistory()
Retrieve the full conversation history for the current conversation.
**Returns:** `Promise<Message[]>`

### clearHistory()
Clear the conversation history for the current conversation.
**Returns:** `Promise<void>`
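To make the history round-trip concrete, here is a minimal in-memory stand-in mirroring the two method signatures above. The `Message` fields (`role`, `content`) and the `add` helper are assumptions for illustration only; the real client populates history itself as you call `chat` or `stream`, tracked per `conversationId`:

```typescript
// Hypothetical Message shape; the real fields are not documented in this section.
interface Message {
  role: 'user' | 'assistant';
  content: string;
}

// In-memory stand-in with the same getHistory()/clearHistory() semantics.
class InMemoryHistory {
  private messages: Message[] = [];

  // Illustrative helper, not part of the ChatClient API.
  async add(message: Message): Promise<void> {
    this.messages.push(message);
  }

  async getHistory(): Promise<Message[]> {
    return [...this.messages]; // return a copy so callers cannot mutate internal state
  }

  async clearHistory(): Promise<void> {
    this.messages = [];
  }
}
```

With the real client the same pattern applies: call `getHistory()` to inspect prior messages and `clearHistory()` to start the conversation fresh.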
