Seekr provides AI-powered solutions for structured, explainable, and transparent AI interactions. This guide gives a quick overview of getting started with Seekr chat models. For detailed documentation of all ChatSeekrFlow features and configurations, head to the API reference.
## Overview
The `ChatSeekrFlow` class wraps a chat model endpoint hosted on SeekrFlow, enabling seamless integration with LangChain applications.
### Integration Details
| Class | Package | Local | Serializable |
|---|---|---|---|
| ChatSeekrFlow | seekrai | ❌ | beta |
### Model Features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ❌ |
### Supported Methods
`ChatSeekrFlow` supports all `ChatModel` methods except the async APIs.
### Endpoint Requirements
The serving endpoint that `ChatSeekrFlow` wraps must use an OpenAI-compatible chat input/output format. It can be used for:
- Fine-tuned Seekr models
- Custom SeekrFlow models
- RAG-enabled models using Seekr’s retrieval system
Async support (`AsyncChatSeekrFlow`) is coming soon.
## Getting Started with ChatSeekrFlow in LangChain
This notebook covers how to use SeekrFlow as a chat model in LangChain.

### Setup
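The dependencies can be installed from PyPI. A minimal sketch, using the package names that appear in the integration table and the API reference below:

```shell
pip install -U langchain-seekrflow seekrai
```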
Ensure you have the necessary dependencies installed before continuing.

### API Key Setup
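A common pattern is to read the key from the environment and prompt for it only when it is missing. A sketch (the variable name `SEEKR_API_KEY` is an assumption; adjust it to whatever your setup expects):

```python
import os
import getpass


def ensure_api_key(var: str = "SEEKR_API_KEY") -> str:
    """Return the API key from the environment, prompting once if it is unset.

    The variable name is an assumed convention, not an official one.
    """
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"Enter your {var}: ")
    return os.environ[var]
```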
You'll need to set your API key as an environment variable to authenticate requests, or assign it manually in code before running queries.

### Instantiation
### Invocation
### Chaining
### Error Handling & Debugging
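When debugging, it helps to wrap calls so that transport or authentication failures surface as readable messages rather than raw tracebacks. A sketch (catching `Exception` broadly for portability; narrow this to the `seekrai` SDK's specific exception types in real code):

```python
def safe_invoke(llm, prompt: str) -> str:
    """Invoke the model, returning a readable error string instead of raising."""
    try:
        return llm.invoke(prompt).content
    except Exception as err:  # narrow to the SDK's exception types in practice
        return f"SeekrFlow request failed: {type(err).__name__}: {err}"
```

For deeper inspection, LangChain's global debug flag (`from langchain_core.globals import set_debug; set_debug(True)`) logs the full request/response cycle.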
## API reference
- `ChatSeekrFlow` class: `langchain_seekrflow.ChatSeekrFlow`
- PyPI package: `langchain-seekrflow`