# Open Bedrock Server

A unified, provider-agnostic chat completions API server supporting OpenAI and AWS Bedrock.
Find this useful? Star the repo to follow updates and show support!
Install from PyPI with `pip install open-bedrock-server` and run `bedrock-chat serve`. See the CLI Reference for all options.
## Quick Start

### Installation

```bash
# From PyPI
pip install open-bedrock-server

# Or from source
git clone https://github.com/teabranch/open-bedrock-server.git
cd open-bedrock-server
uv pip install -e .
```
### Configure

```bash
bedrock-chat config set
```

Or set environment variables:

```bash
export OPENAI_API_KEY=sk-your-key
export AWS_PROFILE=your-profile
export API_KEY=your-server-auth-key
```
### Run

```bash
bedrock-chat serve --host 0.0.0.0 --port 8000
```

Verify the server is up:

```bash
curl http://localhost:8000/health
```
### Basic Usage

```bash
curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": false
  }'
```
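The same request can be made from Python. The sketch below builds the request payload and posts it with the standard library; the base URL and API key are placeholders for your own deployment, and the final call is commented out because it requires a running server.

```python
import json
import urllib.request

# Placeholders: point these at your running server and its configured API key.
BASE_URL = "http://localhost:8000"
API_KEY = "your-api-key"

# An OpenAI-style chat completion request, identical in shape to the curl example.
payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}

def chat(body: dict) -> dict:
    """POST the request body to /v1/chat/completions and return the parsed JSON."""
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# response = chat(payload)  # requires the server started with `bedrock-chat serve`
```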
## Unified Endpoint

The `/v1/chat/completions` endpoint is the only endpoint you need. It:
- Auto-detects your input format (OpenAI, Bedrock Claude, Bedrock Titan)
- Routes to the appropriate provider based on model ID
- Converts between formats as needed
- Streams responses in real-time when requested
### Format Combinations
| Input Format | Output Format | Use Case | Streaming |
|---|---|---|---|
| OpenAI | OpenAI | Standard OpenAI usage | Yes |
| OpenAI | Bedrock Claude | OpenAI clients to Bedrock response | Yes |
| OpenAI | Bedrock Titan | OpenAI clients to Titan response | Yes |
| Bedrock Claude | OpenAI | Bedrock clients to OpenAI response | Yes |
| Bedrock Claude | Bedrock Claude | Claude format preserved | Yes |
| Bedrock Titan | OpenAI | Titan clients to OpenAI response | Yes |
| Bedrock Titan | Bedrock Titan | Titan format preserved | Yes |
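Because routing is keyed on the model ID, switching providers is just a matter of changing the `model` field; the request shape stays the same. A minimal sketch, where the Claude model ID is an illustrative example of a Bedrock identifier (use whichever model your AWS account enables):

```python
# Same OpenAI-format request body, two different providers.
openai_request = {
    "model": "gpt-4o-mini",  # routed to OpenAI
    "messages": [{"role": "user", "content": "Summarize our meeting notes."}],
}

bedrock_request = {
    # Illustrative Bedrock model ID; the server routes this to AWS Bedrock.
    "model": "anthropic.claude-3-haiku-20240307-v1:0",
    "messages": [{"role": "user", "content": "Summarize our meeting notes."}],
    "stream": True,  # streaming is supported for every combination in the table
}

# Only the "model" value differs; the server selects the provider from it.
assert openai_request["messages"] == bedrock_request["messages"]
```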
## Documentation

### Core
- Getting Started - Installation, setup, and first steps
- API Reference - Complete API documentation
- CLI Reference - Command-line interface guide
### Advanced Features
- Knowledge Bases (RAG) - Bedrock Knowledge Bases integration
- Files API - File upload and management capabilities
### Guides
- Usage Guide - Programming examples and use cases
- AWS Authentication - AWS credential configuration
- Architecture - System design and architecture
- Core Components - Detailed component documentation
- Testing - Testing strategies and coverage
- Packaging - Building and distributing the package
### Development
- Development Guide - Extending and customizing the server
- Test Suite - Test suite organization
- Real API Testing - Real API integration tests
## Key Features

### Unified Interface

- Single Endpoint: `/v1/chat/completions` handles everything
- Auto-Detection: Intelligent format detection
- Model-Based Routing: Automatic provider selection
- Format Conversion: Seamless translation between formats
### File Query System

- File Upload: Upload documents to S3 storage with an OpenAI-compatible API
- Smart Processing: Automatic content extraction from CSV, JSON, HTML, XML, Markdown, and text files
- Chat Integration: Use the `file_ids` parameter to include file content as context in conversations
- File Management: Complete CRUD operations for uploaded files
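A sketch of chatting over an uploaded file, assuming a file has already been uploaded through the Files API; the file ID below is a hypothetical example of the kind of identifier an upload returns:

```python
# Chat completion request that pulls an uploaded file into context via the
# `file_ids` parameter. "file-abc123" is a hypothetical ID standing in for
# the identifier returned when the document was uploaded.
request = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "What does this report conclude?"}],
    "file_ids": ["file-abc123"],  # hypothetical ID from a prior upload
}
```

The server extracts the file's content (CSV, JSON, HTML, XML, Markdown, or plain text) and includes it as context for the conversation.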
### Enterprise Ready
- Authentication: Secure API key-based authentication
- Streaming: Real-time response streaming
- Error Handling: Comprehensive error management
- Monitoring: Request/response logging and health checks
### Developer Friendly
- CLI Tools: Interactive chat, configuration, and server management
- OpenAI Compatible: Drop-in replacement for OpenAI Chat Completions API
- Extensible: Easy to add new providers and formats
- Well Tested: Comprehensive test coverage
## License
This project is licensed under the MIT License - see the LICENSE file for details.