Open Bedrock Server

A unified, provider-agnostic chat completions API server supporting OpenAI and AWS Bedrock.

Find this useful? Star the repo to follow updates and show support!

Install from PyPI with pip install open-bedrock-server, then run bedrock-chat serve. See CLI Reference for all options.


Quick Start

Installation

# From PyPI
pip install open-bedrock-server

# Or from source
git clone https://github.com/teabranch/open-bedrock-server.git
cd open-bedrock-server
uv pip install -e .

Configure

bedrock-chat config set

Or set environment variables:

export OPENAI_API_KEY=sk-your-key
export AWS_PROFILE=your-profile
export API_KEY=your-server-auth-key

Run

bedrock-chat serve --host 0.0.0.0 --port 8000

Verify:

curl http://localhost:8000/health

Basic Usage

curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": false
  }'
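The same request can be built from Python. This is a minimal sketch mirroring the curl example above: the model ID and Bearer token are the placeholders from that example, not real credentials.

```python
import json

def build_chat_request(model, user_message, stream=False):
    """Return (headers, json_body) for a /v1/chat/completions call."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer your-api-key",  # placeholder server auth key
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,
    }
    return headers, json.dumps(body)

headers, body = build_chat_request("gpt-4o-mini", "Hello!")
# Send with any HTTP client, e.g.:
#   requests.post("http://localhost:8000/v1/chat/completions",
#                 headers=headers, data=body)
print(body)
```

Because the server is OpenAI-compatible, the official OpenAI SDK also works when pointed at http://localhost:8000/v1 as its base URL.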

Unified Endpoint

The /v1/chat/completions endpoint is the only endpoint you need. It:

  1. Auto-detects your input format (OpenAI, Bedrock Claude, Bedrock Titan)
  2. Routes to the appropriate provider based on model ID
  3. Converts between formats as needed
  4. Streams responses in real-time when requested
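Step 1 can be pictured with a small sketch. This is not the server's actual detection code; it is an illustration based on each provider's native request schema (Titan requests use a top-level inputText field, Bedrock Claude requests carry anthropic_version, and OpenAI requests pair model with messages).

```python
def detect_format(payload: dict) -> str:
    """Guess a chat request's source format from its distinctive fields."""
    if "inputText" in payload:            # Bedrock Titan native request
        return "bedrock-titan"
    if "anthropic_version" in payload:    # Bedrock Claude native request
        return "bedrock-claude"
    if "model" in payload and "messages" in payload:
        return "openai"                   # OpenAI Chat Completions request
    raise ValueError("unrecognized chat request format")
```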

Format Combinations

| Input Format   | Output Format  | Use Case                           | Streaming |
|----------------|----------------|------------------------------------|-----------|
| OpenAI         | OpenAI         | Standard OpenAI usage              | Yes       |
| OpenAI         | Bedrock Claude | OpenAI clients to Bedrock response | Yes       |
| OpenAI         | Bedrock Titan  | OpenAI clients to Titan response   | Yes       |
| Bedrock Claude | OpenAI         | Bedrock clients to OpenAI response | Yes       |
| Bedrock Claude | Bedrock Claude | Claude format preserved            | Yes       |
| Bedrock Titan  | OpenAI         | Titan clients to OpenAI response   | Yes       |
| Bedrock Titan  | Bedrock Titan  | Titan format preserved             | Yes       |
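One conversion path from the table, OpenAI input to a Titan-style body, can be sketched as below. The Titan field names (inputText, textGenerationConfig, maxTokenCount) follow Titan's documented request schema, but the mapping itself is illustrative; the server's internal conversion may differ.

```python
def openai_to_titan(openai_body: dict) -> dict:
    """Flatten an OpenAI-style message list into a Titan-style request."""
    prompt = "\n".join(
        f"{m['role']}: {m['content']}" for m in openai_body["messages"]
    )
    return {
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": openai_body.get("max_tokens", 512),
            "temperature": openai_body.get("temperature", 0.7),
        },
    }
```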

Documentation

Core

Advanced Features

Guides

Development


Key Features

Unified Interface

  • Single Endpoint: /v1/chat/completions handles everything
  • Auto-Detection: Intelligent format detection
  • Model-Based Routing: Automatic provider selection
  • Format Conversion: Seamless translation between formats

File Query System

  • File Upload: Upload documents to S3 storage with OpenAI-compatible API
  • Smart Processing: Automatic content extraction from CSV, JSON, HTML, XML, Markdown, and text files
  • Chat Integration: Use file_ids parameter to include file content as context in conversations
  • File Management: Complete CRUD operations for uploaded files
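A request using the file_ids parameter might look like the sketch below. The parameter name comes from the docs above; the file ID value is a placeholder for an ID returned by the file upload API.

```python
import json

# Hypothetical chat request that attaches an uploaded file as context.
request_body = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "user", "content": "Summarize the attached report."}
    ],
    "file_ids": ["file-abc123"],  # placeholder ID from a prior upload
}
print(json.dumps(request_body, indent=2))
```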

Enterprise Ready

  • Authentication: Secure API key-based authentication
  • Streaming: Real-time response streaming
  • Error Handling: Comprehensive error management
  • Monitoring: Request/response logging and health checks

Developer Friendly

  • CLI Tools: Interactive chat, configuration, and server management
  • OpenAI Compatible: Drop-in replacement for OpenAI Chat Completions API
  • Extensible: Easy to add new providers and formats
  • Well Tested: Comprehensive test coverage


License

This project is licensed under the MIT License - see the LICENSE file for details.


Open Bedrock Server is an open-source project licensed under MIT. Not affiliated with or endorsed by OpenAI or AWS.
