6 docs tagged with "user-guide"

Configuration

COO-LLM uses YAML configuration files for all settings. The configuration is hierarchical and supports environment variable substitution, validation, and hot-reload.
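As a rough illustration of what such a hierarchical YAML configuration with environment variable substitution might look like (the field names below are hypothetical, not taken from COO-LLM's actual schema):

```yaml
# Hypothetical sketch — key names are illustrative only.
server:
  listen: ":8080"

providers:
  - name: openai-primary
    api_key: ${OPENAI_API_KEY}   # substituted from the environment at load time
```

See the Configuration doc itself for the real schema, validation rules, and hot-reload behavior.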

COO-LLM Overview

Learn about COO-LLM, an intelligent load balancer and reverse proxy for Large Language Model APIs with OpenAI compatibility.

Deployment

This guide covers deploying COO-LLM in various environments, from development to production.

Examples

Collection of code examples for common use cases with COO-LLM.

Practical Usage Guide

Practical guide to implementing COO-LLM in real applications, with tips and best practices.

Providers

COO-LLM supports multiple LLM providers through a plugin-based architecture. Each provider implements a common interface for seamless integration.
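A plugin architecture like this typically centers on one abstract interface that every provider implements. The sketch below is a minimal illustration of that pattern in Python; the class and method names are assumptions for illustration, not COO-LLM's actual API.

```python
from abc import ABC, abstractmethod

class Provider(ABC):
    """Hypothetical common interface each LLM provider plugin implements."""

    @abstractmethod
    def chat_completion(self, request: dict) -> dict:
        """Forward an OpenAI-style chat request and return the response."""

class EchoProvider(Provider):
    """Toy provider that echoes the last user message back."""

    def chat_completion(self, request: dict) -> dict:
        text = request["messages"][-1]["content"]
        return {"choices": [{"message": {"role": "assistant", "content": text}}]}

# The proxy can then route any OpenAI-compatible request to any registered provider:
resp = EchoProvider().chat_completion(
    {"messages": [{"role": "user", "content": "hello"}]}
)
print(resp["choices"][0]["message"]["content"])  # hello
```

Because every plugin satisfies the same interface, the load balancer can swap backends without callers noticing.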