OSSS.ai.exceptions.llm_errors


LLM-specific exceptions for OSSS.

This module defines exceptions related to Language Model interactions, API failures, and LLM provider-specific error conditions with intelligent retry policies and circuit breaker support.

LLMError

Bases: OSSSError

Base exception for LLM-related failures.

Represents errors that occur during interactions with Language Model providers, designed for intelligent retry and circuit breaker patterns.
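The hierarchy can be sketched as follows. `OSSSError` is defined elsewhere in OSSS, so a stand-in base class is used here, and the subclass set shown is illustrative rather than exhaustive:

```python
class OSSSError(Exception):
    """Stand-in for the real OSSS base exception."""

class LLMError(OSSSError):
    """Base exception for LLM-related failures."""

class LLMRateLimitError(LLMError):
    """Temporary rate limiting; safe to retry with backoff."""

class LLMQuotaError(LLMError):
    """Quota exhausted; requires manual intervention, do not retry."""

# A single except clause catches the whole LLM error family:
try:
    raise LLMRateLimitError("429 Too Many Requests")
except LLMError as exc:
    caught = type(exc).__name__
```

Because every subclass derives from `LLMError`, callers can handle all provider failures uniformly or match specific subclasses when the retry policy differs.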

LLMQuotaError

Bases: LLMError

Exception raised when LLM API quota is exhausted.

Represents quota/billing limit exceeded errors that require manual intervention (billing update) and should not be retried.

get_user_message()

Get user-friendly error message with billing guidance.
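Since quota errors are not retryable, calling code typically surfaces the user message and stops. A minimal sketch; the classes and the `get_user_message` wording here are assumptions, not the actual OSSS implementation:

```python
class LLMError(Exception):
    pass

class LLMQuotaError(LLMError):
    def get_user_message(self) -> str:
        # Assumed wording; the real method lives in OSSS.ai.exceptions.llm_errors.
        return "LLM quota exhausted. Please check your billing settings."

def call_llm():
    # Hypothetical provider call that fails with a quota error.
    raise LLMQuotaError("insufficient_quota")

try:
    call_llm()
except LLMQuotaError as exc:
    # Surface the guidance to the user; do not retry.
    user_message = exc.get_user_message()
```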

LLMAuthError

Bases: LLMError

Exception raised when LLM API authentication fails.

Represents authentication failures due to invalid API keys, expired tokens, or insufficient permissions.

get_user_message()

Get user-friendly error message with API key guidance.

LLMRateLimitError

Bases: LLMError

Exception raised when LLM API rate limits are hit.

Represents temporary rate limiting; the failed request should be retried with exponential backoff, optionally under a circuit breaker.

get_user_message()

Get user-friendly error message with rate limit guidance.
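Exponential backoff for rate-limit errors can be sketched like this; the delays, attempt count, and fake provider are illustrative, not values taken from OSSS:

```python
import time

class LLMError(Exception):
    pass

class LLMRateLimitError(LLMError):
    pass

def with_backoff(call, max_attempts=4, base_delay=0.01):
    """Retry `call` on rate limits, doubling the delay each attempt."""
    for attempt in range(max_attempts):
        try:
            return call()
        except LLMRateLimitError:
            if attempt == max_attempts - 1:
                raise  # budget exhausted; propagate to the caller
            time.sleep(base_delay * (2 ** attempt))

# A fake provider that rate-limits the first two calls, then succeeds:
calls = {"n": 0}
def flaky_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise LLMRateLimitError("429")
    return "ok"

result = with_backoff(flaky_call)
```

Doubling the delay spreads retries out so a briefly throttled provider is not hammered; in production the loop would usually also add jitter.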

LLMTimeoutError

Bases: LLMError

Exception raised when LLM API requests time out.

Represents timeout failures that may be retried depending on the timeout duration and context.

get_user_message()

Get user-friendly error message with timeout guidance.
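One way a caller might decide whether a timeout is worth retrying is to compare the elapsed timeout against a retry budget. The `timeout_seconds` attribute and the policy below are assumptions for illustration, not part of the documented API:

```python
class LLMError(Exception):
    pass

class LLMTimeoutError(LLMError):
    def __init__(self, message, timeout_seconds=None):
        super().__init__(message)
        # Assumed attribute, for illustration only.
        self.timeout_seconds = timeout_seconds

def should_retry(exc, retry_budget_seconds=30.0):
    """Illustrative policy: retry only if another attempt fits in the budget."""
    if exc.timeout_seconds is None:
        return False
    return exc.timeout_seconds < retry_budget_seconds

decision = should_retry(LLMTimeoutError("request timed out", timeout_seconds=10.0))
```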

LLMContextLimitError

Bases: LLMError

Exception raised when LLM context/token limits are exceeded.

Represents input context that's too large for the model, requiring input reduction or chunking strategies.

get_user_message()

Get user-friendly error message with context limit guidance.
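A common reaction to a context-limit error is to split the input and process the chunks separately. A sketch with a fake model call that enforces a character limit (a stand-in for a real token limit):

```python
class LLMError(Exception):
    pass

class LLMContextLimitError(LLMError):
    pass

def summarize(text, limit=100):
    """Fake model call: rejects inputs over `limit` characters."""
    if len(text) > limit:
        raise LLMContextLimitError(f"input of {len(text)} chars exceeds {limit}")
    return text[:10]

def summarize_with_chunking(text, limit=100):
    """Try the whole input first; fall back to fixed-size chunks."""
    try:
        return [summarize(text, limit)]
    except LLMContextLimitError:
        chunks = [text[i:i + limit] for i in range(0, len(text), limit)]
        return [summarize(c, limit) for c in chunks]

parts = summarize_with_chunking("x" * 250)
```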

LLMModelNotFoundError

Bases: LLMError

Exception raised when the specified LLM model is not available.

Represents model availability issues, deprecation, or incorrect model names.

get_user_message()

Get user-friendly error message with model guidance.
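Model-availability errors lend themselves to a fallback chain: try the preferred model, then alternates. The model names and availability set below are hypothetical:

```python
class LLMError(Exception):
    pass

class LLMModelNotFoundError(LLMError):
    pass

AVAILABLE = {"model-b"}  # illustrative availability set

def complete(model, prompt):
    """Fake provider call that only knows the models in AVAILABLE."""
    if model not in AVAILABLE:
        raise LLMModelNotFoundError(f"model {model!r} not found")
    return f"{model}: {prompt}"

def complete_with_fallback(prompt, models=("model-a", "model-b")):
    """Try each model in order; re-raise the last error if all fail."""
    last = None
    for model in models:
        try:
            return complete(model, prompt)
        except LLMModelNotFoundError as exc:
            last = exc
    raise last

out = complete_with_fallback("hello")
```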

LLMServerError

Bases: LLMError

Exception raised when LLM provider has server-side issues.

Represents 5xx errors from the LLM provider that should trigger circuit breaker patterns and aggressive retries.

get_user_message()

Get user-friendly error message with server error guidance.
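The circuit breaker pattern mentioned above can be sketched as follows; the threshold and the breaker itself are illustrative, not the OSSS implementation:

```python
class LLMError(Exception):
    pass

class LLMServerError(LLMError):
    pass

class CircuitBreaker:
    """Illustrative breaker: opens after `threshold` consecutive server errors."""
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.failures = 0

    def call(self, fn):
        if self.failures >= self.threshold:
            raise RuntimeError("circuit open: skipping call")
        try:
            result = fn()
        except LLMServerError:
            self.failures += 1
            raise
        self.failures = 0  # any success closes the circuit again
        return result

breaker = CircuitBreaker(threshold=2)

def failing():
    raise LLMServerError("502 Bad Gateway")

opened = False
for _ in range(3):
    try:
        breaker.call(failing)
    except LLMServerError:
        pass
    except RuntimeError:
        opened = True
```

Once open, the breaker fails fast instead of sending more traffic to a struggling provider; real breakers also add a cool-down after which they probe the provider again.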

LLMValidationError

Bases: LLMError

Exception raised when LLM response validation fails.

Represents structured response parsing failures when using Pydantic AI or other structured output formats. These errors indicate the LLM returned content that doesn't match the expected schema.

get_user_message()

Get user-friendly error message with validation guidance.
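A schema-validation failure of this kind can be reproduced with a hand-rolled check; this stand-in uses plain `json` and a required-keys list rather than Pydantic AI, purely to show when the error would be raised:

```python
import json

class LLMError(Exception):
    pass

class LLMValidationError(LLMError):
    pass

def parse_structured(raw, required_keys=("name", "age")):
    """Validate that the model's JSON output matches the expected shape."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise LLMValidationError(f"response is not valid JSON: {exc}") from exc
    missing = [k for k in required_keys if k not in data]
    if missing:
        raise LLMValidationError(f"response missing keys: {missing}")
    return data

good = parse_structured('{"name": "Ada", "age": 36}')

validation_failed = False
try:
    parse_structured('{"name": "Ada"}')  # missing the "age" key
except LLMValidationError:
    validation_failed = True
```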