# Supported LLMs
ContextGem supports all LLM providers and models available through the LiteLLM integration. This means you can use models from major cloud providers like OpenAI, Anthropic, Google, and Azure, as well as run local models through providers like Ollama and LM Studio.
ContextGem works with both types of LLM architectures:

- Reasoning/CoT-capable models (e.g., `openai/o4-mini`, `ollama/deepseek-r1:32b`)
- Non-reasoning models (e.g., `openai/gpt-4.1`, `ollama/llama3.1:8b`)
For a complete list of supported providers, see the LiteLLM Providers documentation.
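To make the distinction above concrete, here is a small, hypothetical helper that maps a capability flag to one of the example model strings; the mapping and names are illustrative only, not part of the ContextGem API:

```python
# Hypothetical mapping that mirrors the example models above;
# "reasoning" / "non_reasoning" are illustrative labels, not API values.
EXAMPLE_MODELS = {
    "reasoning": "openai/o4-mini",
    "non_reasoning": "openai/gpt-4.1",
}

def example_model(needs_reasoning: bool) -> str:
    """Return an example model string for the requested capability."""
    return EXAMPLE_MODELS["reasoning" if needs_reasoning else "non_reasoning"]

print(example_model(True))   # openai/o4-mini
print(example_model(False))  # openai/gpt-4.1
```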
## ☁️ Cloud-based LLMs
You can initialize cloud-based LLMs by specifying the provider and model name in the format `<provider>/<model_name>`:
```python
from contextgem import DocumentLLM

# Pattern for using any cloud LLM provider
llm = DocumentLLM(
    model="<provider>/<model_name>",
    api_key="<api_key>",
)

# Example - Using OpenAI LLM
llm_openai = DocumentLLM(
    model="openai/gpt-4.1-mini",
    api_key="<api_key>",
    # see DocumentLLM API reference for all configuration options
)

# Example - Using Azure OpenAI LLM
llm_azure_openai = DocumentLLM(
    model="azure/o4-mini",
    api_key="<api_key>",
    api_version="<api_version>",
    api_base="<api_base>",
    # see DocumentLLM API reference for all configuration options
)
```
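Rather than hardcoding API keys, a common pattern is to read them from the environment. A minimal sketch, assuming `api_key` accepts a plain string and `OPENAI_API_KEY` is the variable name you chose:

```python
import os

def resolve_api_key(env_var: str = "OPENAI_API_KEY") -> str:
    """Read the API key from the environment, falling back to a placeholder."""
    return os.environ.get(env_var, "<api_key>")

# The returned string is what you would pass as api_key= when constructing
# DocumentLLM; the placeholder keeps this sketch runnable without a real key.
print(resolve_api_key())
```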
## 💻 Local LLMs
For local LLMs, you’ll need to specify the provider, model name, and the appropriate API base URL:
```python
from contextgem import DocumentLLM

local_llm = DocumentLLM(
    model="ollama/<model_name>",
    api_base="http://localhost:11434",  # Default Ollama endpoint
)

# Example - Using Llama 3.1 LLM via Ollama
llm_llama = DocumentLLM(
    model="ollama/llama3.1:8b",
    api_base="http://localhost:11434",
    # see DocumentLLM API reference for all configuration options
)

# Example - Using DeepSeek R1 reasoning model via Ollama
llm_deepseek = DocumentLLM(
    model="ollama/deepseek-r1:32b",
    api_base="http://localhost:11434",
    # see DocumentLLM API reference for all configuration options
)
```
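Local models fail at request time if the server is not running, so a quick pre-flight check on the endpoint can surface a clearer error. A minimal sketch, assuming the default Ollama port 11434 (the helper is a generic TCP check, not a ContextGem API):

```python
import socket

def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the default Ollama endpoint before constructing DocumentLLM;
# prints False if nothing is listening on localhost:11434.
print(endpoint_reachable("localhost", 11434))
```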
For a complete list of configuration options available when initializing DocumentLLM instances, see the next section Configuring LLM(s).