Models
Models serve as the core engine of Agents, and the Galadriel framework supports multiple LLM providers to offer flexibility and scalability. The following LLM providers are currently supported:
- LiteLLMModel
- HfApiModel
- TransformersModel
- AzureOpenAIServerModel
In the sections below, we provide details on each supported provider and, more importantly, how to use them.
LiteLLMModel
The LiteLLMModel leverages LiteLLM to support over 100 LLMs from various providers. Here’s how to use it:
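Below is a minimal sketch. It assumes the model classes are importable from the Galadriel package (shown here via galadriel.core_agent; adjust the import path to match your installation) and accept smolagents-style constructor arguments. The model ID, environment variable name, and temperature are placeholders:

```python
import os

from galadriel.core_agent import LiteLLMModel  # adjust import path to your installation

# Any LiteLLM-supported "provider/model" string works here, e.g. an Anthropic model.
model = LiteLLMModel(
    model_id="anthropic/claude-3-5-sonnet-latest",
    api_key=os.environ["ANTHROPIC_API_KEY"],  # key for the chosen provider
    temperature=0.2,
)
```

The resulting model object is what you would typically pass to your agent when constructing it.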
HfApiModel
This model interacts with Hugging Face’s Inference API. Here’s how to use it:
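A minimal sketch, under the same assumptions as above (import path and parameter names follow the smolagents-style API; the model ID and token variable are placeholders):

```python
import os

from galadriel.core_agent import HfApiModel  # adjust import path to your installation

# Uses Hugging Face's hosted Inference API, so no local GPU is required.
model = HfApiModel(
    model_id="Qwen/Qwen2.5-Coder-32B-Instruct",  # any model served by the Inference API
    token=os.environ["HF_TOKEN"],                # Hugging Face access token
)
```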
TransformersModel
The TransformersModel allows you to load and run Hugging Face models locally using the transformers library. Ensure that both transformers and torch are installed before use. A usage sketch follows.
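The sketch below assumes the same import path and a smolagents-style constructor; the model ID and generation settings are placeholders you should size to your hardware:

```python
from galadriel.core_agent import TransformersModel  # adjust import path to your installation

# Downloads the weights (if needed) and runs inference locally via transformers + torch.
model = TransformersModel(
    model_id="HuggingFaceTB/SmolLM2-1.7B-Instruct",  # any Hub model that fits in local memory
    device_map="auto",       # place weights automatically across available devices
    max_new_tokens=1024,     # cap on generated tokens per call
)
```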
AzureOpenAIServerModel
The AzureOpenAIServerModel enables integration with any Azure OpenAI deployment. Below is an example of how to configure and use it:
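As before, this is a hedged sketch: the import path, environment variable names, deployment name, and API version are placeholders to replace with your own Azure OpenAI configuration:

```python
import os

from galadriel.core_agent import AzureOpenAIServerModel  # adjust import path to your installation

model = AzureOpenAIServerModel(
    model_id="gpt-4o-mini",                              # name of your Azure OpenAI deployment
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com/
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-10-21",                            # use a version your deployment supports
)
```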
Conclusion
Galadriel provides seamless integration with multiple LLM providers, allowing you to choose the best model for your needs—whether it’s via LiteLLM for broad model access, Hugging Face’s API for hosted inference, local execution with Transformers, or Azure OpenAI for enterprise deployments. By leveraging these options, you can build powerful, flexible AI agents tailored to your specific requirements.