Supported AI Models on Agent Platform¶
The Agent Platform offers flexible orchestration and runtime support for a wide range of AI models from leading providers. Whether you're building real-time assistants, multi-agent systems, or background workflows, you can choose from Platform-hosted open-source models, externally integrated provider models, or custom third-party models to meet your needs. Some advanced capabilities, such as real-time voice interaction, are available only with specific models.
Supported AI Models for Agents and Supervisors¶
External Models¶
Agentic Apps supports Agent and Supervisor orchestration with the most widely used models from providers such as OpenAI, Azure OpenAI, Anthropic, and Google.
Model Provider | Model Variant |
---|---|
OpenAI | |
Azure OpenAI | |
Anthropic | |
Google | |
Models with Real-Time Voice Support¶
While Agent and Supervisor orchestration is supported for most commonly used LLMs, Real-Time Voice is currently available only with select models that offer advanced voice capabilities.
Model Provider | Model Variant |
---|---|
OpenAI | |
Google Gemini | |
Custom Model Support in Agentic Apps¶
Agentic Apps also supports integrating custom third-party models via API Integration.
To use a custom model in Agentic Apps:
- The model must support Tool Calling functionality.
- The model's request and response structure must follow the API reference of either Anthropic (Messages) or OpenAI (Chat Completions).
A minimal sketch of the expected request and response shape is shown below. For detailed steps, see Add an External Model Using API Integration.
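For reference, here is a minimal sketch of an OpenAI Chat Completions-style request with a tool definition, which is the kind of contract a conforming custom model endpoint would need to accept and answer. The endpoint URL, model name, and API-key variable are illustrative placeholders, not platform values; an Anthropic Messages-style contract is equally acceptable.

```python
# Sketch of the Chat Completions request/response shape a custom model
# endpoint is expected to handle. URL, model name, and credential are
# hypothetical placeholders for illustration only.
import os
import requests

CUSTOM_MODEL_URL = "https://models.example.com/v1/chat/completions"  # hypothetical endpoint
API_KEY = os.environ.get("CUSTOM_MODEL_API_KEY", "")                 # hypothetical credential

payload = {
    "model": "my-custom-model",  # placeholder model identifier
    "messages": [
        {"role": "user", "content": "What is the weather in Paris?"}
    ],
    # Tool Calling support is required: the endpoint must accept a `tools`
    # array and return `tool_calls` in the assistant message when the model
    # decides to invoke one.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

response = requests.post(
    CUSTOM_MODEL_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()

# A conforming response mirrors the Chat Completions schema, e.g. it exposes
# choices[0].message.tool_calls when the model chooses to call a tool.
message = response.json()["choices"][0]["message"]
print(message.get("tool_calls") or message.get("content"))
```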
Supported Open-Source Models¶
Agent Platform currently supports over thirty open-source models and provides them as a service. Platform-hosted models can be optimized prior to deployment, making them ideal for private environments or specialized applications.
The supported models and their variants are given below:
Model Provider | Model Variant |
---|---|
Amazon | |
Argilla | |
DeepSeek | |
EleutherAI | |
Facebook | |
Google | |
Helsinki-NLP | |
HuggingFaceH4 | |
Meta-llama | |
Microsoft | |
Mistralai | |
OpenAI | |
OpenAI Community | |
Stable Diffusion | |
T5 | |
Tiiuae | |
Xiaomi | |
Supported Models for Structured Output¶
Kore-hosted open-source models can produce structured JSON responses, making outputs consistent and easy to parse.
- Structured output depends on the optimization technique used: it is supported only for models deployed with vLLM optimization or with no optimization.
- Models optimized with CT2, fine-tuned models, Hugging Face imports, and locally imported models are not supported.
The following table lists the models that support structured JSON output; a sample request sketch follows the table:
Model Name | vLLM | No Optimization |
---|---|---|
amazon/MistralLite | ❌ | ✅ |
argilla/notus-7b-v1 | ❌ | ✅ |
EleutherAI/gpt-j-6b | ❌ | ✅ |
facebook/opt-1.3b | ✅ | ✅ |
facebook/opt-2.7b | ✅ | ✅ |
facebook/opt-350m | ✅ | ✅ |
facebook/opt-6.7b | ✅ | ✅ |
google/gemma-2b | ❌ | ✅ |
google/gemma-2b-it | ❌ | ✅ |
google/gemma-7b | ❌ | ✅ |
google/gemma-7b-it | ❌ | ✅ |
HuggingFaceH4/zephyr-7b-alpha | ❌ | ✅ |
HuggingFaceH4/zephyr-7b-beta | ❌ | ✅ |
meta-llama/Llama-2-7b-chat-hf | ❌ | ❌ |
meta-llama/Llama-2-7b-hf | ❌ | ✅ |
meta-llama/Llama-3.2-1B | ✅ | ❌ |
meta-llama/Llama-3.2-1B-Instruct | ✅ | ❌ |
meta-llama/Llama-3.2-3B | ✅ | ❌ |
meta-llama/Llama-3.2-3B-Instruct | ✅ | ❌ |
meta-llama/Meta-Llama-3-8B | ✅ | ✅ |
meta-llama/Meta-Llama-3-8B-Instruct | ❌ | ✅ |
meta-llama/Meta-Llama-3.1-8B | ✅ | ✅ |
meta-llama/Meta-Llama-3.1-8B-Instruct | ❌ | ✅ |
microsoft/Phi-3-medium-128k-instruct | ❌ | ✅ |
microsoft/Phi-3-medium-4k-instruct | ❌ | ✅ |
microsoft/Phi-3-mini-128k-instruct | ✅ | ✅ |
microsoft/Phi-3-mini-4k-instruct | ❌ | ✅ |
microsoft/phi-1 | ❌ | ✅ |
microsoft/phi-1_5 | ❌ | ✅ |
microsoft/phi-2 | ✅ | ✅ |
mistralai/Mistral-7B-Instruct-v0.1 | ❌ | ✅ |
mistralai/Mistral-7B-Instruct-v0.2 | ❌ | ✅ |
mistralai/Mistral-7B-Instruct-v0.3 | ✅ | ✅ |
mistralai/Mistral-7B-v0.1 | ❌ | ✅ |
openai-community/gpt2-large | ❌ | ✅ |
openai-community/gpt2-medium | ❌ | ✅ |
openai-community/gpt2-xl | ❌ | ✅ |
tiiuae/falcon-7b | ❌ | ✅ |
tiiuae/falcon-7b-instruct | ❌ | ✅ |
tiiuae/falcon-rw-1b | ✅ | ✅ |
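As a rough illustration, the sketch below shows how a structured JSON response might be requested, assuming the deployment exposes an OpenAI-compatible chat completions endpoint (as vLLM-based deployments commonly do). The endpoint URL, credential variable, and the `response_format` parameter are assumptions for illustration only; refer to your deployment's invocation details for the actual contract.

```python
# Sketch of requesting structured JSON from a hosted open-source model via an
# assumed OpenAI-compatible endpoint. URL and credential are hypothetical.
import json
import os
import requests

DEPLOYMENT_URL = "https://platform.example.com/v1/chat/completions"  # hypothetical endpoint
API_KEY = os.environ.get("PLATFORM_API_KEY", "")                     # hypothetical credential

payload = {
    "model": "mistralai/Mistral-7B-Instruct-v0.3",  # a model listed above with vLLM support
    "messages": [
        {
            "role": "user",
            "content": "Extract the product and quantity from: 'Order two laptops.' "
                       "Respond as JSON with keys 'product' and 'quantity'.",
        }
    ],
    # Constrains the model to emit valid JSON so the output is easy to parse.
    "response_format": {"type": "json_object"},
}

response = requests.post(
    DEPLOYMENT_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()

# Because the output is constrained to JSON, it can be parsed directly.
structured = json.loads(response.json()["choices"][0]["message"]["content"])
print(structured)
```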
Supported External Models for Easy Integration¶
With Easy Integration, you can quickly connect to external model providers such as OpenAI, Anthropic, Google, Cohere, and Amazon Bedrock. No infrastructure setup is needed: just authenticate and start deploying models within flows, tools, or agents.
The following external models are supported in Agent Platform through Easy Integration:
Model Provider | Model Variant |
---|---|
Anthropic | |
Azure OpenAI | |
Cohere | |
Google | |
OpenAI | |