Add an External Model using Easy Integration¶
Easily integrate models from popular providers like OpenAI, Anthropic, Google, Cohere, and Amazon Bedrock using the Easy Integration option in Agent Platform.
Integrate a Model from Anthropic¶
Steps to add the Anthropic Claude-V1 model using easy integration:
1. Click Models on the top navigation bar of the application. The Models page is displayed.
2. Click the External models tab on the Models page.
3. Click Add a model under the External models tab. The Add an external model dialog is displayed.
4. Select the Easy integration option to integrate models from OpenAI, Anthropic, Google, or Cohere, and click Next.
5. Select Anthropic as the provider and click Next. A pop-up listing all the Anthropic models supported in Agent Platform is displayed. For more information on the supported external models, see Supported models.
6. Select the required model from the options listed and click Next.
7. Enter the API key you received from the provider in the API key field and click Confirm to start the integration.
The model is integrated and listed in the External models list.
Note
- Click the three-dot icon next to the Model name in the list of external models to edit or delete the model.
- Use the Inference toggle next to the Model name to control whether the model can be used. When the toggle is ON, the model is available for inference across Agent Platform; when it is OFF, it cannot be inferred anywhere. For example, if you turn the toggle OFF, the playground displays an error that the model is not active, even though the model still appears in the External models tab.
Integrate a Model from Amazon Bedrock¶
You can easily connect Amazon Bedrock models to the Agent Platform using a guided setup flow. This process enables secure role-based access using your own AWS credentials.
Important
Customers must create an IAM role in their AWS account with the necessary permissions (for example, access to the Amazon Bedrock APIs). This role must include a trust policy that allows the Kore Agent Platform's AWS principal (or a designated IAM role in a Kore AWS account) to assume it. For more information, see Configuring Amazon Bedrock models.
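As a sketch, the trust policy on that role might look like the following. The account ID and external ID shown are placeholders, not real values; use the principal and external ID supplied during the Agent Platform setup flow:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<KORE_ACCOUNT_ID>:root" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<EXTERNAL_ID>" }
      }
    }
  ]
}
```

The sts:ExternalId condition is a common safeguard for cross-account role assumption; include it only if an external ID is provided during setup.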
Steps to add Amazon Bedrock models using easy integration:
1. Start the Integration
- Click Models in the top navigation bar of the application.
- Go to the External Models tab and click Add a model.
- Select Easy integration > AWS Bedrock and click Next.
2. Configure the Integration
In the AWS Bedrock dialog, configure the following:
- Credentials:
  - Identity Access Management (IAM) Role ARN: Enter the full ARN of your IAM role that has permission to invoke Amazon Bedrock models. This role allows secure cross-account access following least-privilege principles.
    For more information, see Setting Up Credentials and Trust Policy (IAM Role & STS).
- Model Details:
  - Model name: Enter a custom name to identify this model internally within your workflows.
  - Model ID: Enter the Model ID or Endpoint ID of the Amazon Bedrock model you want to use. For more information, see Finding the Right Model ID and Region.
  - Region: Specify the AWS region where the Bedrock model is deployed.
- Headers (Optional): Provide any additional information to include with the HTTP request. Use this if your model requires custom headers for configuration or authentication.
  For example: "Content-Type": "application/json"
- Variables: In the Prompt Variables section, define any input variables used within your request payload. These bind dynamic input values to your payload structure.
  For example: {{prompt}}, {{system.prompt}}
- Body: Provide a sample JSON request body for invoking the model. Use the defined variable placeholders {{variableName}} (such as {{prompt}}) to bind input fields dynamically.
  Note: The structure of the request body must follow the model-specific API schema. Use only parameters supported by the selected Amazon Bedrock model.
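As an illustration, a request body for an Anthropic Claude model on Bedrock might look like the sketch below. The field names and anthropic_version value follow Anthropic's Messages API on Bedrock; confirm the schema for your chosen model before using it:

```json
{
  "anthropic_version": "bedrock-2023-05-31",
  "max_tokens": 1024,
  "messages": [
    { "role": "user", "content": "{{prompt}}" }
  ]
}
```

Here {{prompt}} is a prompt variable defined in the Variables section; its value is bound into the payload when the model is invoked.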
3. Test the Configuration
- Test Response: Provide sample values for your variables and click Test to invoke the model and preview the response.
- Configure JSON Path: Define JSON paths to extract relevant output fields (for example, response text, token usage) from the model response.
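To illustrate what a JSON path does here, the sketch below walks a path such as content[0].text through a hypothetical Anthropic-style Bedrock response. The response shape is an assumption for the example; the actual structure depends on your model:

```python
def extract(payload, path):
    """Follow a dot/bracket JSON path like 'content[0].text'
    through nested dicts and lists."""
    current = payload
    # Normalize 'content[0].text' into parts: ['content', '0', 'text']
    for part in path.replace("]", "").replace("[", ".").split("."):
        if isinstance(current, list):
            current = current[int(part)]
        else:
            current = current[part]
    return current

# Hypothetical response shape for an Anthropic model on Bedrock
response = {
    "content": [{"type": "text", "text": "Hello!"}],
    "usage": {"input_tokens": 12, "output_tokens": 4},
}

print(extract(response, "content[0].text"))      # -> Hello!
print(extract(response, "usage.output_tokens"))  # -> 4
```

Configuring a path like usage.output_tokens in this step tells the platform which field of the raw response to surface, in the same way this helper walks the nested structure.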
4. Finalize the Configuration
- Click Save as draft to store the configuration without activating it.
- Or, click Confirm to finalize and add the model connection.
Once completed, your model appears in the External Models tab. You can now reference this model in your Prompts and Tools across the platform.