Configuring Amazon Bedrock Models¶
To ensure secure cross-account access, this setup follows the principle of least privilege. You must create an IAM Role that grants only the required permissions to invoke Bedrock models and explicitly trusts the platform to assume this role via AWS STS.
To integrate your Bedrock models, you will need to:
- Set up IAM credentials and a trust policy to allow access.
- Provide the correct Model ID and deployment region.
- Test the configuration and map the output.
Step 1. Setting Up Credentials and Trust Policy (IAM Role & STS)¶
1. Create the IAM Role in Your AWS Account
Create a new IAM role in your AWS account that grants access to invoke Amazon Bedrock models. This role will be assumed by the platform to make Bedrock API calls on your behalf.
You can follow the IAM role creation steps in the AWS IAM documentation.
For best practices on setting up IAM policies for Bedrock, see the AWS policy examples guide.
Assign the necessary permissions to the role. An example IAM policy is shown below:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:ListFoundationModels"
      ],
      "Resource": "*"
    }
  ]
}
```
2. Set the Trust Policy in Your AWS Account
Set the trust policy to allow the platform to assume the IAM role. Replace `<kore-arn>` with the ARN provided by the platform:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "<kore-arn>"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```
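As a minimal sketch, the role and the two policies above could also be created programmatically with boto3. The role name `bedrock-invoke-role` and inline policy name are illustrative, and `<kore-arn>` remains the platform-provided placeholder:

```python
import json

# The permission and trust policies from the steps above, as Python dicts
# so they can be serialized and passed to IAM.
PERMISSION_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["bedrock:InvokeModel", "bedrock:ListFoundationModels"],
            "Resource": "*",
        }
    ],
}

TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "<kore-arn>"},  # replace with the platform ARN
            "Action": "sts:AssumeRole",
        }
    ],
}


def create_bedrock_role(role_name: str = "bedrock-invoke-role") -> None:
    """Create the IAM role with the trust policy and attach the inline
    permission policy."""
    import boto3  # imported lazily so the snippet loads without boto3 installed

    iam = boto3.client("iam")
    iam.create_role(
        RoleName=role_name,
        AssumeRolePolicyDocument=json.dumps(TRUST_POLICY),
    )
    iam.put_role_policy(
        RoleName=role_name,
        PolicyName="bedrock-invoke",
        PolicyDocument=json.dumps(PERMISSION_POLICY),
    )
```
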
For private/on-prem deployments, the trust policy should point to your internal AWS IAM role.
3. Set the STS Endpoint
Use the STS endpoint for the region where your IAM role resides.
You can find the full list of STS endpoints in the AWS documentation.
For example, if your IAM role is in `us-east-1`, the regional endpoint is `https://sts.us-east-1.amazonaws.com`.
Ensure the STS region matches the region of your IAM role, not necessarily the region of the model.
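As an illustration of the rule above (endpoint derived from the role's region, not the model's), a small helper, assuming the standard `aws` partition:

```python
def sts_endpoint(region: str) -> str:
    """Return the regional STS endpoint for a given AWS region.

    Regional STS endpoints in the standard partition follow the pattern
    https://sts.<region>.amazonaws.com; consult the AWS endpoint list for
    exceptions such as the China and GovCloud partitions.
    """
    return f"https://sts.{region}.amazonaws.com"


# Derive the endpoint from the IAM role's region:
print(sts_endpoint("us-east-1"))  # https://sts.us-east-1.amazonaws.com
```
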
Step 2. Finding the Right Model ID and Region¶
Amazon Bedrock supports different model ID formats depending on how the model is deployed. This section outlines how to find the correct values for each case.
1. Base Foundation Models
If you're using base foundation models provided by Bedrock, you can use their standard model IDs directly. Refer to the complete list in the AWS Documentation.
2. Marketplace-Deployed Models
If you’ve subscribed to a model through the AWS Marketplace:
- Go to Bedrock Console → Model Access → Subscriptions.
- Locate the Model ARN or a Marketplace Model ID, for example an ARN such as `arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2`.
- Enter only the model name part (after `foundation-model/`) into the Model ID field, e.g. `anthropic.claude-v2`.
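The extraction of the model name from the ARN can be sketched with a small helper; the example ARN is illustrative:

```python
def model_id_from_arn(arn: str) -> str:
    """Extract the model name portion that follows 'foundation-model/'."""
    marker = "foundation-model/"
    if marker not in arn:
        raise ValueError(f"not a foundation-model ARN: {arn}")
    return arn.split(marker, 1)[1]


# Only the part after foundation-model/ goes into the Model ID field:
print(model_id_from_arn(
    "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2"
))  # anthropic.claude-v2
```
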
3. Models with Inference Profiles (Provisioned Throughput)
For models that do not support on-demand throughput (like Claude 3), you must create a Provisioned Throughput inference configuration.
- Go to the Bedrock Console > Provisioned Throughput.
- Select or create an inference configuration.
- Copy the Inference ARN or ID, for example `arn:aws:bedrock:us-east-1:123456789012:provisioned-model/my-throughput-id` (the account ID and name are placeholders).
- Use the `my-throughput-id` value in the Model ID field.
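The throughput identifier is the final path segment of the ARN; a minimal sketch, with a placeholder ARN:

```python
def throughput_id_from_arn(arn: str) -> str:
    """Return the identifier after the final '/' of a provisioned-throughput ARN."""
    return arn.rsplit("/", 1)[-1]


# Hypothetical ARN; the account ID and name are placeholders.
print(throughput_id_from_arn(
    "arn:aws:bedrock:us-east-1:123456789012:provisioned-model/my-throughput-id"
))  # my-throughput-id
```
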
Step 3. Test and Map the Model¶
Once you’ve provided credentials and model details, you can test your configuration and map model responses.
1. Define Prompt Variables
In the Prompt Variables section, add any variables used in your request payload.
For example:
- `prompt`: user input
- `system.prompt`: system instructions
2. Define Request Body
Use `{{variableName}}` to bind input fields dynamically.
Example payload:
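The payload shape is model-specific; as a hedged sketch, assuming the Anthropic Claude Messages body format and the two variables defined above, the `{{variableName}}` binding can be illustrated with a simple substitution:

```python
import json
import re

# Hypothetical request template using the variables from the previous step.
# The body shape shown is the Claude Messages format; adjust it to match
# the model you are invoking.
TEMPLATE = """{
  "anthropic_version": "bedrock-2023-05-31",
  "max_tokens": 256,
  "system": "{{system.prompt}}",
  "messages": [
    {"role": "user", "content": "{{prompt}}"}
  ]
}"""


def render(template: str, variables: dict) -> str:
    """Replace each {{name}} placeholder with its value."""
    return re.sub(
        r"\{\{([\w.]+)\}\}",
        lambda m: variables[m.group(1)],
        template,
    )


body = render(TEMPLATE, {
    "prompt": "Hello!",
    "system.prompt": "You are a helpful assistant.",
})
payload = json.loads(body)  # the rendered body is valid JSON
```
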
3. Test the Configuration
- Provide test values for the input variables.
- Click Test to invoke the model.
- Review the raw response.
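The test step above corresponds roughly to the following boto3 sketch: assume the cross-account role via STS, then call the Bedrock runtime. The role ARN, region, and model ID are placeholders you supply:

```python
import json


def invoke_bedrock(model_id: str, body: dict,
                   role_arn: str, region: str = "us-east-1") -> dict:
    """Assume the cross-account IAM role via STS, then invoke the model."""
    import boto3  # imported lazily so the snippet loads without boto3 installed

    creds = boto3.client("sts", region_name=region).assume_role(
        RoleArn=role_arn, RoleSessionName="bedrock-test"
    )["Credentials"]
    runtime = boto3.client(
        "bedrock-runtime",
        region_name=region,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    response = runtime.invoke_model(
        modelId=model_id,
        body=json.dumps(body),
        contentType="application/json",
        accept="application/json",
    )
    # The raw response body is a stream; parse it to inspect the output.
    return json.loads(response["body"].read())
```
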
Note: If the call fails, verify that the IAM role, STS endpoint, and model ID are valid.
4. Map Output Fields
Configure JSON Paths to extract:
- Model output (e.g., response text)
- Input and output token counts
Example:
| Field | JSONPath |
|---|---|
| Output Text | `$.output.text` |
| Input Token Count | `$.usage.input_tokens` |
| Output Token Count | `$.usage.output_tokens` |
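The mappings above can be sketched with a tiny extractor for simple `$.a.b` paths; the response shape here is a hypothetical example that matches the paths in the table:

```python
def extract(response: dict, path: str):
    """Resolve a simple dotted JSONPath like '$.usage.input_tokens'."""
    value = response
    for key in path.lstrip("$.").split("."):
        value = value[key]
    return value


# Hypothetical raw model response matching the table's JSONPaths.
response = {
    "output": {"text": "Hello!"},
    "usage": {"input_tokens": 12, "output_tokens": 5},
}

print(extract(response, "$.output.text"))         # Hello!
print(extract(response, "$.usage.input_tokens"))  # 12
```

Real deployments typically use a full JSONPath library, but the dotted-path subset is enough to show how each field maps.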