| Model | Model provider |
|---|---|
| Amazon Nova Premier | Amazon Bedrock |
| Amazon Nova Pro | Amazon Bedrock |
| Anthropic Claude 3.5 Haiku | Amazon Bedrock |
| Anthropic Claude 3.7 Sonnet | Amazon Bedrock |
| Anthropic Claude Haiku 4.5 | Amazon Bedrock |
| Anthropic Claude Sonnet 4.5 | Amazon Bedrock |
| Anthropic Claude Opus 4.1 | Amazon Bedrock |
| Azure OpenAI GPT-4o | Azure OpenAI |
| Azure OpenAI GPT-4.1 | Azure OpenAI |
| Azure OpenAI GPT-4.1 Nano | Azure OpenAI |
| Azure OpenAI GPT-4.1 Mini | Azure OpenAI |
| Azure OpenAI GPT-5 Mini | Azure OpenAI |
| Cohere Command R | Amazon Bedrock |
| Google Gemini 2.0 Flash | Google Gemini |
| Google Gemini 2.5 Flash | Google Gemini |
| Google Gemini 2.5 Pro | Google Gemini |
| Google Gemini 3 Flash Preview | Google Gemini |
| Google Gemini 3.1 Pro Preview | Google Gemini |
| Property | Description |
|---|---|
| Name | Name of the Azure OpenAI connection. |
| Location | Project or folder to save your assets. By default, assets are saved to the Default project. |
| Model Provider | Provider of the LLM being configured. |
| Description | Optional. Description of the model connection. |
| Deployment Name | Name of the LLM deployment you've configured within Azure OpenAI. |
| Azure Endpoint | Azure OpenAI service endpoint URL for your deployed model. |
| OpenAI API Version | Version of the Azure OpenAI API you intend to use. |
| API Key | Authentication key or token provided by Azure to authorize your access. |
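To see how these properties fit together, the following sketch builds the raw Azure OpenAI REST request URL from them. All values here are hypothetical placeholders, and no request is actually sent; note that Azure routes calls by deployment name rather than by model name.

```python
# Hypothetical connection property values:
azure_endpoint = "https://my-resource.openai.azure.com"  # Azure Endpoint
deployment_name = "my-gpt4o-deployment"                  # Deployment Name
api_version = "2024-06-01"                               # OpenAI API Version
api_key = "<azure-api-key>"                              # API Key

# Azure OpenAI chat-completions endpoint: the deployment name appears in the
# path, and the API version is passed as a query parameter.
url = (
    f"{azure_endpoint}/openai/deployments/{deployment_name}"
    f"/chat/completions?api-version={api_version}"
)
# The API key is sent in the "api-key" request header.
headers = {"api-key": api_key, "Content-Type": "application/json"}
print(url)
```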
| Property | Description |
|---|---|
| Name | Name of the Amazon Bedrock connection. |
| Location | Project or folder to save your assets. By default, assets are saved to the Default project. |
| Model Provider | Provider of the model being configured. |
| Description | Optional. Description of the model connection. |
| Model | ID string of the Bedrock foundation model you want your AI agents to use for language processing tasks. If you use a cross-region inference type model, add the region prefix at the beginning of the model ID string. For example, to connect to Anthropic Claude Opus 4.1 in the United States, enter the following model ID: us.anthropic.claude-opus-4-1-20250805-v1:0 |
| Region | AWS region where your Amazon Bedrock service is deployed, for example: us-east-1 |
| Access Key | Access key ID for the IAM user that has permissions to access Bedrock. |
| Secret Key | Secret access key for the IAM user that has permissions to access Bedrock. |
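The cross-region prefix rule for the Model property can be sketched as a small helper. The function name is hypothetical; it simply prepends the geographic prefix that Bedrock cross-region inference profiles use.

```python
def cross_region_model_id(model_id: str, region_prefix: str) -> str:
    """Prepend the geographic prefix (e.g. "us" or "eu") required by a
    Bedrock cross-region inference profile to a foundation-model ID."""
    return f"{region_prefix}.{model_id}"

# Reproduces the Claude Opus 4.1 example from the table above:
print(cross_region_model_id("anthropic.claude-opus-4-1-20250805-v1:0", "us"))
# us.anthropic.claude-opus-4-1-20250805-v1:0
```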
| Property | Description |
|---|---|
| Name | Name of the Google Gemini connection. |
| Location | Project or folder to save your assets. By default, assets are saved to the Default project. |
| Model Provider | Provider of the LLM being configured. |
| Description | Optional. Description of the model connection. |
| Model | Specific Gemini LLM variant or deployment that you want to use. Select a value from the drop-down list or enter your own value. If you enter your own value, you need to enter the Gemini model code, for example, gemini-3.1-pro-preview. Note: If you enter an invalid model code, the connection fails. For more information about Google Gemini model codes, see "Models" in the Gemini API documentation. |
| API Version | Google Gemini API version. This field is set automatically based on the model you've selected. Select a value from the drop-down list or enter your own value. You might want to enter your own value when a model is promoted from a preview version to a stable (GA) version. For example, if you're creating a connection to Gemini 3.1 Pro Preview, this value is set to v1beta. When Google promotes this model to a stable version, you can change the version to v1. Or, if Google introduces a new API version, you can enter that value here. Note: If you enter an invalid API version or a version that isn't compatible with the model, the connection fails. For more information about Google Gemini API versions, see "API versions explained" in the Gemini API documentation. |
| API Key | Authentication credentials required to access the Google Cloud API securely. |
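The relationship between the Model and API Version properties becomes concrete in the raw Gemini REST URL, where both appear in the request path. The values below are illustrative and no request is sent.

```python
# Hypothetical connection property values:
api_version = "v1beta"                 # API Version (preview models use v1beta)
model_code = "gemini-3.1-pro-preview"  # Model (the Gemini model code)

# Gemini REST endpoint: the API version and model code both appear in the path,
# which is why an incompatible version/model pairing makes the connection fail.
url = (
    "https://generativelanguage.googleapis.com/"
    f"{api_version}/models/{model_code}:generateContent"
)
print(url)
```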
| Property | Description |
|---|---|
| Name | Name of the model connection. |
| Location | Project or folder to save your assets. By default, assets are saved to the Default project. |
| Model Provider | Provider of the LLM being configured. |
| Description | Optional. Description of the model connection. |
| Model | Name of the specific model variant or deployment that you want to use. Enter the model name as you would in an API call, for example, llama-3.1-8b-instant, not the plain-text name "Llama 3.1 8B." Note: If you enter an invalid model name, the connection fails. For more information about the model name, see the model's documentation or API reference. |
| Model API Format | Format of the model API. Enter the model API format as you would in an API call, for example, openai, not the plain-text name "OpenAI." Note: If you enter an invalid API format or an API format that isn't compatible with the model, the connection fails. For more information about the model API format, see the model's documentation. |
| Base URL | Base URL for the model API endpoint, for example: https://api.groq.com/openai/v1 |
| API Key | Authentication credentials required to access the model API securely. |
| Advanced Properties | Add advanced properties to configure any custom properties that are specific to the LLM endpoint, for example, endpoint or header. Note: Be sure that the model supports all advanced properties that you configure. If you configure a property that the model doesn't support, the AI agent returns an error at runtime. |
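One way to picture how an Advanced Property participates in a request: for an OpenAI-format endpoint, a custom header would be merged into the standard request headers alongside the bearer token. The endpoint, key, and header below are hypothetical placeholders, and nothing is sent.

```python
# Hypothetical connection property values:
base_url = "https://api.example.com/v1"          # Base URL (placeholder)
api_key = "<api-key>"                            # API Key
extra_headers = {"x-custom-header": "value"}     # hypothetical Advanced Property

# OpenAI-format endpoints expose chat completions under this path and expect
# the API key as a bearer token; the custom header rides along with it.
url = f"{base_url}/chat/completions"
headers = {"Authorization": f"Bearer {api_key}", **extra_headers}
print(headers)
```

If the endpoint doesn't recognize x-custom-header, this is exactly the situation the note above warns about: the request is still well-formed, but the agent surfaces the provider's error at runtime.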
| Property | Value |
|---|---|
| Name | My Llama 3.1 8B Connection |
| Location | Default |
| Model Provider | Generic LLM Model |
| Description | My connection to Groq Llama 3.1 8B Instant using the OpenAI API format |
| Model | llama-3.1-8b-instant |
| Model API Format | openai |
| Base URL | https://api.groq.com/openai/v1 |
| API Key | gsk_Example456abcdeF789gHIjk0123lLMnOPq456RsT789uVWxyZExample123 |
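Expressed as the request it would produce, the example connection above maps onto an OpenAI-format call against Groq's endpoint. This is a sketch only: the key placeholder stands in for the sample value from the table, and no request is sent.

```python
# Values taken from the example connection table:
base_url = "https://api.groq.com/openai/v1"  # Base URL
model = "llama-3.1-8b-instant"               # Model (API name, not plain text)

# The Model value goes in the JSON body; the Base URL supplies the host and
# path prefix for the OpenAI-format chat-completions route.
url = f"{base_url}/chat/completions"
body = {
    "model": model,
    "messages": [{"role": "user", "content": "Hello"}],
}
print(url)
```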