
Model connections

A model connection connects your AI agents to a generative AI platform that hosts a large language model (LLM). The model powers your AI agent's reasoning and capabilities.
Note: Any model that you configure can hallucinate, which can affect your AI agent's results.
Create a model connection for each LLM that your AI agents use. The type of connection that you configure depends on the model provider. For example, to connect to Amazon Nova Premier, configure an Amazon Bedrock connection.
You can also create a generic model connection to connect to an LLM that isn't natively supported.
The following table lists the natively supported LLMs and model providers:
Model                            Model provider
Amazon Nova Premier              Amazon Bedrock
Amazon Nova Pro                  Amazon Bedrock
Anthropic Claude 3.5 Haiku       Amazon Bedrock
Anthropic Claude 3.7 Sonnet      Amazon Bedrock
Anthropic Claude 4.5 Haiku       Amazon Bedrock
Anthropic Claude 4.5 Sonnet      Amazon Bedrock
Anthropic Claude Opus 4.1        Amazon Bedrock
Azure OpenAI GPT-4o              Azure OpenAI
Azure OpenAI GPT-4.1             Azure OpenAI
Azure OpenAI GPT-4.1 Nano        Azure OpenAI
Azure OpenAI GPT-4.1 Mini        Azure OpenAI
Azure OpenAI GPT-5 Mini          Azure OpenAI
Cohere Command R                 Amazon Bedrock
Google Gemini 2.0 Flash          Google Gemini
Google Gemini 2.5 Flash          Google Gemini
Google Gemini 2.5 Pro            Google Gemini
Google Gemini 3 Flash Preview    Google Gemini
Google Gemini 3.1 Pro Preview    Google Gemini

Azure OpenAI connection details

You need to configure an Azure OpenAI connection before you can use it in an AI agent.
The following list describes the properties of an Azure OpenAI connection. These properties apply to every version of Azure OpenAI:

Name
  Name of the Azure OpenAI connection.
Location
  Project or folder where your assets are saved. By default, assets are saved to the Default project.
Model Provider
  Provider of the LLM being configured.
Description
  Optional. Description of the model connection.
Deployment Name
  Name of the LLM deployment that you configured in Azure OpenAI.
Azure Endpoint
  Endpoint URL of the Azure OpenAI service for your deployed model.
OpenAI API Version
  Version of the Azure OpenAI API that you intend to use.
API Key
  Authentication key or token provided by Azure to authorize your access.
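The Deployment Name, Azure Endpoint, and OpenAI API Version properties combine into a single request URL, while the API Key is sent as a request header. As a rough sketch of how the pieces fit together (the resource name, deployment name, and API version below are made-up examples; check your Azure OpenAI resource for the real values):

```python
# Sketch: combining Azure OpenAI connection properties into a request URL.
# The API Key is not part of the URL; it is sent in the "api-key" header.

def azure_chat_completions_url(azure_endpoint: str, deployment_name: str, api_version: str) -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment."""
    return (
        f"{azure_endpoint.rstrip('/')}/openai/deployments/"
        f"{deployment_name}/chat/completions?api-version={api_version}"
    )

url = azure_chat_completions_url(
    "https://my-resource.openai.azure.com",  # Azure Endpoint (example)
    "my-gpt4o-deployment",                   # Deployment Name (example)
    "2024-06-01",                            # OpenAI API Version (example)
)
print(url)
```

If any of these three values is wrong, requests typically fail with a 404 (bad deployment or endpoint) or a version error, which is what the connection test surfaces.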

Amazon Bedrock connection details

You need to configure an Amazon Bedrock connection before you can use one of its foundation models in an AI agent.
The following list describes the properties of an Amazon Bedrock connection. These properties apply to every type of Bedrock foundation model:

Name
  Name of the Amazon Bedrock connection.
Location
  Project or folder where your assets are saved. By default, assets are saved to the Default project.
Model Provider
  Provider of the model being configured.
Description
  Optional. Description of the model connection.
Model
  ID string of the Bedrock foundation model that you want your AI agents to use for language processing tasks.
  If you use a cross-region inference model, add the region prefix to the beginning of the model ID string. For example, to connect to Anthropic Claude Opus 4.1 in the United States, enter the following model ID:
    us.anthropic.claude-opus-4-1-20250805-v1:0
Region
  AWS Region where your Amazon Bedrock service is deployed, for example: us-east-1
Access Key
  Access key ID for the IAM user that has permissions to access Amazon Bedrock.
Secret Key
  Secret access key for the IAM user that has permissions to access Amazon Bedrock.
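The cross-region prefix is the geography of your AWS Region, not the full Region name. A minimal sketch of that rule (the geography mapping below covers the common cases; verify the exact prefix against the Bedrock inference profiles available in your account):

```python
# Sketch: deriving a cross-region inference model ID from a plain Bedrock
# model ID and an AWS Region. Assumed geography prefixes: us, eu, apac.

GEO_PREFIX = {"us": "us", "eu": "eu", "ap": "apac"}

def cross_region_model_id(model_id: str, region: str) -> str:
    """Prefix a Bedrock model ID with the geography of the given AWS Region."""
    geo = region.split("-")[0]  # e.g. "us-east-1" -> "us"
    return f"{GEO_PREFIX[geo]}.{model_id}"

print(cross_region_model_id("anthropic.claude-opus-4-1-20250805-v1:0", "us-east-1"))
# us.anthropic.claude-opus-4-1-20250805-v1:0
```

Note that Asia-Pacific Regions such as ap-southeast-2 use the apac prefix rather than ap.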

Google Gemini connection details

You need to configure a Google Gemini connection before you can use it in an AI agent.
The following list describes the properties of a Google Gemini connection. These properties apply to every version of Google Gemini:

Name
  Name of the Google Gemini connection.
Location
  Project or folder where your assets are saved. By default, assets are saved to the Default project.
Model Provider
  Provider of the LLM being configured.
Description
  Optional. Description of the model connection.
Model
  Specific Gemini LLM variant or deployment that you want to use.
  Select a value from the drop-down list or enter your own value. If you enter your own value, enter the Gemini model code, for example, gemini-3.1-pro-preview.
  Note: If you enter an invalid model code, the connection fails.
  For more information about Google Gemini model codes, see "Models" in the Gemini API documentation.
API Version
  Google Gemini API version. This field is set to one of the following values based on the model that you selected:
  • v1: The stable, production version of the API. This value is selected for stable (GA) versions of the Google Gemini LLM, such as Gemini 2.5 Flash.
  • v1beta: A preview or beta version of the API that includes early features that might still be under development and are subject to breaking changes. This value is selected for preview versions of the Google Gemini LLM, such as Gemini 3.1 Pro Preview.
  Select a value from the drop-down list or enter your own value. Entering your own value is useful when a model is promoted from a preview version to a stable (GA) version. For example, when you create a connection to Gemini 3.1 Pro Preview, this value is set to v1beta. When Google promotes this model to a stable version, you can change the version to v1. Similarly, if Google introduces a new API version, you can enter that value here.
  Note: If you enter an invalid API version, or a version that isn't compatible with the model, the connection fails.
  For more information about Google Gemini API versions, see "API versions explained" in the Gemini API documentation.
API Key
  Authentication credentials required to access the Google Cloud API securely.
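The default API Version selection mirrors a simple rule: preview model codes map to v1beta, stable model codes map to v1. A minimal sketch of that rule (a simplified heuristic for illustration, not an exhaustive mapping of Google's model catalog):

```python
# Sketch: choosing the Gemini API version from the model code.
# Preview models use the v1beta API; stable (GA) models use v1.

def gemini_api_version(model_code: str) -> str:
    """Return the assumed default API version for a Gemini model code."""
    return "v1beta" if "preview" in model_code else "v1"

print(gemini_api_version("gemini-2.5-flash"))        # v1
print(gemini_api_version("gemini-3.1-pro-preview"))  # v1beta
```

When a model is promoted to GA and its model code loses the preview suffix, the same rule yields v1, which matches the manual version change described above.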

Generic model connection details

Configure a generic model connection to connect to a model that AI Agent Engineering doesn't natively support. You can connect to any model that exposes an OpenAI-compatible API, such as Groq Llama or GPT-4o. You need to configure the connection before you can use the model in an AI agent.
Tip: When you configure a generic model connection, be sure that the response time for a simple request is 30 seconds or less. Otherwise, the connection might time out.
The following list describes the properties of a generic model connection:

Name
  Name of the model connection.
Location
  Project or folder where your assets are saved. By default, assets are saved to the Default project.
Model Provider
  Provider of the LLM being configured.
Description
  Optional. Description of the model connection.
Model
  Name of the specific model variant or deployment that you want to use. Enter the model name as you would in an API call, for example, llama-3.1-8b-instant, not the plain-text name "Llama 3.1 8B."
  Note: If you enter an invalid model name, the connection fails.
  For more information about the model name, see the model's documentation or API reference.
Model API Format
  Format of the model API. Enter the API format as you would in an API call, for example, openai, not the plain-text name "OpenAI."
  Note: If you enter an invalid API format, or an API format that isn't compatible with the model, the connection fails.
  For more information about the model API format, see the model's documentation.
Base URL
  Base URL of the model API endpoint, for example: https://api.groq.com/openai/v1
API Key
  Authentication credentials required to access the model API securely.
Advanced Properties
  Optional. Custom properties that are specific to the LLM endpoint, for example, endpoint or header.
  Note: Be sure that the model supports every advanced property that you configure. If you configure a property that the model doesn't support, the AI agent returns an error at runtime.

Example: Connecting to Groq Llama 3.1 8B

The following example shows a generic model connection to Groq Llama 3.1 8B:

Name
  My Llama 3.1 8B Connection
Location
  Default
Model Provider
  Generic LLM Model
Description
  My connection to Groq Llama 3.1 8B Instant using the OpenAI API format
Model
  llama-3.1-8b-instant
Model API Format
  openai
Base URL
  https://api.groq.com/openai/v1
API Key
  gsk_Example456abcdeF789gHIjk0123lLMnOPq456RsT789uVWxyZExample123
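With the OpenAI API format, the Base URL, Model, and API Key values map onto a standard chat-completions request: the path /chat/completions is appended to the Base URL, the API Key is sent as a Bearer token, and the model name goes in the request body. A sketch that assembles (but does not send) such a request, using the example values above with a placeholder key:

```python
# Sketch: building an OpenAI-compatible chat request from generic connection
# values. The request is only constructed here, never sent over the network.
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Assemble a POST request for an OpenAI-compatible /chat/completions endpoint."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "https://api.groq.com/openai/v1",  # Base URL
    "gsk_placeholder_key",             # API Key (placeholder)
    "llama-3.1-8b-instant",            # Model
    "Hello",
)
print(req.full_url)  # https://api.groq.com/openai/v1/chat/completions
```

This is also why the Model value must be the API name (llama-3.1-8b-instant) rather than the display name: it is inserted verbatim into the request body.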

Creating a model connection

Create at least one model connection before you configure an AI agent.
1. From the navigation menu, click New.
2. In the New Asset dialog box, select Model from the New Asset list.
3. Select a model provider and then click Create.
4. Enter the connection details.
5. Click Test to verify that your connection is successful.
6. Click Save.