LLM Configuration¶
To use Generative AI features with AI for Work, you must configure the integration with a pre-built or custom LLM. By leveraging LLM and Generative AI capabilities, AI for Work can create intelligent, human-like conversational experiences for your end users.
General Purpose¶
Pre-built LLM Integration¶
AI for Work offers seamless integration with leading AI services such as Azure OpenAI, OpenAI, and Anthropic. You can tap into these services' core capabilities using pre-configured prompts and APIs.
Note
This document outlines the procedure for configuring OpenAI. Similar steps apply for Azure OpenAI and Anthropic integration.
Steps to configure a pre-built LLM:
- Go to Admin Console > Assist configuration > General purpose.
- Click New and choose the LLM you want to configure from the dropdown.
- Enter authorization details such as Integration Name and API Key, then select the Model Name from the dropdown.
- Read the Policy Guidelines, select the checkbox, and click Save.
- The configured model is listed under General LLM Integrations.
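Before entering an API key in the console, you may want to confirm it is valid outside the integration. The sketch below, using only the Python standard library, builds (but does not send) a request against OpenAI's standard model-listing endpoint; the key shown is a placeholder, and the endpoint is the public OpenAI REST API rather than anything specific to this product:

```python
import urllib.request

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build a GET request that lists the models the key can access.
    Sending it and receiving a 200 response confirms the key works."""
    return urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_models_request("sk-...")  # placeholder key, not a real credential
# To actually send the request:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)  # 200 indicates the key is accepted
print(req.full_url)
print(req.get_header("Authorization"))
```

A key that fails here (HTTP 401) will also fail when the integration is saved, so checking first can save a configuration round-trip.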
Custom LLM Integration¶
AI for Work enables enterprises to power their virtual assistants with any Large Language Model (LLM) of their choice. The bring-your-own (BYO) model framework supports integrations with models hosted by third parties as well as models hosted by the enterprise itself.
Steps to configure a Custom LLM:
- Go to Admin Console > Assist configuration > General purpose.
- Click New and choose Custom LLM from the dropdown.
- Enter details such as Integration Name, Model Name, Endpoint URL, and Auth, then select the Method and Max request tokens from the dropdowns.
- Enter the test payload, then click Test to verify the connection. If the test call reaches the LLM successfully, a success message is displayed; otherwise, an error message is shown.
- Read the Policy Guidelines, select the checkbox, and click Save.
- A success confirmation message is displayed, and the configured model is listed under General LLM Integrations.
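The exact shape of the test payload depends on the API contract your custom model exposes. As one common case, the sketch below builds a payload for an OpenAI-compatible chat-completions endpoint; the model name, prompt, and token limit are illustrative values, not settings taken from this document:

```python
import json

# Illustrative test payload for an OpenAI-compatible chat endpoint.
# Substitute the Model Name you entered in the console and any limits
# your hosted model enforces.
test_payload = {
    "model": "my-hosted-llm",  # example model name, not a real deployment
    "messages": [
        {"role": "user", "content": "Reply with the single word: pong"}
    ],
    "max_tokens": 16,  # keep the connectivity test cheap
}

body = json.dumps(test_payload)
print(body)
```

Keeping the test prompt short and capping `max_tokens` makes the connection check fast and inexpensive while still exercising the full request path.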
Embedding models¶
This feature allows you to connect and configure models for generating embeddings. AI for Work supports both pre-built models (OpenAI and Azure OpenAI) and custom LLMs for this purpose.
The procedure for integrating embedding models is similar to that for general-purpose LLMs.
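To illustrate what an embedding integration sends at runtime, here is a sketch of a request body in the shape used by OpenAI's embeddings REST API; the model name is an example OpenAI model, and for Azure OpenAI the configured deployment name would be used instead:

```python
import json

# Sketch of the request body an embedding integration would send.
# "text-embedding-3-small" is an example OpenAI embedding model name;
# it is not a value prescribed by this document.
embedding_request = {
    "model": "text-embedding-3-small",
    "input": [
        "How do I reset my password?",
        "Password reset steps",
    ],
}

print(json.dumps(embedding_request))
```

The response maps each input string to a numeric vector, which is what downstream features such as semantic search compare.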