LLM and Generative AI Features¶
The Kore.ai XO Platform offers a comprehensive solution for integrating Generative AI capabilities into conversational AI applications. With these capabilities, users can create powerful, engaging, and human-like conversational experiences for their end-users.
Pre-built Integrations
The Platform seamlessly integrates with leading AI services, including OpenAI, Azure OpenAI, and Anthropic. These pre-built integrations ship with pre-configured prompt templates that give users quick access to each model's core capabilities while maintaining a standardized structure. Users can also create custom prompts tailored to their specific needs.
The pre-built integration framework also extends to newly launched language models: once the required authentication is configured, users can integrate these models and create custom prompts for them right away.
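Conceptually, a prompt template pairs a fixed instruction with variables that are filled in at runtime from the conversation. The snippet below is a generic, hypothetical illustration of that idea in Python; the field names and placeholders (for example, {conversation_history} and {user_input}) are assumptions made for the sketch and do not reflect the Platform's actual template syntax.

```python
# Hypothetical prompt template -- illustrative only, not the XO Platform's
# actual template structure: a fixed instruction plus runtime placeholders.
TEMPLATE = {
    "system": "You are a helpful banking assistant. Keep answers under 50 words.",
    "user": "Conversation so far:\n{conversation_history}\n\nUser question: {user_input}",
}

def build_messages(conversation_history: str, user_input: str) -> list:
    """Fill the template and return OpenAI-style chat messages."""
    return [
        {"role": "system", "content": TEMPLATE["system"]},
        {
            "role": "user",
            "content": TEMPLATE["user"].format(
                conversation_history=conversation_history,
                user_input=user_input,
            ),
        },
    ]

# Example: build_messages("User: Hi\nBot: Hello!", "What is my account balance?")
```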
Bring Your Own (BYO) Model Framework
In addition to the out-of-the-box integrations, the Platform supports a bring-your-own (BYO) model framework. This lets platform users integrate models hosted externally by third parties as well as models hosted by the enterprise itself. The framework supports custom prompts optimized for specific purposes and models, and it works seamlessly with the Platform's Auth Profiles module, so enterprises can use their preferred authentication mechanism.
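As a rough sketch, a BYO integration ultimately wraps an HTTP call to the externally or enterprise-hosted model endpoint, with credentials supplied through an Auth Profile rather than hard-coded. The endpoint URL, header name, and payload shape below are hypothetical stand-ins for whatever a given deployment exposes, not the Platform's configuration format.

```python
import requests

# Hypothetical enterprise-hosted model endpoint and auth header -- stand-ins
# for whatever your deployment actually exposes. In the XO Platform, the Auth
# Profiles module would supply the credential instead of hard-coding it here.
ENDPOINT = "https://llm.internal.example.com/v1/generate"
API_KEY = "<your-enterprise-api-key>"

def call_byo_model(prompt: str) -> str:
    """Minimal request/response shape that a BYO integration typically wraps."""
    resp = requests.post(
        ENDPOINT,
        headers={"x-api-key": API_KEY},  # auth mechanism chosen by the enterprise
        json={"prompt": prompt, "max_tokens": 256},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("text", "")
```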
Kore.ai XO GPT Models
The new Kore.ai XO GPT Models module provides fine-tuned large language models optimized for enterprise conversational AI applications. These models have been evaluated and fine-tuned to be accurate, safe, and efficient for production deployment. For more information, see Kore.ai XO GPT.
To configure LLM and Generative AI, click the Generative AI Tools icon in the left navigation.
Key Features¶
The integration of LLMs and Generative AI enables the following features:
- Model Library: Connect to various Generative AI models using pre-built integrations, custom integrations, or the Kore.ai XO GPT Module. Learn more
- Prompts Library: Create customized prompts for specific use cases. Learn more
- GenAI - Design-time features: Automatic Dialog Generation, Conversation Test Cases Suggestion, Conversation Summary, NLP Batch Test Cases Suggestion, Training Utterance Suggestions, and Use Case Suggestions. Learn more
- GenAI - Runtime features: Agent Node for entity collection, Answer from Documents, Prompt Node for custom use cases, Rephrase Dialog Responses, Zero-shot and Few-shot ML Models, Repeat Responses, and Rephrase User Query. Learn more
- Safeguards: Data Anonymization and Guardrails (a conceptual sketch of anonymization follows this list).
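To illustrate the anonymization idea only (not the Platform's implementation): personally identifiable information can be masked with placeholders before a prompt leaves your environment and restored in the model's response afterward. The regex patterns and placeholder format below are assumptions made for this sketch.

```python
import re

# Minimal sketch of data anonymization: mask PII before sending text to an
# LLM, then restore it in the response. Patterns and placeholder format are
# illustrative assumptions, not the Platform's actual implementation.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def anonymize(text: str):
    """Replace PII with numbered placeholders and remember the mapping."""
    mapping = {}
    for label, pattern in PATTERNS.items():
        for i, value in enumerate(pattern.findall(text)):
            placeholder = f"[{label}_{i}]"
            mapping[placeholder] = value
            text = text.replace(value, placeholder)
    return text, mapping

def restore(text: str, mapping: dict) -> str:
    """Put the original values back into the LLM's response."""
    for placeholder, value in mapping.items():
        text = text.replace(placeholder, value)
    return text
```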
Benefits¶
All these features benefit VA developers, NLP developers, and testers as follows:
- Choose between custom and pre-built LLM integrations.
- Quickly create dialog tasks.
- Build custom use cases with Generative AI.
- Automate mundane tasks (dialog generation, training utterances).
- Get suggestions for better VA design and development.
- Customize connections to LLMs and optimize prompts for specific use cases.
Getting Started¶
- Integrate a pre-built or custom LLM, or Kore.ai XO GPT, in the Model Library.
- Create new prompts in the Prompts Library.
- Enable GenAI Features.
- (Optional) Enable Data Anonymization and Guardrails.