
Knowledge AI in AI Studio Now in Public Beta
We are excited to announce that Knowledge AI in AI Studio is now in public beta and accessible from the AI Studio canvas. With Knowledge AI and the new Q&A Node, you can reduce repetitive Q&A and automate tedious support workflows. Knowledge AI unifies your business content into one searchable knowledge base, so your assistant can respond with accurate, trusted answers.
Dependably Leverage LLMs to Improve Customer Experience
Generative AI is reshaping how businesses support customers. Companies have raced to capture AI’s cost-saving opportunities in their business processes. One powerful application is augmenting virtual agents to provide a more comprehensive, natural support experience.
Building on AI Studio’s no-code, low-code platform, Knowledge AI simplifies the process even further. No longer do teams need to worry about intents, building training sets, or crafting individual responses. Leveraging generative AI allows virtual agents to handle a much wider array of customer requests, improving the customer experience.
With LLMs, users can speak more naturally by using their everyday language, without having to follow strict command formats or use specific keywords. Additionally, LLMs have the flexibility to create novel answers that can help explain the same information in many different ways. Almost like a human.
However, a challenge with LLMs and generative AI is that they can be too creative! The AI wants to answer user questions even when it doesn’t have the correct information to answer them properly. These fabricated responses are called hallucinations, and they pose a significant risk to a company.
For instance, if you run a bike shop, a user may ask, “How much does it cost for new brakes?” But your website doesn’t list prices for new brakes, only the cost of full bikes. The LLM will likely answer anyway, drawing on its training data rather than your shop’s prices. The user gets a hallucination: a totally wrong answer delivered so confidently that they believe it’s the real price. LLMs can also disappoint by generating overly general responses that lack the business-specific relevance your users seek, or by serving answers that are simply out of date. While fast and convenient, inaccurate or generic responses frustrate users and erode trust. Vonage built Knowledge AI to address these LLM shortcomings.
Imagine again a user asking your virtual assistant, “How do I update my billing info?”. Instead of guessing or going off-script, it gives them the exact steps exclusively from your company’s help center. That’s Vonage’s Knowledge AI. It connects your assistant to real-time, trusted sources like CRMs and knowledge bases, so every response is grounded, accurate, and on-brand.
What is Knowledge AI?
Knowledge AI is a new way to leverage Large Language Models (LLMs) to scale your customer support and let your team focus on mission-critical tasks. Most importantly, Knowledge AI greatly reduces the hallucination problem so your Virtual Agents only provide trustworthy answers.
What Counts as a Knowledge Base?
To give up-to-date, business-specific responses, Knowledge AI requires you to create a specialized knowledge base, called an Index, for your virtual agent. You do this by adding Sources to the Index. Sources can be files (PDF, TXT, or HTML), websites, or publicly accessible files on a cloud server.
GIF showing the process of adding content sources in Vonage Knowledge AI to power accurate, evidence-based LLM responses.
To make use of your Index, you simply add the Q&A Node to your agent’s flow. There you select which parameter to send to Knowledge AI and which parameter the result should be stored in. You can also configure three optional parameters.
When configuring the Q&A Node, you can:
Set a response time limit in milliseconds
Define an average answer length (minimum 20 words)
Add up to 5 response guidelines (100 words each)
This allows you to customize your agent’s tone, style, structure, and level of detail to match your brand voice or use case.
Some of the options for response guidelines include:
Tone - specify whether the assistant should respond in a formal, friendly, casual, or concise tone.
Topic Restriction - set guidelines for topics that should be avoided during the conversation to ensure the assistant stays on track.
Company-Specific Guidelines - ensure the assistant uses your company’s exact terminology, such as using "XYZ Corp" instead of "XYZ" or "we", to maintain brand consistency.
Custom Guardrails - implement additional rules based on issues or gaps identified during testing to fine-tune the assistant's responses.
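As an illustration, the settings above could be captured in a configuration like the following. Note that the field names here are hypothetical and do not reflect AI Studio’s actual schema; in practice, the Q&A Node is configured visually on the AI Studio canvas.

```json
{
  "qa_node": {
    "input_parameter": "user_question",
    "result_parameter": "kai_answer",
    "response_time_limit_ms": 5000,
    "average_answer_length_words": 40,
    "response_guidelines": [
      "Respond in a friendly, concise tone.",
      "Avoid discussing topics unrelated to the shop's products and services.",
      "Always refer to the company as 'XYZ Corp', never 'XYZ' or 'we'."
    ]
  }
}
```

Here, `user_question` and `kai_answer` stand in for whatever parameters your flow uses, and the three guidelines show a tone rule, a topic restriction, and a company-specific terminology rule, respectively.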
Screenshot of the Vonage AI Studio interface showing KAI configuration settings including response parameters, wait time, and guidelines for generating clear, helpful answers.
Get Started With Knowledge AI
You can learn more about Knowledge AI by visiting the AI Studio documentation. Knowledge AI is now available to all AI Studio users free of charge, though this may change in the future. We’d love to hear how you are leveraging LLMs and low-code tools in your team! Join the Vonage Developer Community Slack or follow us on X, formerly known as Twitter, to share your exciting projects and insights.
Benjamin Aronov is a developer advocate at Vonage. He is a proven community builder with a background in Ruby on Rails. Benjamin enjoys the beaches of Tel Aviv which he calls home. His Tel Aviv base allows him to meet and learn from some of the world's best startup founders. Outside of tech, Benjamin loves traveling the world in search of the perfect pain au chocolat.