
FAQ

What LLMs does kapa use?

kapa.ai is model agnostic, meaning we're not tied to any single language model or provider. Our mission is to stay at the forefront of applied RAG, so you don't have to. We constantly evaluate and incorporate the latest academic research, models, and techniques to optimize our system for one primary goal: providing the most accurate and reliable answers to technical questions.

To achieve this, we work with multiple model providers, including but not limited to OpenAI, Anthropic, Cohere, and Voyage. We also run our own models when necessary. This flexible approach allows us to select the best-performing model for each specific use case and continuously improve our service as the field of AI rapidly evolves. To ensure data privacy and security, we have data processing agreements (DPAs) and training opt-outs with all the providers we work with.

What languages does kapa support?

kapa is optimized to hold conversations in English but can respond to users in a variety of languages. This is particularly beneficial for companies that previously lacked staff with the necessary language skills to answer questions from certain user bases. Importantly, the language of the user's question and the language of the content indexed by kapa do not have to be the same. For example, a user could ask a question in French, and kapa can use English knowledge sources to provide a response in French. kapa can also index non-English content.

Can I have separate instances of kapa with different knowledge sources?

Yes, the kapa platform supports multiple kapa instances under a single account. Each instance has its own isolated index, which you can populate with different data for different purposes. A very common pattern is to have one external and one internal instance.

The external instance has access to all of your public-facing knowledge sources and answers developer questions via a widget on your documentation site.

The internal instance has access to the same public sources plus internal knowledge bases, guides, and past support tickets. Because it has access to non-public data, it is deployed only within your company, where it assists your employees with their technical questions and helps them answer support tickets.

Why should I choose kapa?

  • Optimized & Automated Technical Knowledge Ingestion: kapa.ai automates the ingestion of diverse technical sources, ensuring an up-to-date knowledge base.
  • Domain-Specific Retrieval: The kapa platform relies on advanced domain-specific retrieval and neural search engines for precise, domain-specific content retrieval.
  • Minimizing Hallucinations: kapa.ai reduces inaccuracies by focusing on relevant content retrieval, providing citations, and maintaining topic relevance.
  • Off-The-Shelf Integrations: kapa.ai integrates with existing developer tools and workflows, including Slack, Discord, Zendesk, and more.
  • Incorporating Feedback for Improvement: The system integrates user feedback mechanisms and analysis tools for continuous enhancement of model performance.
  • Improve Your Documentation: kapa offers analytics to identify gaps in your documentation and areas to focus on.

Check out customer stories to see how Prisma, Langchain, Mapbox, and CircleCI successfully use kapa.

What is the difference between kapa and ChatGPT?

kapa is a retrieval augmented generation (RAG) system, which means it retrieves relevant information from a company's documentation before generating a response, ensuring accurate and context-specific answers to developer questions. In contrast, ChatGPT is a large language model (LLM) that generates responses based on its training data without retrieving specific external information. Additionally, while ChatGPT is a general-purpose AI, kapa is purely optimized to answer technical questions.
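The retrieve-then-generate flow described above can be sketched in a few lines. This is a toy illustration, not kapa's actual pipeline: keyword overlap stands in for neural retrieval, and the "generation" step is just prompt assembly for a hypothetical LLM; all names and documents here are invented for the example.

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase and split text into a set of word tokens."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(question: str, docs: list[str]) -> str:
    """Return the document sharing the most word tokens with the question.
    (Real RAG systems use embedding similarity, not keyword overlap.)"""
    q_tokens = tokenize(question)
    return max(docs, key=lambda d: len(q_tokens & tokenize(d)))

def build_prompt(question: str, context: str) -> str:
    """Ground the model's answer in the retrieved context, so the LLM
    answers from the documentation rather than from its training data."""
    return (
        "Answer using ONLY the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context: {context}\n\nQuestion: {question}"
    )

# Hypothetical documentation snippets standing in for an indexed knowledge base.
docs = [
    "To authenticate, pass your API key in the Authorization header.",
    "Rate limits are 100 requests per minute per API key.",
]

question = "How do I authenticate?"
prompt = build_prompt(question, retrieve(question, docs))
```

The key difference from a plain LLM call is that the prompt carries retrieved, source-specific context, which is what lets a RAG system cite its sources and stay grounded in the company's documentation.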