
2 posts tagged with "LLMs"


How Mapbox, Monday.com and CircleCI Use In-App Product AI Assistants

· 4 min read

Many forward-thinking technical companies like OpenAI, CircleCI, Temporal, Mixpanel, and Docker are adopting Large Language Models (LLMs) trained on their documentation to improve their developer experience.

In addition to enhancing documentation, these companies are increasingly integrating AI bots directly into their products as AI assistants. This approach allows users to get answers to their questions within the product itself, reducing the need to consult documentation or open support tickets.

This approach offers several key benefits:

  • Instant, in-product support: Users can get help without leaving their workflow.
  • Reduced support burden: Many common questions can be answered automatically.
  • Improved documentation: Insights from user queries help identify areas for improvement.

Let's look at how some leading companies are implementing this technology to enhance their user experience.

Showcase #1: CircleCI's In-Product AI Assistant

CircleCI, a continuous integration and continuous delivery platform, has implemented an AI-powered chatbot using kapa.ai with access to their comprehensive documentation, YouTube tutorials, community forums, and internal knowledge base.

By embedding the kapa.ai widget within the CircleCI application, users can get instant help while actively working on their pipelines, reducing workflow disruptions and improving efficiency.

[Image: CircleCI in-app "Ask AI" example]

Deployment strategy: Custom "Ask AI" drop-down button that opens the kapa widget on the CircleCI Web App.

Showcase #2: Monday.com Developer RAG AI Assistant

Monday.com, a work operating system (Work OS) platform, has implemented an AI-powered chatbot with access to their developer documentation and API specifications.

The Monday.com team recently embedded the kapa.ai chatbot within their developer center, providing real-time support and enhancing the overall developer experience. This AI assistant saves developers an estimated 15-30 minutes per query, while also helping identify documentation gaps and offering multilingual support.

"At monday.com, we embedded the AI bot within our developer center, minimizing context switches and boosting productivity." - Daniel Hai, AI & API Product Manager @ Monday.com

[Image: Monday.com in-app "Ask AI" example]

Deployment strategy: Custom "Ask AI" menu button that opens the kapa widget on the Monday.com Developer Center.

Showcase #3: Mapbox Account Page Support LLM

Mapbox, a leading provider of customizable maps and location services, has integrated kapa.ai's AI-powered chatbot directly into their account page. This implementation allows logged-in users to access immediate support without leaving their workflow.

The account page features the standard kapa.ai "Ask AI" widget in the bottom right corner, implemented using an off-the-shelf script tag. This widget provides users with instant access to Mapbox's extensive knowledge base, including information from developer documentation, API references, and other technical resources.

This account page implementation, along with similar deployments across Mapbox's documentation and support sites, has contributed to a 20% reduction in monthly support tickets.

[Image: Mapbox account page "Ask AI" example]

Deployment strategy: Off-the-shelf "Ask AI" chatbot that opens the kapa widget on the Mapbox Account page.


By integrating AI assistants directly into your products, you can significantly enhance user experience and reduce support burdens. If you're interested in implementing an AI assistant within your product, sign up here for a demo or reach out to the kapa team if you have questions about how to integrate AI assistants effectively.

kapa.ai is a platform that helps developer-facing companies build AI support bots. It ingests technical knowledge from various sources and uses Retrieval Augmented Generation (RAG) to provide accurate, contextual responses to user queries, improving customer experience and reducing support load. Trusted by over 100 companies, including OpenAI, Mapbox, Reddit, CircleCI, and Docker, kapa.ai is fully SOC 2 Type II certified.

Optimizing Technical Docs for LLMs

· 6 min read

We're seeing lots of forward-thinking technical companies like OpenAI, CircleCI, Temporal, Mixpanel, and Docker adopt Large Language Models (LLMs) trained on their documentation to improve their developer experience.

At kapa.ai we have worked with over 80 technical teams, including those mentioned above, to implement these LLM-based systems for their developers. In the process, we've learned a lot about how to structure documentation for LLMs and want to share some best practices for others considering this approach.

1. Embrace Page Structure and Hierarchy

LLMs excel at navigating structured content and rely on context hints to understand the broader picture. A clear hierarchy of headings and subheadings on a page helps LLMs understand the relationships between different sections of your documentation.

[Image: Temporal documentation structure example]

A great example of this is how Temporal structures its documentation for their SDKs. Take Add a replay test within the Java SDK, which is an important feature related to workflow execution. The hierarchy of the documentation is as follows:

- Development
  - Java SDK
    - Develop for durability
      - Add a replay test
      - ...

This structure allows an LLM to more effectively navigate and understand the context when answering questions related to replay tests within the Java SDK. This is especially important as replay tests are also used in other SDKs.
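One common way retrieval pipelines put such a hierarchy to work (a general RAG practice, not necessarily what Temporal or kapa.ai do internally) is to prepend the heading breadcrumb to every chunk before indexing it, so a chunk about replay tests always carries its "Java SDK" context with it:

```python
# Sketch: attach the heading breadcrumb to each documentation chunk so the
# retriever and the LLM always see where the chunk sits in the hierarchy.
# The page path and section text below are hypothetical, modeled on
# Temporal's docs structure.

def breadcrumb_chunks(page_path, sections):
    """Yield chunks prefixed with their position in the docs hierarchy."""
    for heading, body in sections:
        breadcrumb = " > ".join(page_path + [heading])
        yield f"[{breadcrumb}]\n{body}"

path = ["Development", "Java SDK", "Develop for durability"]
sections = [
    ("Add a replay test",
     "Replay tests verify that updated workflow code is still deterministic..."),
]
chunks = list(breadcrumb_chunks(path, sections))
print(chunks[0].splitlines()[0])
# [Development > Java SDK > Develop for durability > Add a replay test]
```

With the breadcrumb embedded in the chunk text itself, a question mentioning "Java SDK replay tests" matches the right chunk even when the body text alone is ambiguous across SDKs.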

2. Segment Documentation by Sub-products

To avoid LLMs confusing similar offerings, such as cloud versus open-source versions, it's also helpful to ensure that good documentation hierarchy extends to the product-level. We've seen that maintaining separate documentation for each sub-product can significantly improve the LLM's understanding of the context and the user's intent.

A great example of this is how Prisma divides their documentation into their three main offerings:

  • ORM: A Node.js and TypeScript ORM (core product)
  • Accelerate: A global database cache (newly released)
  • Pulse: Managed change data capture (early access)

[Image: Prisma documentation segmentation example]

Segmenting docs per product in some cases also allows for deploying separate LLMs for each product, which can be further optimized for the specific use case.
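Deploying per-product assistants can start with something as simple as routing each question to its own document index before retrieval. A minimal sketch using Prisma's product names, with a deliberately naive keyword router (production systems would typically use a classifier or embedding similarity instead):

```python
# Sketch: route a user question to a per-product document index before
# retrieval. Index contents and keyword lists are hypothetical.

INDEXES = {
    "orm": ["ORM docs chunk 1", "..."],
    "accelerate": ["Accelerate docs chunk 1", "..."],
    "pulse": ["Pulse docs chunk 1", "..."],
}

KEYWORDS = {
    "orm": ["schema", "migrate", "typescript", "orm"],
    "accelerate": ["cache", "accelerate", "edge"],
    "pulse": ["pulse", "change data capture", "cdc"],
}

def route(question: str, default: str = "orm") -> str:
    """Pick the sub-product index whose keywords best match the question."""
    q = question.lower()
    scores = {p: sum(kw in q for kw in kws) for p, kws in KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else default

print(route("How do I configure the global database cache?"))  # accelerate
```

Once the question lands in the right index, the retriever only ever sees chunks from that sub-product, which removes a whole class of cloud-versus-open-source confusion.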

3. Include Troubleshooting FAQs

Troubleshooting sections formatted as Q&A are an effective source for LLMs because they mirror the questions users actually ask, making it easier to match and answer similar questions.

OpenAI's documentation is a good example of this, particularly on their capabilities pages, where they have technical FAQs on the bottom of every page.

[Image: OpenAI Vision documentation FAQ example]

The format that works best for LLMs is a clear question followed by a concise answer. For instance, a well-structured FAQ section might look like this:

### [Common User Questions]

[Concise 1-2 Sentence Answer]

When looking at metrics for how frequently specific sources are used in LLM responses, we've seen that technical FAQs are often the most frequently used source.
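A nice side effect of this format is that FAQ pages are trivial to ingest as standalone question/answer pairs. A minimal sketch of such a parser (illustrative only, not kapa.ai's actual pipeline), assuming each question is a `###` heading followed by its answer:

```python
import re

# Sketch: split a markdown FAQ page into (question, answer) records so each
# pair can be indexed as its own retrieval unit. The FAQ content is made up.

def parse_faq(markdown: str):
    """Return (question, answer) tuples from '### question' sections."""
    pairs = []
    blocks = re.split(r"^### +", markdown, flags=re.MULTILINE)
    for block in blocks[1:]:  # blocks[0] is anything before the first heading
        question, _, answer = block.partition("\n")
        pairs.append((question.strip(), answer.strip()))
    return pairs

faq = """### Can I fine-tune the model on my data?
Not directly; retrieval augmentation is used instead.

### Is there a rate limit?
Yes, limits depend on your plan.
"""
print(parse_faq(faq)[1])
# ('Is there a rate limit?', 'Yes, limits depend on your plan.')
```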

4. Provide Self-contained Example Code Snippets

Including small, self-contained code snippets can be helpful, especially for products that rely on large and often complex SDKs or APIs.

Mixpanel, for example, uses code snippets effectively across its documentation, which contains lots of tracking and analytics implementation code. To increment numeric properties, it provides the following snippet to showcase the mixpanel.people.increment method:

[Image: Mixpanel documentation code snippet example]

Two other helpful tips for including code are to ensure that snippets (1) have a brief description above the code to clarify its purpose and usage, and (2) include comments within the code to explain the logic and functionality. Both help LLMs further understand the context and purpose of the snippet.
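Putting these tips together, a well-formed documentation snippet might look like this generic Python example (illustrative only, not taken from Mixpanel's docs): a one-line description above the code, all imports included, and comments explaining the logic.

```python
# Retry a flaky operation with exponential backoff.
# Self-contained: the one import it needs is included, so it runs as-is.
import time

def with_retries(operation, attempts=3, base_delay=0.01):
    """Call operation(), retrying on failure with exponential backoff."""
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries, surface the error
            # wait 1x, 2x, 4x, ... the base delay between attempts
            time.sleep(base_delay * (2 ** attempt))

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(with_retries(flaky))  # ok
```

An LLM that retrieves this snippet can quote it verbatim in an answer without having to guess at missing imports or unstated intent.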

5. Build a Community Forum

Although it relates less to the structure of your documentation, this guide would be incomplete without mentioning the importance of a community forum, which gives both developers and LLMs a source of help on topics not yet documented.

For example, CircleCI has an active and well-maintained community forum where users can ask questions and get help from other users and CircleCI staff.

[Image: CircleCI community forum example]

Similar to FAQs, a technical forum works well because it mirrors the questions users often ask. A forum also works well as an interim solution for questions not yet covered in your official docs.

Note that care should be taken when including forum content. Applying filters, such as only including responses marked resolved or accepted, helps ensure the relevancy of the content, and including links to the original forum thread ensures that authors are properly attributed.
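A minimal sketch of such a filter, assuming a hypothetical Discourse-style thread structure (the schema and URLs here are made up for illustration):

```python
# Sketch: filter forum threads down to accepted answers only, and pair each
# answer with a link back to the original thread for attribution.

def ingestible_posts(threads):
    """Yield (text, source_url) for accepted answers only."""
    for thread in threads:
        for post in thread["posts"]:
            if post.get("accepted"):
                yield post["body"], thread["url"]

threads = [
    {
        "url": "https://discuss.example.com/t/pipeline-timeout/42",
        "posts": [
            {"body": "Same problem here!", "accepted": False},
            {"body": "Increase no_output_timeout in your config.", "accepted": True},
        ],
    },
    {
        "url": "https://discuss.example.com/t/unanswered/43",
        "posts": [{"body": "Anyone?", "accepted": False}],
    },
]
print(list(ingestible_posts(threads)))
```

The unanswered thread contributes nothing, and every ingested answer keeps its source URL, so the assistant can link back to the author's original post.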

6. A Few More Practical Tips

In addition to the above, here are a few tactical tips for solving common documentation-related issues we've seen with LLMs:

  • Avoid storing docs in linked files: Keep relevant content directly in your docs rather than in linked files such as PDFs, as LLMs have a harder time parsing these.
  • Write text descriptions for images: Ensure information conveyed through screenshots is also described in text, as LLMs parse text more efficiently.
  • Provide OpenAPI specs for REST APIs: Providing structured OpenAPI specifications makes it possible to leverage custom parsers, which can improve formatting for LLMs.
  • Include example requests and responses: Include these in your API descriptions to give LLMs concrete examples of how to use your APIs.
  • Define specific acronyms and terms: Clarify all acronyms and specialized terminology within your documentation to aid LLM comprehension.
  • Include necessary imports in code examples: This ensures code examples can run without additional context.
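Two of these tips, providing OpenAPI specs and including example requests and responses, can be combined in a custom parser that renders each endpoint into an LLM-friendly section. A minimal sketch over a trimmed, hypothetical spec (the `x-example-*` fields are our own extension keys, not standard OpenAPI properties):

```python
# Sketch: render a tiny, hypothetical OpenAPI spec into markdown sections
# that pair each endpoint with an example request and response. Real specs
# are much larger; this shows the shape of the transformation only.

SPEC = {
    "paths": {
        "/v1/projects": {
            "get": {
                "summary": "List projects",
                "x-example-request": "GET /v1/projects",
                "x-example-response": '{"projects": [{"id": "p_1"}]}',
            }
        }
    }
}

def spec_to_markdown(spec):
    """Flatten each operation into a heading, summary, and examples."""
    lines = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            lines.append(f"## {method.upper()} {path}")
            lines.append(op["summary"])
            lines.append(f"Example request: `{op['x-example-request']}`")
            lines.append(f"Example response: `{op['x-example-response']}`")
    return "\n".join(lines)

print(spec_to_markdown(SPEC).splitlines()[0])  # ## GET /v1/projects
```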

These tips can significantly improve LLMs' ability to understand and accurately respond to user queries.


By following these guidelines, you can significantly enhance the usefulness of your technical documentation and sources for LLMs, ultimately improving the developer experience.

If you're interested in testing out an LLM on your technical sources, sign up here for a quick demo on your content, or reach out to the kapa team if you have questions about how to further optimize your technical documentation for LLMs.