
· 4 min read

Many forward-thinking technical companies like OpenAI, CircleCI, Temporal, Mixpanel, and Docker are adopting Large Language Models (LLMs) trained on their documentation to improve their developer experience.

In addition to enhancing documentation, these companies are increasingly integrating AI bots directly into their products as AI assistants. This approach allows users to get answers to their questions within the product itself, reducing the need to consult documentation or open support tickets.

This approach offers several key benefits:

  • Instant, in-product support: Users can get help without leaving their workflow.
  • Reduced support burden: Many common questions can be answered automatically.
  • Improved documentation: Insights from user queries help identify areas for improvement.

Let's look at how some leading companies are implementing this technology to enhance their user experience.

Showcase #1: CircleCI's In-Product AI Assistant

CircleCI, a continuous integration and continuous delivery platform, has implemented an AI-powered chatbot using kapa.ai with access to their comprehensive documentation, YouTube tutorials, community forums, and internal knowledge base.

By embedding the kapa.ai widget within the CircleCI application, users can get instant help while actively working on their pipelines, reducing workflow disruptions and improving efficiency.

CircleCI In App Example

Deployment strategy: Custom "Ask AI" drop-down button that opens the kapa widget on the CircleCI Web App.

Showcase #2: Monday.com Developer RAG AI Assistant

Monday.com, a work operating system (Work OS) platform, has implemented an AI-powered chatbot with access to their developer documentation and API specifications.

The Monday.com team recently embedded the kapa.ai chatbot within their developer center, providing real-time support and enhancing the overall developer experience. This AI assistant saves developers an estimated 15-30 minutes per query, while also helping identify documentation gaps and offering multilingual support.

"At monday.com, we embedded the AI bot within our developer center, minimizing context switches and boosting productivity." - Daniel Hai, AI & API Product Manager @ Monday.com

Monday.com In App Example

Deployment strategy: Custom "Ask AI" menu button that opens the kapa widget on the Monday.com Developer Center.

Showcase #3: Mapbox Account Page Support LLM

Mapbox, a leading provider of customizable maps and location services, has integrated kapa.ai's AI-powered chatbot directly into their account page. This implementation allows logged-in users to access immediate support without leaving their workflow.

The account page features the standard kapa.ai "Ask AI" widget in the bottom right corner, implemented using an off-the-shelf script tag. This widget provides users with instant access to Mapbox's extensive knowledge base, including information from developer documentation, API references, and other technical resources.
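
For reference, an off-the-shelf embed of this kind typically comes down to a single script tag added to the page. The sketch below wires that up from a TypeScript entry point; the script URL and data attributes are placeholders, not kapa's exact values - consult kapa's integration docs for those.

// Minimal sketch: inject the kapa "Ask AI" widget script when the page loads.
// The URL and data attributes are placeholders, not kapa's exact values.
function mountAskAiWidget(): void {
  const script = document.createElement("script");
  script.src = "<KAPA_WIDGET_SCRIPT_URL>";        // provided by kapa
  script.async = true;
  script.dataset.websiteId = "<KAPA_WEBSITE_ID>"; // identifies your kapa project
  script.dataset.projectName = "<PROJECT_NAME>";  // name shown in the widget
  document.body.appendChild(script);
}

window.addEventListener("DOMContentLoaded", mountAskAiWidget);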

This account page implementation, along with similar deployments across Mapbox's documentation and support sites, has contributed to a significant 20% reduction in monthly support tickets.

Mapbox In App Example

Deployment strategy: Off-the-shelf "Ask AI" chatbot that opens the kapa widget on the Mapbox Account page.


By integrating AI assistants directly into your products, you can significantly enhance user experience and reduce support burdens. If you're interested in implementing an AI assistant within your product, sign up here for a demo or reach out to the kapa team if you have questions about how to integrate AI assistants effectively.

kapa.ai is a platform that helps developer-facing companies build AI support bots. It ingests technical knowledge from various sources and uses Retrieval Augmented Generation (RAG) to provide accurate, contextual responses to user queries, improving customer experience and reducing support load. Trusted by over 100 companies, including OpenAI, Mapbox, Reddit, CircleCI, and Docker, kapa.ai is fully SOC 2 Type II certified.

· 2 min read

As one of the world's leading Shopify theme development agencies, Clean Canvas receives a high volume of support inquiries through their contact form. Looking to improve the customer experience while reducing the burden on their support team, they turned to kapa's Conversation API as a first line of defense, intelligently answering user questions before they submit the form.

The Build: A Two-Step Contact Form Enhanced with AI

To implement this, Clean Canvas split their contact form flow into two parts:

  1. The first part gathers the product/theme, subject, and message - the key info needed for the Conversation API.
  2. Upon clicking to proceed, the API generates a reply to the question which is displayed to the user. Here's an example of calling the API with a user's question:
curl -X GET \
'<KAPA_API_ENDPOINT>/query/v1?query=How+do+I+change+my+favicon?' \
-H 'X-API-TOKEN: <KAPA_API_TOKEN>'
  3. The user can then exit if their question was answered, or continue to submit the form.

Cleancanvas contact form AI enhancement

This lets the AI provide assistance in context, right where the customer's focus already is, so they get an immediate answer without having to wait for a support rep.
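
For illustration, the same call can be made from the form's front-end code. The sketch below reuses the endpoint and header from the curl example above; the shape of the JSON response (an "answer" field) is an assumption for the example, and in practice the API token would sit behind a small server-side proxy rather than in the browser.

// Sketch of step 2: fetch an AI-generated answer for the user's question
// before showing the final submit step. Endpoint and header match the curl
// example above; the "answer" response field is an assumed placeholder.
async function fetchAiAnswer(question: string): Promise<string> {
  const url = `<KAPA_API_ENDPOINT>/query/v1?query=${encodeURIComponent(question)}`;
  const res = await fetch(url, {
    headers: { "X-API-TOKEN": "<KAPA_API_TOKEN>" },
  });
  if (!res.ok) {
    throw new Error(`kapa API request failed with status ${res.status}`);
  }
  const data = await res.json();
  return data.answer; // assumed field name, see note above
}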

The Results: A 9% Drop in Ticket Volume

In the 28 days following the deployment of the AI-enhanced contact form, Clean Canvas observed a significant reduction in form submissions. The page-hit/submit ratio decreased from a previous range of 52-56% to just 44%.

This drop of roughly nine percentage points, with approximately 2,000 visitors interacting with the contact form during that month, equates to an estimated 174 support tickets deflected. That means about 44 fewer tickets per week for the support staff to address.

Clean Canvas Ticket Deflection Chart

Next Steps: UX Polish and Smarter Prompts

After this successful initial trial, Clean Canvas now plans to further refine the UX and polish the design of the AI-enhanced form. They're also exploring improvements to the AI prompts to make the responses even more helpful.

By intelligently utilizing AI to assist customers at key touchpoints, Clean Canvas was able to significantly reduce their support volume while still providing timely help to their users. Tools like kapa's Conversation API make it easier than ever to reap the benefits of AI for support and beyond.

· 6 min read

We're seeing lots of forward-thinking technical companies like OpenAI, CircleCI, Temporal, Mixpanel, and Docker adopt Large Language Models (LLMs) trained on their documentation to improve their developer experience.

At kapa.ai we have worked with over 80 technical teams, including those mentioned above, to implement these LLM-based systems for their developers. In the process, we've learned a lot about how to structure documentation for LLMs, and we want to share some best practices for others considering this approach.

1. Embrace Page Structure and Hierarchy

LLMs excel at navigating structured content and rely on context hints to understand the broader picture. A clear hierarchy of headings and subheadings on a page helps LLMs understand the relationships between different sections of your documentation.

Temporal Documentation Structure Example

A great example of this is how Temporal structures its documentation for its SDKs. Take the "Add a replay test" page within the Java SDK, which covers an important feature related to workflow execution. The hierarchy of the documentation is as follows:

- Development
- Java SDK
- Develop for durability
- Add a replay test
- ...

This structure allows an LLM to more effectively navigate and understand the context when answering questions related to replay tests within the Java SDK. This is especially important as replay tests are also used in other SDKs.
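
To make this concrete, here is a rough sketch of how such a hierarchy might be expressed in a Docusaurus-style sidebars.ts file. The tooling choice and the doc IDs are assumptions for illustration; this is not Temporal's actual configuration.

// Illustrative sidebar hierarchy (not Temporal's real config): the nesting
// gives both readers and LLMs the full path from "Development" down to
// "Add a replay test" inside the Java SDK.
const sidebars = {
  docs: [
    {
      type: "category",
      label: "Development",
      items: [
        {
          type: "category",
          label: "Java SDK",
          items: [
            {
              type: "category",
              label: "Develop for durability",
              items: ["java/add-a-replay-test"], // doc ID is a placeholder
            },
          ],
        },
      ],
    },
  ],
};

export default sidebars;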

2. Segment Documentation by Sub-products

To avoid LLMs confusing similar offerings, such as cloud versus open-source versions, it's also helpful to ensure that good documentation hierarchy extends to the product level. We've seen that maintaining separate documentation for each sub-product can significantly improve the LLM's understanding of the context and the user's intent.

A great example of this is how Prisma divides their documentation into their three main offerings:

  • ORM: A Node.js and TypeScript ORM (core product)
  • Accelerate: A global database cache (newly released)
  • Pulse: Managed change data capture (early access)

Prisma Documentation Segmentation Example

Segmenting docs per product in some cases also allows for deploying separate LLMs for each product, which can be further optimized for the specific use case.

3. Include Troubleshooting FAQs

Troubleshooting sections formatted as Q&A are an effective source for LLMs as they mirror the questions users often ask, making it easier for LLMs to understand and respond to similar questions.

OpenAI's documentation is a good example of this, particularly on their capabilities pages, where they have technical FAQs at the bottom of every page.

OpenAI Vision Documentation Example

The format that works best for LLMs is a clear question followed by a concise answer. For instance, a well-structured FAQ section might look like this:

### [Common User Questions]

[Concise 1-2 Sentence Answer]

When looking at metrics for how frequently specific sources are used in LLM responses, we've seen that technical FAQs are often the most frequently used source.

4. Provide Self-contained Example Code Snippets

Including small, self-standing code snippets can be helpful, especially for products that rely on large and often complex SDKs or APIs.

Mixpanel, for example, uses code snippets effectively across their documentation, which contains lots of tracking and analytics implementation code. To show how to increment numeric properties, they provide the following snippet showcasing the mixpanel.people.increment method:

Mixpanel Documentation Example

Two other helpful tips for including code are to ensure that snippets (1) have a brief description above the code to clarify its purpose and usage, and (2) include comments within the code to explain the logic and functionality. Both of these help LLMs further understand the context and purpose of the code snippet.
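
As a hedged illustration of both tips, the snippet below pairs a one-line description with inline comments around Mixpanel's people.increment call; the project token, user ID, and property name are placeholders.

// Increment a numeric profile property each time a user finishes a game.
import mixpanel from "mixpanel-browser";

mixpanel.init("<MIXPANEL_PROJECT_TOKEN>"); // placeholder project token
mixpanel.identify("user-123");             // placeholder user ID

// people.increment adds the given amount to a numeric profile property,
// creating the property if it does not exist yet.
mixpanel.people.increment("games_played", 1);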

5. Build a Community Forum

Although less related to the structure of your documentation, this guide would be incomplete without mentioning the importance of building a community forum as a source for both developers and LLMs to get help on undocumented topics.

For example, CircleCI has an active and well-maintained community forum where users can ask questions and get help from other users and CircleCI staff.

CircleCI Community Forum Example

Similar to FAQs, a technical forum works well because it mirrors the questions users often ask. A forum also works well as an interim solution for questions not yet covered in your official docs.

Note that care should be taken when including forum content. Applying filters, such as only including responses marked resolved or accepted, helps ensure the relevancy of the content, and including links to the original forum thread ensures authors are properly attributed.
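
As a rough sketch of that kind of filtering (the thread shape here is hypothetical, not any real forum API), one approach is to keep only threads with an accepted answer and carry the thread URL along for attribution:

// Hypothetical forum thread shape - not a real forum API.
interface ForumThread {
  title: string;
  url: string;             // kept so the answer can link back to its source
  acceptedAnswer?: string; // present only when a reply was marked accepted
}

// Keep only resolved threads and pair each answer with its source URL so
// the original author is attributed in the ingested content.
function selectResolvedThreads(threads: ForumThread[]) {
  return threads
    .filter((t) => t.acceptedAnswer !== undefined)
    .map((t) => ({
      content: `${t.title}\n\n${t.acceptedAnswer}`,
      source: t.url,
    }));
}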

6. A Few More Practical Tips

In addition to the above, here are a few tactical tips for solving common documentation-related issues we've seen with LLMs:

  • Avoid storing docs in files: Keep relevant content directly in your docs rather than in linked files such as PDFs, as LLMs have a harder time parsing these.
  • Write text descriptions for images: Ensure information conveyed through screenshots is also described in text, as LLMs parse text more efficiently.
  • Provide OpenAPI specs for REST APIs: Providing structured OpenAPI specifications makes it possible to leverage custom parsers, which can improve formatting for LLMs.
  • Include example requests and responses: Include these in your API descriptions to give LLMs concrete examples of how to use your APIs (see the sketch after this list).
  • Define specific acronyms and terms: Clarify all acronyms and specialized terminology within your documentation to aid LLM comprehension.
  • Include necessary imports in code examples: This ensures code examples can run without additional context.
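
To illustrate the request/response tip under stated assumptions (the endpoint, token, and response fields below are made up for the example), an API description might pair a snippet like this with its expected response, with imports included so it runs without extra context:

// Illustrative only: endpoint, token, and response fields are hypothetical.
import { strict as assert } from "node:assert";

async function getProject(): Promise<void> {
  // Example request (global fetch is available in Node 18+ and browsers)
  const res = await fetch("https://api.example.com/v1/projects/42", {
    headers: { Authorization: "Bearer <API_TOKEN>" }, // placeholder token
  });
  assert.equal(res.status, 200);

  // Example response body:
  // { "id": 42, "name": "demo", "status": "active" }
  console.log(await res.json());
}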

These tips can significantly improve LLMs' ability to understand and accurately respond to user queries.


By following these guidelines, you can significantly enhance the usefulness of your technical documentation and sources for LLMs, ultimately improving the developer experience.

If you're interested in testing out an LLM on your technical sources, sign up here for a quick demo on your content, or reach out to the kapa team if you have questions about how to further optimize your technical documentation for LLMs.