
Running DocsGPT with Dragonfly: Adding AI to Your Documentation

Learn how to integrate DocsGPT with Dragonfly for efficient documentation queries, and see how AI enhances your project documentation.

June 18, 2024


Introduction

It's a fact that developers spend an enormous amount of time in documentation, and we've found that Dragonfly community members are no different. However, we often get questions from community members that are answered in some detail in the documentation. And according to colleagues across the industry and our own experience, this issue is not unique to Dragonfly. Whether it's because of a complicated information structure or simply a TL;DR situation, developers often have trouble finding what they need quickly in documentation.

Community Member:
How many backups does '--snapshot_cron' store?
Do I need to manually remove them?

Answer:
By default, Dragonfly does not automatically remove old snapshots.
You can choose to overwrite the same file on every snapshot by setting '--dbfilename=dump', without a timestamp in the name.

Now, we love engaging with our Dragonfly community as much as any other tech team, but surely there's a way to make docs easier to use. We propose that AI, with the power of large language models (LLMs), could be a great solution. While AI still has a ways to go before it can fulfill some of the loftier goals talked about on a daily basis, gathering specific information from a vat of data is exactly in the wheelhouse of LLMs as they exist today.

So when we came across DocsGPT, we were excited about the possibilities! DocsGPT is a cutting-edge open-source solution designed to streamline the process of finding information within the documentation of a project. By leveraging the power of GPT models, DocsGPT allows developers to easily ask questions about the documentation and receive accurate, helpful answers. DocsGPT typically uses MongoDB and Celery as its backend storage and task queue. Celery, in turn, can use Dragonfly as the message broker and for result storage.
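For context, the Celery side of this needs no Dragonfly-specific driver. Because Dragonfly speaks the Redis wire protocol, the standard redis:// URL scheme works as-is. Here's a minimal configuration sketch; the app name and database numbers are illustrative, not DocsGPT's actual settings:

```python
from celery import Celery

# Point Celery at Dragonfly using ordinary redis:// URLs.
# No Redis-specific code changes; only the server behind the port differs.
app = Celery(
    "docsgpt",                           # illustrative app name
    broker="redis://localhost:6379/0",   # Dragonfly as the message broker
    backend="redis://localhost:6379/1",  # Dragonfly as result storage
)
```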

In other words, we thought we might be able to use Dragonfly to run DocsGPT, which would in turn make Dragonfly's documentation easier to use. And with that nerdy inception, we set out to experiment and find the best way to do this. If it worked, this process could be replicated seamlessly for other products' documentation as well, and we could write a post explaining how to do it.

Spoiler: It worked!


Setting up DocsGPT with Dragonfly

We often say that you need no code changes to use Dragonfly, and that's indeed true here. The setup is straightforward and requires only minimal configuration changes. Here's how you can get started by following the simple steps below.

1. Clone the DocsGPT Project

Begin by cloning the DocsGPT repository from GitHub to your local machine. Open your terminal and run:

git clone git@github.com:arc53/DocsGPT.git
cd DocsGPT

2. Modify the docker-compose.yaml File

Next, you need to update the docker-compose.yaml file to replace the Redis image with the Dragonfly image. Open the file in your preferred text editor and find the Redis service section. Replace the Redis image line with the following:

redis:
  image: docker.dragonflydb.io/dragonflydb/dragonfly # Replace with Dragonfly image.
  ports:
    - 6379:6379

3. Run the Setup Script

Once you've made the necessary modifications, run the setup script (which works for Linux and macOS) to initialize the DocsGPT environment. If you're using Windows, follow the instructions here. For simplicity, we can choose the DocsGPT public API, which is simple and free, during the setup process. In your terminal, execute:

./setup.sh

# Do you want to:
# 1. Use DocsGPT public API (simple and free)
# 2. Download the language model locally (12GB)
# 3. Use the OpenAI API (requires an API key)
# Enter your choice (1, 2 or 3): 1
# ...

4. Start the Services

After running the setup script, Docker will pull the required images and start the services. This might take a few moments. Once completed, you should have a fully functional DocsGPT instance running locally. That's it! We changed nothing in the codebase except the Docker image URL, and DocsGPT is now using Dragonfly as the message broker and result storage for Celery.
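If you want to double-check that the container behind port 6379 really speaks the Redis protocol (which Dragonfly implements), a hand-encoded RESP PING is enough. This sketch assumes the services from the previous step are running; the host, port, and function name are illustrative:

```python
import socket

def resp_ping(host: str = "localhost", port: int = 6379) -> str:
    """Send a RESP-encoded PING and return the raw reply."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(b"*1\r\n$4\r\nPING\r\n")  # RESP array: ["PING"]
        return sock.recv(64).decode()           # a live server replies "+PONG\r\n"
```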

Training DocsGPT with Dragonfly Integrations Docs

Let's try training DocsGPT with the Dragonfly documentation. Specifically, we will upload the Dragonfly Integrations documentation (which is open-sourced here) to DocsGPT and ask questions about it. First, navigate to the sidebar and find the Source Docs option.

Click on the upload icon next to the Source Docs option. Browse and upload the markdown files containing the Dragonfly Integrations documentation.

Click on the Train button to begin the training process. This might take some time, depending on the size of the documentation. We can monitor the training progress, and once it's completed, click the Finish button. Now, the Dragonfly Integrations documentation is ready to use.

Go to New Chat and, from the sidebar, select the documentation we just uploaded. By asking a question about BullMQ, for instance, DocsGPT will provide effective answers based on what it has learned from the Dragonfly documentation.

All the steps above are demonstrated in the 60-second video below:

