Build a Chatbot using Regolo.ai and Flowise

Introduction to Flowise and Installation

Flowise is a platform that allows you to build and deploy AI agents visually. It provides an intuitive interface for creating intelligent automation with ease. In this guide, we will walk you through the process of installing Flowise on your local machine using Docker, which is the quickest and most reliable method to achieve a complete and functional installation in just a few minutes.

To begin, download the Flowise repository using the following command:

git clone git@github.com:FlowiseAI/Flowise.git

Once the repository is cloned, navigate into the downloaded folder to start building the Docker image using:

docker build --no-cache -t flowise .

After the build process is complete, you can launch Flowise with a simple command:

docker run -d --name flowise -p 3000:3000 flowise

Once the container is running, you can access the Flowise interface in your browser at http://localhost:3000.

When you first access the newly created instance, you will be prompted to complete an initial setup, entering the credentials that will be stored in Flowise's database.

Flowise's first version builds its automations and agents on the JavaScript versions of LangChain and LangGraph; the platform now also offers a second version that relies less on these external libraries. In this brief tutorial, we will demonstrate how to create a QA system specifically for Regolo.ai’s documentation. Along the way, we’ll show how to connect a data source, embed it, and send it to an LLM, all through Regolo.ai’s APIs.

The first step involves creating a Document Store to save all documents retrieved from our documentation site. You can either upload a markdown document or choose from the numerous available sources, both external and document-based.

After setting up the document store, select the document loader. For this task, we will use the Cheerio web scraper. Start by performing an initial scan from the root of the documentation site, following the links it finds so that all related pages are scanned. Remember to use the HtmlToMarkdown text splitter, since our site is in HTML. Once this is done, you can save and begin creating the actual flow.
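Under the hood, this step boils down to two things: collecting the links to follow from each page, and stripping the HTML down to text before splitting. Here is a minimal, self-contained sketch of that idea using only Python's standard library; the sample HTML is an invented stand-in for a real documentation page, not Flowise's actual implementation:

```python
# Sketch of what a web-scraping document loader does per page:
# gather hrefs to crawl next, and extract the visible text that
# will later be converted to markdown and split into chunks.
from html.parser import HTMLParser

class LinkAndTextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []       # hrefs found on the page, to be crawled next
        self.text_parts = []  # visible text, the raw material for chunks

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text_parts.append(data.strip())

# Invented sample page standing in for a real docs page
sample = '<h1>Regolo Docs</h1><p>See <a href="/quickstart">quickstart</a></p>'
parser = LinkAndTextExtractor()
parser.feed(sample)
print(parser.links)       # ['/quickstart']
print(parser.text_parts)  # ['Regolo Docs', 'See', 'quickstart']
```

The real loader does much more (deduplication, depth limits, markdown conversion), but the crawl-then-extract loop is the core of it.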

Setting Up the Chatflow

With the foundational elements in place, we can now proceed to configure the chatflow that will enable seamless interaction with Regolo.ai’s capabilities. Begin by selecting “New Chatflow” in the Flowise interface, which opens up a workspace where you can design the architecture of your flow. In this workspace, position a “Document Store” component and select the previously created store, “Regolo Docs,” from the dropdown menu. This setup ensures that all relevant documents are readily accessible for processing.

Next, add an “OpenAI Embedding Custom” component, followed by a “Chat OpenAI Custom” component. These elements will handle the embedding and chat functionalities, respectively. To efficiently manage data vectors, incorporate an “In Memory Vector Store” into your setup. Conclude this arrangement by adding a “Retrieval QA Chain” component, which will facilitate the flow of questions and answers. Interconnect these components to mirror the structure depicted in the following image.
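To make the retrieval side of this wiring concrete, here is a rough sketch of what the In Memory Vector Store and Retrieval QA Chain do together: rank the stored documents by cosine similarity against the query embedding and hand the best matches to the LLM as context. The three-dimensional vectors and document titles below are toy stand-ins for real embeddings:

```python
# Toy retrieval step: cosine similarity over an in-memory store,
# then assemble the top matches into an LLM prompt.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# (title, embedding) pairs; real embeddings have hundreds of dimensions
store = [
    ("How to get an API key", [0.9, 0.1, 0.0]),
    ("Billing overview",      [0.1, 0.9, 0.0]),
    ("Model list",            [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=2):
    ranked = sorted(store, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

context = retrieve([0.8, 0.2, 0.1])
prompt = "Answer using only this context:\n" + "\n".join(context)
print(context)  # ['How to get an API key', 'Billing overview']
```

The chain then sends `prompt` plus the user's question to the chat model, which is what keeps answers grounded in the document store.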

The next step involves configuring the language models to access Regolo.ai and select your preferred models. Start by entering your API key in the settings of the custom embedder, ensuring the key is saved for future connections, and label it as “Regolo.ai.” Choose “gte-Qwen2” as the model for the embedder. In the Additional Parameters, check the “strip new lines” option to format inputs correctly, and enter https://api.regolo.ai/v1 as the base URL before saving your configurations.
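For reference, the request this embedder ends up sending can be sketched as follows. The payload shape follows the OpenAI embeddings format, which the Regolo.ai endpoint is assumed to mirror, and the strip-new-lines option simply replaces newlines with spaces before the text is sent; the helper name is ours, not part of any library:

```python
# Sketch of an embeddings request against Regolo.ai's
# OpenAI-compatible endpoint (payload shape is an assumption
# based on the OpenAI embeddings API).
BASE_URL = "https://api.regolo.ai/v1"

def build_embedding_request(text, model="gte-Qwen2", strip_new_lines=True):
    if strip_new_lines:
        # the "strip new lines" option: newlines become spaces
        text = text.replace("\n", " ")
    return {
        "url": f"{BASE_URL}/embeddings",
        "json": {"model": model, "input": text},
    }

req = build_embedding_request("Regolo.ai\ndocumentation")
print(req["json"]["input"])  # 'Regolo.ai documentation'
```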

Similarly, configure the Chat OpenAI Custom component by entering your API key and saving the credentials. Select your preferred model from the options available on the Regolo Models page; for this example, “Llama-3.1-8B-Instruct” is used. In the Additional Parameters, set https://api.regolo.ai/v1 as the base URL and save.
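The equivalent chat request can be sketched the same way. Again, the path and payload follow the OpenAI chat-completions format that the base URL is assumed to serve, and `YOUR_API_KEY` is a placeholder for your actual Regolo.ai key:

```python
# Sketch of the chat request the Chat OpenAI Custom component issues
# once pointed at Regolo.ai (payload shape assumed OpenAI-compatible).
BASE_URL = "https://api.regolo.ai/v1"

def build_chat_request(question, context="", model="Llama-3.1-8B-Instruct"):
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {"Authorization": "Bearer YOUR_API_KEY"},  # placeholder key
        "json": {
            "model": model,
            "messages": [
                {"role": "system",
                 "content": f"Answer using this context:\n{context}"},
                {"role": "user", "content": question},
            ],
        },
    }

req = build_chat_request("How do I authenticate?")
print(req["url"])  # https://api.regolo.ai/v1/chat/completions
```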

Upserting Documents into the Vector Database and Finalizing the Flow

With the chatflow setup complete, the next step is to transform our documents into vectors and insert them into the vector database. To begin this process, save your flow using the designated save button, ensuring to provide it with a meaningful name for future reference. Once the flow is saved, a green “Upsert DB” button will appear in the upper right corner, marked with a database icon. Clicking this button initiates the upserting operation, which may take several minutes depending on the size of your document store.
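If you prefer to script this step, Flowise also exposes a per-chatflow upsert endpoint over HTTP. The path below follows Flowise's documented upsert API, but verify it against your version; the chatflow ID is a placeholder you would copy from your own instance:

```python
# Sketch of triggering the upsert over HTTP instead of the green button.
# Endpoint path assumed from Flowise's API docs; verify on your version.
FLOWISE_HOST = "http://localhost:3000"
CHATFLOW_ID = "your-chatflow-id"  # placeholder from your own instance

def build_upsert_request(host=FLOWISE_HOST, chatflow_id=CHATFLOW_ID):
    return {
        "method": "POST",
        "url": f"{host}/api/v1/vector/upsert/{chatflow_id}",
        # an empty body reuses the loader configuration saved in the flow
        "json": {},
    }

req = build_upsert_request()
print(req["url"])  # http://localhost:3000/api/v1/vector/upsert/your-chatflow-id
```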

Upon completion, a popup will confirm the number of documents successfully inserted and provide a preview of the sources. After verifying this information, you can save the flow again and proceed to the actual testing phase.

Open the chat window by clicking the purple button in the top right corner and ask a question about the content of your documentation. You will notice that the bot now responds accurately, citing its sources and keeping the conversation focused on the topics covered by your documentation. This ensures precise, reliable answers grounded in your own content.
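The same flow can also be queried programmatically through Flowise's prediction endpoint, which is how you would wire the bot into your own application; as before, the chatflow ID is a placeholder you would copy from your instance:

```python
# Sketch of querying the finished chatflow over HTTP via Flowise's
# prediction endpoint (path from the Flowise API docs).
FLOWISE_HOST = "http://localhost:3000"

def build_prediction_request(question, chatflow_id="your-chatflow-id"):
    return {
        "method": "POST",
        "url": f"{FLOWISE_HOST}/api/v1/prediction/{chatflow_id}",
        "json": {"question": question},
    }

req = build_prediction_request("Where do I find my API key?")
print(req["json"])  # {'question': 'Where do I find my API key?'}
```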

Advice and Conclusions

In this brief tutorial, we’ve explored just one of the many possible flows enabled by Flowise, using one of the simplest configurations. The platform’s potential extends far beyond this. For example, you could swap the in-memory vector store for more powerful options like Qdrant or Pinecone to get faster response times and handle larger volumes of documentation. You could also build a conditional response flow, in which one agent handles questions already answered in the documentation, while others are routed to an agent that produces more creative answers, or even to an email inbox for human-in-the-loop handling. Flowise also supports automated agent flows and integrations with many third-party services, greatly expanding what it can do.

Another noteworthy aspect is that once you have created a chat flow or an agent flow, if Flowise is running on a production server, you can easily expose the chat or agent by embedding it in a webpage or application with a simple click. This is facilitated by Flowise’s ability to manage and offer a public UI on demand, which is also highly customizable in terms of colors and icons.

After spending some hours exploring Flowise, I can confidently say it is worth delving deeper into. Integrating it with open models and privacy-oriented services like Regolo.ai is straightforward and quick. This makes Flowise a powerful, versatile tool for anyone looking to get the most out of artificial intelligence for managing documentation and automating processes.