Integrating an External API with a Chatbot Application using LangChain and Chainlit | by Tahreem Rasul | Feb, 2024


A practical guide to integrating external APIs for advanced interactions with a chatbot application using LangChain and Chainlit

In this tutorial, we will see how we can integrate an external API with a custom chatbot application. In my earlier articles on building a custom chatbot application, we covered the basics of creating a chatbot with specific functionalities using LangChain and OpenAI, and building the web application for our chatbot using Chainlit.

Workflow of an API-Integrated Chatbot — Image by author

If you’re new to this series, I recommend checking out my previous articles for a detailed step-by-step guide:

This tutorial will focus on enhancing our chatbot, Scoopsie, an ice-cream assistant, by connecting it to an external API. You can think of an API as an accessible way to extract and share data within and across programs. Users can make requests to an API to fetch or send data, and the API responds with some information. We’ll connect Scoopsie to an API to fetch information from a fictional ice-cream store and use those responses to provide information. For most chatbot applications, linking your custom chatbot to an external API can be incredibly useful and, in some cases, even necessary.

Here’s a quick recap of where we left off: currently our chatbot.py uses the LLMChain to query OpenAI’s GPT-3.5 model to answer a user’s ice-cream related queries:

import chainlit as cl
from langchain_openai import OpenAI
from langchain.chains import LLMChain
from prompts import ice_cream_assistant_prompt_template
from langchain.memory.buffer import ConversationBufferMemory

from dotenv import load_dotenv

load_dotenv()

@cl.on_chat_start
def query_llm():
    llm = OpenAI(model='gpt-3.5-turbo-instruct',
                 temperature=0)
    conversation_memory = ConversationBufferMemory(memory_key="chat_history",
                                                   max_len=50,
                                                   return_messages=True,
                                                   )
    llm_chain = LLMChain(llm=llm,
                         prompt=ice_cream_assistant_prompt_template,
                         memory=conversation_memory)
    cl.user_session.set("llm_chain", llm_chain)

@cl.on_message
async def query_llm(message: cl.Message):
    llm_chain = cl.user_session.get("llm_chain")
    response = await llm_chain.acall(message.content,
                                     callbacks=[
                                         cl.AsyncLangchainCallbackHandler()])

    await cl.Message(response["text"]).send()

If you haven’t set up a conda environment for the project yet, you can go ahead and create one. Remember that Chainlit requires python>=3.8.

conda create --name chatbot_langchain python=3.10

Activate your environment with:

conda activate chatbot_langchain

To install all dependencies, run:

pip install -r requirements.txt

We’ll begin by creating an API to connect to Scoopsie. This API represents a fictional ice-cream store and allows users to retrieve the store’s menu, along with other information such as customizations, user reviews and special offers. We’ll utilize Flask, a Python framework for web development, to encode the above information in different API endpoints. These include:

  1. /menu: a GET endpoint to retrieve the menu of flavors and toppings.
  2. /customizations: a GET endpoint to retrieve the customizations.
  3. /special-offers: a GET endpoint to retrieve the special offers.
  4. /customer-reviews: a GET endpoint to retrieve the customer reviews.

To keep Scoopsie focused on providing information rather than handling transactions or processing orders, we’ll limit our current scope to these informational endpoints. However, you can expand this API to include other endpoints, such as a POST endpoint to allow the user to submit an order, or other GET endpoints.
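If you do later add a POST endpoint for orders, the handler will need to validate the incoming payload before accepting it. Below is a minimal sketch of what that validation logic might look like; the `validate_order` helper and its field names are hypothetical, not part of the store API built in this tutorial:

```python
def validate_order(payload: dict) -> tuple[bool, str]:
    """Check that a hypothetical /order payload names a flavor and a quantity."""
    # Flavor must be present and a non-empty string
    if not isinstance(payload.get("flavorName"), str) or not payload["flavorName"]:
        return False, "flavorName must be a non-empty string"
    # Quantity must be a positive integer
    quantity = payload.get("quantity")
    if not isinstance(quantity, int) or quantity < 1:
        return False, "quantity must be a positive integer"
    return True, "ok"

# Example payloads
ok, msg = validate_order({"flavorName": "Chocolate", "quantity": 2})
bad, bad_msg = validate_order({"flavorName": "Chocolate"})
```

A Flask route for this would simply call such a helper on `request.get_json()` and return a 400 status with the error message when the check fails.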

Step 1

Let’s create a Python script named data_store.py to store static data like the menu, special offers, customer reviews, and customization options. Here’s how we can structure it:

# Example menu, special offers, customer reviews, and customizations
menu = {
    "flavors": [
        {"flavorName": "Strawberry", "count": 50},
        {"flavorName": "Chocolate", "count": 75}
    ],
    "toppings": [
        {"toppingName": "Hot Fudge", "count": 50},
        {"toppingName": "Sprinkles", "count": 2000},
        {"toppingName": "Whipped Cream", "count": 50}
    ]
}
special_offers = {
    "offers": [
        {"offerName": "Two for Tuesday", "details": "Buy one get one free on all ice cream flavors every Tuesday."},
        {"offerName": "Winter Wonderland Discount", "details": "25% off on all orders above $20 during the winter season."}
    ]
}
customer_reviews = {
    "reviews": [
        {"userName": "andrew_1", "rating": 5, "comment": "Loved the chocolate flavor!"},
        {"userName": "john", "rating": 4, "comment": "Great place, but always crowded."},
        {"userName": "allison", "rating": 5, "comment": "Love the ice-creams and Scoopsie is super helpful!"}
    ]
}
customizations = {
    "options": [
        {"customizationName": "Sugar-Free", "details": "Available for most flavors."},
        {"customizationName": "Extra Toppings", "details": "Choose as many toppings as you want for an extra $5!"}
    ]
}

You can adjust the above script to better fit your specific needs. These examples show potential attributes for each category. In practical applications, storing this data in a database for dynamic retrieval is more suitable.
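As a sketch of that database-backed approach, here is how the flavor inventory could move into SQLite and be rebuilt into the same dictionary structure on demand. The table and column names are illustrative, not part of the tutorial's code:

```python
import sqlite3

# In-memory database for illustration; a real app would use a file or a DB server
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flavors (flavor_name TEXT PRIMARY KEY, count INTEGER)")
conn.executemany(
    "INSERT INTO flavors VALUES (?, ?)",
    [("Strawberry", 50), ("Chocolate", 75)],
)

# Rebuild the same structure the static data_store.py dictionary had
rows = conn.execute(
    "SELECT flavor_name, count FROM flavors ORDER BY flavor_name"
).fetchall()
menu_flavors = [{"flavorName": name, "count": count} for name, count in rows]
```

An endpoint could then run this query per request, so menu changes show up without redeploying the app.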

Step 2

Let’s set up our Flask application in a file named ice_cream_store_app.py, where we’ll import the data from data_store.py. We can start by importing the required libraries and initializing the Flask application:

from flask import Flask, jsonify
from data_store import menu, special_offers, customer_reviews, customizations

app = Flask(__name__)

Step 3

Now, let’s configure the API endpoint functions. In Flask, these functions respond directly to web requests without needing explicit arguments, thanks to Flask’s routing mechanism. These functions are designed to:

  • automatically handle requests without direct argument passing, aside from the implicit self for class-based views, which we’re not using here.
  • return a tuple with two elements:
    – a dict converted to JSON format via jsonify()
    – an HTTP status code, typically 200 to indicate success.

Below are the endpoint functions:

@app.route('/menu', methods=['GET'])
def get_menu():
    """
    Retrieves the menu data.
    Returns:
        A tuple containing the menu data as JSON and the HTTP status code.
    """
    return jsonify(menu), 200

@app.route('/special-offers', methods=['GET'])
def get_special_offers():
    """
    Retrieves the special offers data.
    Returns:
        A tuple containing the special offers data as JSON and the HTTP status code.
    """
    return jsonify(special_offers), 200

@app.route('/customer-reviews', methods=['GET'])
def get_customer_reviews():
    """
    Retrieves customer reviews data.
    Returns:
        A tuple containing the customer reviews data as JSON and the HTTP status code.
    """
    return jsonify(customer_reviews), 200

@app.route('/customizations', methods=['GET'])
def get_customizations():
    """
    Retrieves the customizations data.
    Returns:
        A tuple containing the customizations data as JSON and the HTTP status code.
    """
    return jsonify(customizations), 200

For each function above, jsonify() is used to turn Python dictionaries into JSON format, which is then returned with a 200 status code for successful queries.
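Under the hood, jsonify() does roughly what the standard library’s json.dumps does, plus wrapping the string in a response with the application/json content type. A stdlib-only sketch of that serialization step (this mimics, rather than uses, Flask’s actual response object):

```python
import json

menu = {"flavors": [{"flavorName": "Strawberry", "count": 50}]}

# jsonify(menu) serializes the dict to a JSON string like this...
body = json.dumps(menu)
# ...and wraps it in a response carrying this header and, on success, status 200
headers = {"Content-Type": "application/json"}
status = 200

# The client can recover the original structure from the body
assert json.loads(body) == menu
```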

Step 4

Finally, let’s add the following code to our ice_cream_store_app.py script:

if __name__ == '__main__':
    app.run(debug=True)

The API can be started by running the following command in the terminal:

python ice_cream_store_app.py

Starting the Flask Server for the Ice-Cream Store API — Image by Author

Once the application is running, Scoopsie’s custom API will be accessible at http://127.0.0.1:5000/. To test the various endpoints, you can use tools like Postman or use a web browser to view a specific endpoint: http://127.0.0.1:5000/{endpoint_name}.

Ice-Cream Store API Endpoints — Image by Author
Ice-Cream Store API Endpoints in Action — by Author

Chains in LangChain simplify complex tasks by executing them as a sequence of simpler, connected operations. These chains typically incorporate elements like LLMs, PromptTemplates, output parsers, or external third-party APIs, which we’ll be focusing on in this tutorial. I dive into LangChain’s Chain functionality in greater detail in my first article in the series, which you can access here.

Previously, we utilized LangChain’s LLMChain for direct interactions with the LLM. Now, to extend Scoopsie’s capabilities to interact with external APIs, we’ll use the APIChain. The APIChain is a LangChain module designed to format user inputs into API requests. This will enable our chatbot to send requests to and receive responses from an external API, broadening its functionality.

The APIChain can be configured to handle different HTTP methods (GET, POST, PUT, DELETE, etc.), set request headers, and manage the body of the request. It also supports JSON payloads, which are commonly used in RESTful API communications.

Step 1

Let’s first import LangChain’s APIChain module, along with the other required modules, in our chatbot.py file. This script will host all our application logic. You can set up the necessary environment variables, such as the OPENAI_API_KEY, in a .env file, which can be accessed by the dotenv Python library.

import chainlit as cl
from langchain_openai import OpenAI
from langchain.chains import LLMChain, APIChain
from langchain.memory.buffer import ConversationBufferMemory
from dotenv import load_dotenv

load_dotenv()

Step 2

For the APIChain class, we need the external API’s documentation in string format to access endpoint details. This documentation should outline the API’s endpoints, methods, parameters, and expected responses. This aids the LLM in formulating API requests and parsing the responses. It’s helpful to define this information as a dictionary and then convert it into a string for later usage.

Let’s create a new Python script called api_docs.py and add the docs for our fictional store’s API:

import json

scoopsie_api_docs = {
    "base_url": "http://127.0.0.1:5000/",
    "endpoints": {
        "/menu": {
            "method": "GET",
            "description": "Retrieve the menu of flavors and customizations.",
            "parameters": None,
            "response": {
                "description": "A JSON object containing available flavors and toppings along with their counts.",
                "content_type": "application/json"
            }
        },
        "/special-offers": {
            "method": "GET",
            "description": "Retrieve current special offers and discounts.",
            "parameters": None,
            "response": {
                "description": "A JSON object listing the current special offers and discounts.",
                "content_type": "application/json"
            }
        },
        "/customer-reviews": {
            "method": "GET",
            "description": "Retrieve customer reviews for the ice cream store.",
            "parameters": None,
            "response": {
                "description": "A JSON object containing customer reviews, ratings, and comments.",
                "content_type": "application/json"
            }
        },
        "/customizations": {
            "method": "GET",
            "description": "Retrieve available ice cream customizations.",
            "parameters": None,
            "response": {
                "description": "A JSON object listing available customizations like toppings and sugar-free options.",
                "content_type": "application/json"
            }
        }
    }
}

# Convert the dictionary to a JSON string
scoopsie_api_docs = json.dumps(scoopsie_api_docs, indent=2)

I’ve formatted our custom API’s documentation into a Python dictionary called scoopsie_api_docs. This dictionary includes the API’s base URL and details our four endpoints under the endpoints key. Each endpoint lists its HTTP method (all GET for us), a concise description, accepted parameters (none for these endpoints), and the expected response format: a JSON object with relevant data. The dictionary is then turned into a JSON string using json.dumps, indented by 2 spaces for readability.
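Since the docs travel to the LLM as plain text, it’s worth sanity-checking that the string still parses back into the same structure. A small round-trip check, using a trimmed-down version of the docs dictionary:

```python
import json

# Abbreviated version of the scoopsie_api_docs dictionary for illustration
docs = {
    "base_url": "http://127.0.0.1:5000/",
    "endpoints": {"/menu": {"method": "GET", "parameters": None}},
}

docs_str = json.dumps(docs, indent=2)

# indent=2 pretty-prints the string; json.loads recovers the original dict exactly
assert json.loads(docs_str) == docs
assert '"method": "GET"' in docs_str
```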

Let’s import this API documentation into our chatbot.py script:

from api_docs import scoopsie_api_docs

Step 3

The APIChain requires two prompts: one for selecting the right API endpoint and another to create a concise answer to the user query based on that endpoint. These prompts have default values; however, we will be creating our own prompts to ensure a personalized interaction. We can add the following new prompts to our prompts.py file:

api_url_template = """
Given the next API Documentation for Scoopsie's official
ice cream retailer API: {api_docs}
Your process is to assemble essentially the most environment friendly API URL to reply
the consumer's query, guaranteeing the
name is optimized to incorporate solely vital data.
Query: {query}
API URL:
"""
api_url_prompt = PromptTemplate(input_variables=['api_docs', 'question'],
template=api_url_template)

api_response_template = """"
With the API Documentation for Scoopsie's official API: {api_docs}
and the precise consumer query: {query} in thoughts,
and given this API URL: {api_url} for querying, right here is the
response from Scoopsie's API: {api_response}.
Please present a abstract that straight addresses the consumer's query,
omitting technical particulars like response format, and
specializing in delivering the reply with readability and conciseness,
as if Scoopsie itself is offering this data.
Abstract:
"""
api_response_prompt = PromptTemplate(input_variables=['api_docs',
'question',
'api_url',
'api_response'],
template=api_response_template)

Here, the api_url_prompt generates the exact API URL for queries using the provided API documentation (api_docs). After identifying the correct endpoint with api_url_prompt, the APIChain uses the api_response_prompt to summarize the API’s response to answer the user’s query. Let’s import these prompts into our chatbot.py script:

from prompts import api_response_prompt, api_url_prompt

Step 4

Let’s set up the APIChain to connect with our previously created fictional ice-cream store’s API. The APIChain module from LangChain provides the from_llm_and_api_docs() method, which lets us load a chain from just an LLM and the API docs defined previously. We’ll continue using the gpt-3.5-turbo-instruct model from OpenAI for our LLM.

# Initialize your LLM
llm = OpenAI(model='gpt-3.5-turbo-instruct',
             temperature=0)

api_chain = APIChain.from_llm_and_api_docs(
    llm=llm,
    api_docs=scoopsie_api_docs,
    api_url_prompt=api_url_prompt,
    api_response_prompt=api_response_prompt,
    verbose=True,
    limit_to_domains=["http://127.0.0.1:5000/"]
)

The limit_to_domains parameter in the code above limits the domains that can be accessed by the APIChain. According to the official LangChain documentation, the default value is an empty tuple, which means no domains are allowed by default; by design, this will raise an error on instantiation. You can pass None if you want to allow all domains by default. However, this is not recommended for security reasons, as it would allow malicious users to make requests to arbitrary URLs, including internal APIs accessible from the server. To allow our store’s API, we can specify its URL; this will ensure that our chain operates within a controlled environment.
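Conceptually, limit_to_domains acts like a URL allowlist check applied before any request is sent. A simplified stand-in for that check (this is an illustration of the idea, not LangChain’s actual implementation):

```python
from urllib.parse import urlparse

ALLOWED = ["http://127.0.0.1:5000/"]

def is_allowed(url: str, allowed: list[str] = ALLOWED) -> bool:
    """Allow a request only if its scheme and host match an allowlisted base URL."""
    target = urlparse(url)
    for base in allowed:
        base_parsed = urlparse(base)
        if target.scheme == base_parsed.scheme and target.netloc == base_parsed.netloc:
            return True
    return False

# Requests to the store API pass; arbitrary hosts are rejected
assert is_allowed("http://127.0.0.1:5000/menu")
assert not is_allowed("http://internal-service.local/admin")
```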

Step 5

In the previous tutorials, we set up an LLMChain to handle general ice-cream related queries. We still want to retain this functionality, since Scoopsie is a helpful conversational buddy, while also incorporating access to our fictional store’s menu and customization options via the APIChain. To combine these capabilities, we’ll use the llm_chain for general queries and the api_chain for accessing the store’s API. This requires adjusting our Chainlit setup to support multiple chains from the start of a user session. Here’s how we can adapt the @cl.on_chat_start decorator:

@cl.on_chat_start
def setup_multiple_chains():
    llm = OpenAI(model='gpt-3.5-turbo-instruct',
                 temperature=0)
    conversation_memory = ConversationBufferMemory(memory_key="chat_history",
                                                   max_len=200,
                                                   return_messages=True,
                                                   )
    llm_chain = LLMChain(llm=llm, prompt=ice_cream_assistant_prompt_template,
                         memory=conversation_memory)
    cl.user_session.set("llm_chain", llm_chain)

    api_chain = APIChain.from_llm_and_api_docs(
        llm=llm,
        api_docs=scoopsie_api_docs,
        api_url_prompt=api_url_prompt,
        api_response_prompt=api_response_prompt,
        verbose=True,
        limit_to_domains=["http://127.0.0.1:5000/"]
    )
    cl.user_session.set("api_chain", api_chain)

Upon initiating a new user session, this setup instantiates both llm_chain and api_chain, ensuring Scoopsie is equipped to handle a broad range of queries. Each chain is stored in the user session for easy retrieval. For information on setting up the llm_chain, you can view my previous article.

Step 6

Let’s now define the message-handling function using the @cl.on_message decorator:

@cl.on_message
async def handle_message(message: cl.Message):
    user_message = message.content.lower()
    llm_chain = cl.user_session.get("llm_chain")
    api_chain = cl.user_session.get("api_chain")

    if any(keyword in user_message for keyword in ["menu", "customization",
                                                   "offer", "review"]):
        # If any of the keywords are in the user_message, use api_chain
        response = await api_chain.acall(user_message,
                                         callbacks=[cl.AsyncLangchainCallbackHandler()])
    else:
        # Default to llm_chain for handling general queries
        response = await llm_chain.acall(user_message,
                                         callbacks=[cl.AsyncLangchainCallbackHandler()])
    response_key = "output" if "output" in response else "text"
    await cl.Message(response.get(response_key, "")).send()

In this setup, we retrieve both the llm_chain and api_chain objects. If the user message includes a keyword reflective of an endpoint of our fictional store’s API, the application will trigger the APIChain. If not, we assume it’s a general ice-cream related query, and trigger the LLMChain. This is a simple use-case, but for more complex use-cases, you might need to write more elaborate logic to ensure the correct chain is triggered. For further details on Chainlit’s decorators and how to effectively utilize them, refer back to my previous article where I delve into these topics extensively.
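The routing decision itself is easy to factor out and test in isolation. Here is the keyword check sketched as a pure function; `choose_chain` is a hypothetical helper for illustration, not part of the chatbot.py code above:

```python
API_KEYWORDS = ("menu", "customization", "offer", "review")

def choose_chain(user_message: str) -> str:
    """Return which chain should handle the message, based on simple keyword matching."""
    text = user_message.lower()
    if any(keyword in text for keyword in API_KEYWORDS):
        return "api_chain"
    return "llm_chain"

# Store-related questions route to the API; anything else falls back to the LLM
assert choose_chain("What's on the menu today?") == "api_chain"
assert choose_chain("Tell me a fun ice cream fact") == "llm_chain"
```

Writing the rule this way makes it straightforward to swap in more elaborate logic later, such as an intent classifier, without touching the Chainlit handler.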

Step 7

Now that our application code is ready, we can launch our chatbot. Open a terminal in your project directory and run the following command:

chainlit run chatbot.py -w --port 8000

You can access the chatbot by navigating to http://localhost:8000 in your web browser.

Scoopsie’s application interface is now ready! Here is a demo showcasing the chatbot in action:

Scoopsie Chatbot Demo: Interactive Ice-Cream Assistant in Action — by Author

We’ve successfully built an API for a fictional ice-cream store and integrated it with our chatbot. As demonstrated above, you can access the web application of your chatbot using Chainlit, where both general queries and the fictional store’s API endpoints can be accessed.

You can find the code for this tutorial in this GitHub repo. The GitHub checkpoint for this tutorial will contain all developed code up until this point.

You can follow along as I share working demos, explanations and cool side projects on things in the AI space. Come say hi on LinkedIn and X! 👋


